Robots.txt is a plain text file containing a few lines of simple directives. It is saved on the website or blog's server and tells web crawlers how to crawl and index your blog in the search results. That means you can restrict any web page on your blog from web crawlers so that it does not get indexed in search engines,
such as your blog's labels pages, your demo page, or any other pages that are not important enough to be indexed. Always remember that search crawlers read the robots.txt file before crawling any web page.
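For example, here is a hedged sketch of such a restriction: adding a Disallow line for a page's path keeps general crawlers away from it. The static page path /p/demo.html below is only a hypothetical example, not something every blog has.


User-agent: *
Disallow: /p/demo.html
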
Each blog hosted on Blogger has its own default robots.txt file, which looks something like this:


User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
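
A minimal sketch, assuming Python 3 and the document's placeholder address example.blogspot.com, of how you could fetch and inspect your own blog's live robots.txt file (Blogger serves it at the path /robots.txt):

import urllib.request

# Print the robots.txt that crawlers see for this (placeholder) blog address.
print(urllib.request.urlopen("http://example.blogspot.com/robots.txt").read().decode())
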


Explanation


This code is divided into three sections. Let's first study each of them; after that we will learn how to add a custom robots.txt file to Blogspot blogs.
  1. User-agent: Mediapartners-Google
     This section is for the Google AdSense robots and helps them serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.
  2. User-agent: *
     This section is for all robots, marked with the asterisk (*). In the default settings our blog's label links are restricted from being indexed by search crawlers, which means the web crawlers will not index our label page links because of the Disallow: /search rule shown above (see the sketch after this list).
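
A minimal sketch, assuming Python 3's standard urllib.robotparser module, showing how a general crawler would interpret the default rules above; the blog address and the sample post path are placeholders, while /search/label/... is the usual form of a Blogger label link.

import urllib.robotparser

# The rules from the default Blogger robots.txt shown earlier.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Label pages live under /search, so general crawlers may not fetch them.
print(parser.can_fetch("*", "http://example.blogspot.com/search/label/SEO"))      # False
# Ordinary post pages fall under "Allow: /" and can be fetched.
print(parser.can_fetch("*", "http://example.blogspot.com/2013/05/my-post.html"))  # True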