What is robots.txt and How to Use It

Robots.txt File in SEO

A robots.txt file is a plain text file (not HTML) which tells search engines not to crawl one or more URLs on a website. So if you are a webmaster and have some sensitive data on your website that you don't want to appear in search results, you can use the robots.txt file. Keep in mind that robots.txt blocks crawling, not indexing itself, so a blocked URL can still show up in results if other sites link to it. Some sites have more than one URL for the same topic, so website owners also use robots.txt to avoid duplicate content.
However, search engines don't guarantee that they will follow the instructions in a robots.txt file, though in practice the major crawlers respect it. The key requirement is that the file must always sit in the domain root, since Google simply looks for it at www.yourdomain.com/robots.txt; if the file is not found there, Google may crawl and index all the URLs.
Recommended: Create XML sitemap for free

How to create Robots.txt for a Website

As the name robots.txt suggests, the file is created as a .txt file. So simply make a text file and paste the structure below into it. The basic structure of a robots.txt file is as follows –
User-agent: *
Allow: /
Sitemap: http://yourwebsite/sitemap.xml
Disallow: /link-exchange.php
Disallow: /i
  • Replace http://yourwebsite/sitemap.xml with your website's sitemap URL.
  • Replace link-exchange.php and i with the URLs you don't want crawled.
In the above structure: the term Allow: / tells search engines they may crawl the whole site, while Disallow: /link-exchange.php and Disallow: /i ask search engines not to crawl http://yourwebsite/link-exchange.php and any URL starting with http://yourwebsite/i. The Sitemap line points crawlers to your sitemap address. If you want to block more URLs, simply add more Disallow lines in the same way as I have written for the two URLs above.
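You can sanity-check rules like these locally with Python's standard urllib.robotparser module. This is a minimal sketch using the placeholder domain and file names from the structure above; note that Python's parser applies rules in file order, so only the Disallow lines are parsed here:

```python
# Check the example Disallow rules with Python's built-in robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /link-exchange.php
Disallow: /i
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An ordinary page is allowed to be crawled...
print(parser.can_fetch("*", "http://yourwebsite/about.html"))         # True
# ...but the disallowed URLs are not.
print(parser.can_fetch("*", "http://yourwebsite/link-exchange.php"))  # False
print(parser.can_fetch("*", "http://yourwebsite/i"))                  # False
```

One thing this makes easy to see: Disallow matches by prefix, so Disallow: /i would also block URLs like /index.html or /images/ — pick your disallowed paths carefully.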

Note: You need to save the text file with the name robots.txt. Then upload this robots.txt file to the domain root, i.e. the same location where your index file is present.

How to add Custom robots.txt File in Blogger

In my previous posts I discussed how to do custom settings for Blogger headings and a complete guide to on-page SEO for Blogger. This post covers the topic – how to add a custom robots.txt file in Blogger. Use the simple method given below –

Add Custom Robots.txt in Blogger
  • Go to Blogger Dashboard
  • Click on Settings
  • Go to Search Preferences
  • Under Custom robots.txt, click Yes on "Enable custom robots.txt content?"
  • Now add the following code and then Click on Save.
User-agent: Mediapartners-Google

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.a1bloggerseo.com/feeds/posts/default?orderby=UPDATED

Explanation :

User-agent: Mediapartners-Google : This group is for Google AdSense; leave it as it is.
Disallow: /search : This rule stops crawlers from visiting label pages in Blogger. It blocks any URL with /search after the blog domain, like http://www.a1bloggerseo.com/search/label/seo%20tips. Leave this rule as it is in the file.
Allow: / : This rule tells search engines they may crawl the rest of the blog.
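The effect of these Blogger rules can be confirmed with the same urllib.robotparser approach. This sketch covers the general (*) group only, using the example blog URL from above:

```python
# Verify the Blogger rules for the general (*) crawler group.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Label pages under /search are blocked...
print(parser.can_fetch("*", "http://www.a1bloggerseo.com/search/label/seo%20tips"))  # False
# ...while the homepage and individual posts stay crawlable.
print(parser.can_fetch("*", "http://www.a1bloggerseo.com/"))  # True
print(parser.can_fetch("*", "http://www.a1bloggerseo.com/2014/01/sample-post.html"))  # True
```

The post URL here is a hypothetical example; any path that does not start with /search is allowed.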

Know how to create and Submit Blogger Sitemap

How to check robots.txt file of a Website/Blog

Google Webmaster Tools recently announced a feature that lets webmasters see whether the robots.txt file is working on a website. To use it, simply log in to Webmaster Tools >> Select your site >> Click on robots.txt Tester. From there you can check the condition of your robots.txt file.
If you want to see the robots.txt of any website, just type websiteurl/robots.txt, like http://www.a1bloggerseo.com/robots.txt
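Because robots.txt always lives at the domain root, its address can be derived from any page URL on the site. A small sketch in Python (the post URL is just a hypothetical example):

```python
# Build the root-level robots.txt address from any page URL on a site.
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    """robots.txt always sits at the root of the scheme + host."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://www.a1bloggerseo.com/2014/01/some-post.html"))
# → http://www.a1bloggerseo.com/robots.txt
```

If you then want to fetch and apply the live file, urllib.robotparser's RobotFileParser supports this directly via set_url() followed by read().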


This post was created by Sanjay Kumar Choubey. Circle him on Google+ to stay connected with his newer posts.
