What is Robots.txt in SEO
What is robots.txt used for
In simple words, robots.txt is a plain text file we place at the root of our website to tell search engine robots which pages we would like them not to crawl.
It is very useful because search engines frequently visit our site and index our content. The file is essentially a list of directives, such as Allow and Disallow, that tell web crawlers which URLs they may or may not retrieve. For example, if a URL is disallowed in your robots.txt, Google will not crawl that URL or read its content. Keep in mind that a blocked URL can still appear in search results if other pages link to it; to keep a page out of the index entirely, use a noindex meta tag on a page that crawlers are allowed to fetch.
robots.txt example file
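For instance, a minimal robots.txt that keeps every crawler out of one section of a site while leaving the rest open could look like this (the /admin/ path here is purely an illustrative placeholder):

User-agent: *
Disallow: /admin/
Allow: /

Each record starts with a User-agent line naming the crawler it applies to (* matches all crawlers), followed by Allow and Disallow rules that are matched against URL paths. The file itself must be served from the root of the domain, e.g. http://www.yourwebsite.com/robots.txt.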
We can create a custom robots.txt for a Blogger blog as well as for a WordPress blog. A custom robots.txt is very useful because it lets us control how search engines like Google, Yahoo and Bing crawl our tutorials and articles, and even decide whether a particular page should be crawled at all.
So in this article I would like to show a robots.txt example for Blogger and WordPress blogs. Please find the robots.txt best practices with an example:
# Allow Google's AdSense crawler (Mediapartners-Google) to access every page.
User-agent: Mediapartners-Google
Disallow:

# Rules for all other crawlers.
User-agent: *
# Block Blogger's filtered search, label and paged search URLs (thin, duplicate content).
Disallow: /search?updated-min=*
Disallow: /search?updated-max=*
Disallow: /search/label/*?updated-min=*
Disallow: /search/label/*?updated-max=*
Disallow: /search?page=*
# Block archive pages and dynamic views.
Disallow: /*archive.html
Disallow: /view/*
# Block the mobile-redirect duplicates of every page.
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0
# Everything else may be crawled.
Allow: /

Sitemap: http://www.yourwebsite.com/feeds/posts/default?orderby=UPDATED
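If you want to check how a crawler would interpret rules like these, Python's built-in urllib.robotparser module gives a quick test. This is a minimal sketch: the yourwebsite.com URLs are placeholders, and the standard library parser follows the original robots.txt specification, so it does not understand the * wildcard extension used by Google; only the plain prefix rules from the example are exercised here.

from urllib.robotparser import RobotFileParser

# Rules to test, taken from the example above (prefix rules only, no "*" wildcards).
rules = [
    "User-agent: *",
    "Disallow: /?m=1",
    "Disallow: /?m=0",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# The mobile-redirect URL is blocked, while a normal post URL is allowed.
print(parser.can_fetch("*", "http://www.yourwebsite.com/?m=1"))                  # False
print(parser.can_fetch("*", "http://www.yourwebsite.com/2024/01/my-post.html"))  # True

For a live site you can instead call parser.set_url("http://www.yourwebsite.com/robots.txt") followed by parser.read(), and then test your real URLs with can_fetch() the same way.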