Robots.txt

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a convention used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a visiting robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites. Not all robots cooperate with the standard: email harvesters, spambots, malware, and robots that scan for security vulnerabilities may ignore it, or may even start with the portions of the website they have been told to stay out of. The standard can be used in conjunction with Sitemaps, a robot inclusion standard for websites.
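As a sketch of how a cooperating crawler evaluates these rules, the example below parses a minimal, hypothetical robots.txt (the domain, paths, and rules are illustrative, not from this article) using Python's standard-library `urllib.robotparser`:

```python
# Sketch: evaluating hypothetical robots.txt rules with Python's
# standard-library parser (urllib.robotparser).
from urllib.robotparser import RobotFileParser

# A minimal, hypothetical robots.txt: ask all crawlers to stay out of
# /private/, and point cooperating robots at a sitemap (the inclusion
# counterpart mentioned above).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A cooperating crawler checks each URL before fetching it; nothing
# technically prevents a non-cooperating robot from fetching /private/.
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that robots.txt is advisory only: the parser tells a well-behaved crawler what the site owner asked for, but enforcement is entirely up to the robot.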

