robots.txt checker

This tool validates /robots.txt files against the de-facto robots exclusion standard.

Checks are based on the original 1994 document A Standard for Robot Exclusion, the 1997 Internet Draft A Method for Web Robots Control, and the nonstandard extensions that have emerged over the years.

The robots.txt validator checks the syntax and structure of the file, spots common typos, and can test whether a specific crawler is permitted to access a given URL.
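For comparison, Python's standard library ships a parser for the same de-facto standard. Below is a minimal sketch of such an access test; the robots.txt content, host name and paths are hypothetical, and they mix the 1994 record syntax with the nonstandard Allow, Crawl-delay and Sitemap extensions:

    from urllib import robotparser

    # Hypothetical robots.txt mixing 1994-style records with the
    # nonstandard Allow, Crawl-delay and Sitemap extensions.
    # Allow is listed before Disallow because Python's parser
    # applies the first rule that matches the path.
    ROBOTS_TXT = """\
    User-agent: Googlebot
    Allow: /private/public-report.html
    Disallow: /private/
    Crawl-delay: 10

    User-agent: *
    Disallow: /tmp/

    Sitemap: http://yoursite/sitemap.xml
    """

    parser = robotparser.RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # Test whether a specific crawler is permitted access to a given URL.
    print(parser.can_fetch("Googlebot", "http://yoursite/private/index.html"))         # False
    print(parser.can_fetch("Googlebot", "http://yoursite/private/public-report.html")) # True
    print(parser.crawl_delay("Googlebot"))                                             # 10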

The URL can be any document on your website; the checker will automatically download http://yoursite/robots.txt.
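A sketch of how that derivation might work, using the standard library's urllib.parse; the robots_url name and the sample URL are hypothetical, for illustration only:

    from urllib.parse import urlsplit, urlunsplit

    def robots_url(document_url):
        # Keep only the scheme and host of the given document URL
        # and replace the path with /robots.txt.
        parts = urlsplit(document_url)
        return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

    print(robots_url("http://yoursite/blog/post.html"))  # http://yoursite/robots.txt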

Tests whether the user agent may access the URL given above.

Comma-separated list; for example: Googlebot, Yahoo Slurp, Myrobot.
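A sketch of how such a list might be checked with urllib.robotparser; the rules, agent names and URL are only examples:

    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.parse(["User-agent: *", "Disallow: /private/"])  # example rules

    agents = "Googlebot, Yahoo Slurp, Myrobot"  # the comma-separated form input
    url = "http://yoursite/private/page.html"   # hypothetical URL to test

    # Test each listed user agent against the same URL.
    for agent in (name.strip() for name in agents.split(",")):
        verdict = "allowed" if parser.can_fetch(agent, url) else "blocked"
        print(agent + ": " + verdict)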