Robots.txt Validator

Robots Directives Input


Testing URLs Input


This tool is a beta version; double-check everything and use its outputs at your own risk.

Directives overview
An overview of all directives will be displayed after validation, along with the number of times each directive was triggered.


Disallowed

URLs disallowed by one or more directives.


Allowed

URLs allowed as exceptions to some Disallow directive.

Allowed Only

URLs specifically allowed that wouldn't be blocked by any Disallow anyway.


Not Matched

URLs to which no Disallow or Allow directive could be applied.
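The four groups above can be sketched with longest-match robots.txt semantics (the most specific matching rule wins, with ties resolved in favour of Allow). This is a minimal illustration, not the tool's actual implementation; the directives and URL paths are made-up examples:

```python
import re

def match_len(pattern: str, path: str) -> int:
    """Return the pattern's length if it matches the start of the path,
    else -1. Supports '*' wildcards and the '$' end anchor."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return len(pattern) if re.match(regex, path) else -1

def classify(path: str, allows: list[str], disallows: list[str]) -> str:
    """Group a URL path into one of the four result categories."""
    best_allow = max((match_len(p, path) for p in allows), default=-1)
    best_disallow = max((match_len(p, path) for p in disallows), default=-1)
    if best_allow < 0 and best_disallow < 0:
        return "Not matched"      # no directive applies at all
    if best_disallow < 0:
        return "Allowed only"     # allowed, and no Disallow would block it
    if best_allow >= best_disallow:
        return "Allowed"          # an exception overriding a Disallow
    return "Disallowed"

# Hypothetical example directives and URL paths:
allows = ["/private/public"]
disallows = ["/private/"]
urls = ["/private/page", "/private/public", "/blog/post"]
groups = {u: classify(u, allows, disallows) for u in urls}
```

Here `/private/page` lands in Disallowed, `/private/public` in Allowed (the Allow match is longer than the Disallow match), and `/blog/post` in Not Matched.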


Robots.txt Validator is a tool for validating robots.txt directives against multiple URLs. It helps you test and properly validate a new robots.txt file before deployment. It's a simple in-browser utility, not a crawler, so it won't create redundant traffic on your website or pollute your access log data.

Input robots.txt directives and a dataset of URLs, then press the Validate button to run the validation. It can take a while. As a result, you'll see statistics and grouped URLs.
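If you only need a plain allowed/disallowed answer, Python's standard `urllib.robotparser` can run the same kind of check locally, also without generating any crawl traffic. Note that Python's parser applies rules in file order (first match wins) rather than longest-match, which is why the Allow line comes before the Disallow in this made-up example:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (hypothetical):
robots_txt = """\
User-agent: *
Allow: /private/public
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())  # parse in memory; no HTTP request is made

urls = [
    "https://example.com/private/page",
    "https://example.com/private/public",
    "https://example.com/blog/post",
]
results = {u: rp.can_fetch("*", u) for u in urls}
```

With these directives, `can_fetch` returns False for `/private/page` and True for the other two URLs.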

This tool hasn't been thoroughly tested to establish its limits; an update is planned.

If you need to process more data, the only alternative is probably Screaming Frog, which crawls your site and therefore will show up in your access logs.

This tool does not gather personal data, inputs, or outputs. Usage is fully anonymous, free of charge, and free of ads.