robots.txt Validator - Test Crawl Rules
Validate robots.txt syntax and test whether specific URLs are allowed or blocked.
How It Works
The validator parses robots.txt syntax, flags common errors, and can test whether a specific URL path is allowed for a given user-agent.
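As a rough illustration of this kind of check, Python's standard-library urllib.robotparser can parse robots.txt text and answer allow/blocked queries. This is a minimal sketch, not the tool's actual internals; the robots.txt body and the Googlebot user-agent are example inputs.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (an assumption for illustration).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /search
"""

parser = RobotFileParser()
# parse() accepts the file's lines rather than fetching a URL.
parser.parse(ROBOTS_TXT.splitlines())

# Test a few URL paths for a given user-agent.
for path in ("/search", "/private/report.pdf"):
    verdict = "allowed" if parser.can_fetch("Googlebot", path) else "blocked"
    print(f"Googlebot {path}: {verdict}")
```

Running this prints "allowed" for /search and "blocked" for /private/report.pdf, mirroring the allow/disallow output a validator would report.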
Understanding Results
Results list syntax errors, warnings, and the allow/disallow status for each tested URL. Matching follows Google's robots.txt specification (standardized as RFC 9309), under which the most specific (longest) matching rule wins and Allow wins ties.
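One caveat: urllib.robotparser applies rules in file order (first match wins), which predates Google's longest-match precedence. The sketch below shows the longest-match rule directly; decide is a hypothetical helper written for illustration, and it ignores the spec's * and $ wildcards for brevity.

```python
def decide(path: str, rules: list[tuple[str, str]]) -> bool:
    """Apply Google's precedence rule: among matching Allow/Disallow
    patterns, the longest wins; Allow wins ties. rules holds
    (directive, pattern) pairs, e.g. ("Disallow", "/private/")."""
    best_len = -1
    allowed = True  # no matching rule means the path is allowed
    for directive, pattern in rules:
        # Skip empty patterns and non-matching prefixes.
        if not pattern or not path.startswith(pattern):
            continue
        if len(pattern) > best_len:
            best_len = len(pattern)
            allowed = directive.lower() == "allow"
        elif len(pattern) == best_len and directive.lower() == "allow":
            allowed = True  # Allow wins ties of equal specificity

    return allowed

rules = [("Disallow", "/private/"), ("Allow", "/private/public.html")]
print(decide("/private/public.html", rules))  # True: longer Allow wins
print(decide("/private/data.json", rules))    # False: only Disallow matches
```

The first test path is blocked under first-match semantics but allowed under Google's rule, which is exactly the kind of discrepancy a spec-compliant validator surfaces.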