Robots.txt Tester & Validator
Free robots.txt tester: check which URLs are allowed or blocked by your robots.txt rules, validate syntax, and test any URL against Googlebot or custom user-agents.
Use this free robots.txt tester, validator, and checker to see whether a URL is allowed or blocked for Googlebot, Bingbot, or any custom crawler. Paste your robots.txt, enter a URL, choose the User-agent, and get an instant rule-by-rule result.
How to test your robots.txt
1. Paste your robots.txt
Copy the full content of your live /robots.txt file, or your draft rules, into the tester.
2. Enter a URL to test
Provide the exact path you want to check, such as /admin/dashboard, /products/item, or a full URL.
3. Choose a User-agent
Select Googlebot, Bingbot, Googlebot-Image, or type a custom bot name to simulate how that crawler reads the file.
4. Read the matched rule
The tool shows the specific Allow or Disallow rule that wins, whether crawling is allowed, and why.
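The same workflow can be sketched with Python's standard-library `urllib.robotparser` (the rules and paths below are hypothetical; note that this parser applies rules in file order with first match winning, unlike Google's longest-match behavior, which is why the Allow line for the subpath comes first here):

```python
from urllib import robotparser

# Hypothetical rules. The Allow line is listed before the Disallow line
# because urllib.robotparser uses the first matching rule, not the longest.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Test paths the way the tool does: pick a user-agent, give a path.
print(rp.can_fetch("Googlebot", "/admin/dashboard"))    # blocked
print(rp.can_fetch("Googlebot", "/admin/public/page"))  # allowed
print(rp.can_fetch("Googlebot", "/products/item"))      # no rule matches: allowed
```

In practice you would fetch your live file with `RobotFileParser(url).read()` instead of pasting the rules inline.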
What the robots.txt checker validates
Allow vs Disallow
See which directive has the longest matching path and whether it permits or blocks the tested URL.
Crawler-specific rules
Compare rules for Googlebot, Bingbot, all crawlers (*), or custom bots before publishing changes.
Syntax issues
Catch malformed directives, empty groups, unexpected spacing, and confusing rule order.
Crawl, not index
Understand whether the URL can be crawled. robots.txt controls crawling, not indexing; use noindex separately when a page should not appear in search.
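The Allow vs Disallow precedence described above can be sketched directly. This is a simplified illustration, assuming literal path prefixes with no * or $ wildcards; the `matched_rule` helper and its rule format are hypothetical, but the tie-break (longest pattern wins, Allow beats Disallow on equal length) follows Google's documented behavior:

```python
def matched_rule(rules, path):
    """Return the winning (directive, pattern) for a path.

    rules: list of ("allow" | "disallow", pattern) pairs.
    The longest matching pattern wins; on a length tie, Allow wins.
    Simplified sketch: literal prefixes only, no * or $ wildcards.
    """
    best = None  # ((length, is_allow), directive, pattern)
    for directive, pattern in rules:
        if path.startswith(pattern):
            key = (len(pattern), directive == "allow")
            if best is None or key > best[0]:
                best = (key, directive, pattern)
    if best is None:
        return ("allow", "")  # no rule matches: crawling is permitted
    return (best[1], best[2])

rules = [
    ("disallow", "/admin/"),
    ("allow", "/admin/public/"),
]
print(matched_rule(rules, "/admin/public/page"))  # the longer Allow wins
print(matched_rule(rules, "/admin/dashboard"))    # only the Disallow matches
```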
FAQ
What is a robots.txt validator?
It checks your file for syntax errors and confirms rules work as intended for specific URLs and bots.
How do I check if Googlebot can crawl a page?
Set the User-agent to Googlebot, enter the URL path, and paste your robots.txt; the tester shows whether Googlebot can crawl it.
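Group selection can be checked the same way with the standard-library parser (hypothetical rules; a crawler follows only its most specific User-agent group, and groups do not merge):

```python
from urllib import robotparser

# Hypothetical file with a generic group and a Googlebot-specific group.
rules = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /no-google/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot follows only its own group; the * group no longer applies to it.
print(rp.can_fetch("Googlebot", "/no-google/page"))  # blocked by its own group
print(rp.can_fetch("Googlebot", "/private/page"))    # allowed: * rules ignored
print(rp.can_fetch("Bingbot", "/private/page"))      # blocked: falls back to *
```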
Why does a URL show as blocked?
A matching Disallow rule may be more specific than your Allow rule, or the URL may be inside a blocked folder. Check the matched rule shown by the tester.
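As a hypothetical illustration, a URL can be blocked even though an Allow line seems to cover it:

```
User-agent: *
Allow: /blog
Disallow: /blog/drafts/
```

Here /blog/drafts/post is blocked: the Disallow pattern /blog/drafts/ is longer, and therefore more specific, than the Allow pattern /blog, so it wins.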
Why test robots.txt before deploying?
Manual rules are easy to get wrong. A tester prevents you from accidentally blocking important pages from Google.
Continue Exploring
Other Developer Tools you might like...
JSON Formatter
Format, validate, and minify JSON data with syntax highlighting
Base64 Encoder/Decoder
Encode text to Base64 and decode Base64 strings
URL Encoder/Decoder
Encode and decode URL components and query strings
UUID Generator
Generate random UUID v4 identifiers
Hash Generator
Generate MD5, SHA-1, SHA-256, and SHA-512 hashes from text
Regex Tester
Test and debug regular expressions with match highlighting
JWT Decoder
Decode and inspect a JWT's header and payload
HTML Formatter
Beautify and format HTML code with proper indentation