Robots.txt Tester & Validator

Free robots.txt tester — check which URLs are allowed or blocked by your robots.txt rules. It validates syntax and tests any URL against Googlebot, Bingbot, or custom user-agents.

Use this free robots.txt tester, validator, and checker to see whether a URL is allowed or blocked for Googlebot, Bingbot, or any custom crawler. Paste your robots.txt, enter a URL, choose the User-agent, and get an instant rule-by-rule result.

How to test your robots.txt

  1. Paste your robots.txt

    Copy the full content from your live /robots.txt file or draft rules and paste them into the tester.

  2. Enter a URL to test

    Provide the exact path you want to check, such as /admin/dashboard, /products/item, or a full URL.

  3. Choose a User-agent

    Select Googlebot, Bingbot, Googlebot-Image, or type a custom bot name to simulate how that crawler reads the file.

  4. Read the matched rule

    The tool shows the specific Allow or Disallow rule that wins, whether crawling is allowed, and why.
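The same check can be scripted with Python's standard library, which ships a robots.txt parser. A minimal sketch (the rules and paths are illustrative; note that `urllib.robotparser` follows the original 1994 first-match semantics, so its verdict can occasionally differ from Google's longest-match behavior):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules you might paste into the tester
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /products/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ask whether a given crawler may fetch a given path
print(rp.can_fetch("Googlebot", "/admin/dashboard"))  # False: /admin/ is disallowed
print(rp.can_fetch("Googlebot", "/products/item"))    # True
```

Here `Googlebot` falls back to the `*` group because no Googlebot-specific group exists in the sample file.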

What the robots.txt checker validates

Allow vs Disallow

See which directive has the longest matching path and whether it permits or blocks the tested URL.
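The longest-match rule (RFC 9309, which Googlebot follows) can be sketched as: among all rules whose pattern matches the URL path, the longest pattern wins, and Allow wins a length tie. A simplified sketch that ignores `*` and `$` wildcards (the function name and sample rules are illustrative):

```python
def resolve(path, rules):
    """Return the winning (directive, pattern) pair for a URL path.

    rules: list of ("allow" | "disallow", pattern) pairs.
    The longest matching pattern wins; Allow wins ties. If nothing
    matches, crawling is allowed by default.
    """
    best = None
    for directive, pattern in rules:
        if not path.startswith(pattern):
            continue  # this rule does not apply to the path
        if best is None or len(pattern) > len(best[1]) or (
            len(pattern) == len(best[1]) and directive == "allow"
        ):
            best = (directive, pattern)
    return best or ("allow", "")

rules = [
    ("disallow", "/products/"),
    ("allow", "/products/sale/"),
]
print(resolve("/products/sale/item", rules))  # ('allow', '/products/sale/')
print(resolve("/products/item", rules))       # ('disallow', '/products/')
```

This is why an Allow rule can "rescue" a subfolder inside an otherwise disallowed directory: its pattern is longer, so it wins.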

Crawler-specific rules

Compare rules for Googlebot, Bingbot, all crawlers (*), or custom bots before publishing changes.
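Group selection happens before any path rules apply: a crawler obeys the single group whose User-agent token best matches its name, falling back to `*`. A simplified sketch of that selection (prefix matching here is an approximation of real product-token matching; the group contents are placeholders):

```python
def select_group(crawler, groups):
    """Pick the robots.txt group a crawler should obey (simplified).

    groups: dict mapping a lowercase User-agent token to its rules.
    The most specific (longest) matching token wins; '*' is the fallback.
    """
    crawler = crawler.lower()
    # Tokens that are a prefix of the crawler name, e.g. "googlebot"
    # matches "googlebot-image"; the longest one is the most specific.
    matches = [ua for ua in groups if ua != "*" and crawler.startswith(ua)]
    if matches:
        return groups[max(matches, key=len)]
    return groups.get("*", [])

groups = {
    "*": ["rules for all crawlers"],
    "googlebot": ["rules for Google crawlers"],
    "googlebot-image": ["rules for Google Images"],
}
print(select_group("Googlebot-Image", groups))  # ['rules for Google Images']
print(select_group("Bingbot", groups))          # ['rules for all crawlers']
```

Because only one group applies, a bot-specific group completely replaces the `*` rules for that bot; the two are not merged.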

Syntax issues

Catch malformed directives, empty groups, unexpected spacing, and confusing rule order.

Crawl, not index

Understand whether the URL can be crawled. Use noindex separately when the page should not appear in search.
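robots.txt only controls crawling, not indexing. To keep a crawlable page out of search results, send a noindex signal instead, for example in the page markup:

```html
<!-- In the page's <head> -->
<meta name="robots" content="noindex">
```

For non-HTML resources, the equivalent `X-Robots-Tag: noindex` HTTP response header does the same job. Note that a page blocked by robots.txt cannot be crawled, so a noindex tag on it will never be seen.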

FAQ

What is a robots.txt validator?

It checks your file for syntax errors and confirms rules work as intended for specific URLs and bots.

How do I check if Googlebot can crawl a page?

Paste your robots.txt, set the User-agent to Googlebot, and enter the URL path; the tester shows whether Googlebot can crawl it.

Why does a URL show as blocked?

A matching Disallow rule may be more specific than your Allow rule, or the URL may be inside a blocked folder. Check the matched rule shown by the tester.

Why test robots.txt before deploying?

Manual rules are easy to get wrong. A tester prevents you from accidentally blocking important pages from Google.

Don't see what you need?

We build free tools based on community feedback. If there's a utility that would improve your workflow, suggest it today!

Robots.txt Tester — Validate & Test robots.txt Online Free | FreeTool24