Robots.txt Validator & Tester
Check whether your robots.txt file allows or blocks specific URLs and user agents
robots.txt Content
Test URLs
Robots.txt Tester & Validator
Use this free robots.txt tester, validator, and checker to see whether a URL is allowed or blocked for Googlebot, Bingbot, or any custom crawler. Paste your robots.txt, enter a URL, choose the User-agent, and get an instant rule-by-rule result.
How to test your robots.txt
1. Paste your robots.txt — Copy the full content from your live /robots.txt file or your draft rules and paste them into the tester.
2. Enter a URL to test — Provide the exact path you want to check, such as /admin/dashboard, /products/item, or a full URL.
3. Choose a User-agent — Select Googlebot, Bingbot, Googlebot-Image, or type a custom bot name to simulate how that crawler reads the file.
4. Read the matched rule — The tool shows the specific Allow or Disallow rule that wins, whether crawling is allowed, and why.
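The same check can be scripted. Here is a minimal sketch using Python's standard-library `urllib.robotparser` with a hypothetical rule set; note that Python's parser applies rules in file order (first match wins), which can differ from Google's longest-match precedence, as the last example shows.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /admin/public/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# No rule matches this path, so crawling is allowed by default.
print(rp.can_fetch("Googlebot", "/products/item"))     # True

# "Disallow: /admin/" matches, so the path is blocked.
print(rp.can_fetch("Googlebot", "/admin/dashboard"))   # False

# Python's first-match order hits "Disallow: /admin/" before the
# Allow rule and blocks this path; Google's longest-match rule
# would allow it because "Allow: /admin/public/" is more specific.
print(rp.can_fetch("Googlebot", "/admin/public/page"))  # False
```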
What the robots.txt checker validates
Allow vs Disallow
See which directive has the longest matching path and whether it permits or blocks the tested URL.
Crawler-specific rules
Compare rules for Googlebot, Bingbot, all crawlers (*), or custom bots before publishing changes.
Syntax issues
Catch malformed directives, empty groups, unexpected spacing, and confusing rule order.
Crawl, not index
Understand whether the URL can be crawled. Use noindex separately when the page should not appear in search.
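The Allow-vs-Disallow decision above follows the precedence Google documents for robots.txt: the longest matching pattern wins, and on a tie, Allow beats Disallow. A small sketch of that logic, with hypothetical rules and simplified wildcard handling:

```python
import re

def match_rule(rules, path):
    """Return the winning (directive, pattern) for `path` using
    longest-match precedence: the longest matching pattern wins;
    on a tie, Allow beats Disallow. Supports `*` wildcards and a
    trailing `$` end anchor. `rules` is a list of
    (directive, pattern) tuples; this is a simplified sketch."""
    best_key, best = None, ("allow", "")  # no match: allowed by default
    for directive, pattern in rules:
        # Translate the robots.txt pattern into a regex:
        # `*` matches any sequence, `$` anchors the end.
        regex = re.escape(pattern).replace(r"\*", ".*")
        if regex.endswith(r"\$"):
            regex = regex[:-2] + "$"
        if re.match(regex, path):
            # Longer pattern wins; on equal length, Allow wins.
            key = (len(pattern), directive.lower() == "allow")
            if best_key is None or key > best_key:
                best_key, best = key, (directive.lower(), pattern)
    return best

rules = [("Disallow", "/admin/"), ("Allow", "/admin/public/")]
print(match_rule(rules, "/admin/public/page"))  # ('allow', '/admin/public/')
print(match_rule(rules, "/admin/dashboard"))    # ('disallow', '/admin/')
```

The more specific Allow rule (14 characters) beats the shorter Disallow rule (7 characters) for paths under /admin/public/, which is exactly the "which directive matches the longest path" comparison the checker reports.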
FAQ
What is a robots.txt validator?
It checks your file for syntax errors and confirms rules work as intended for specific URLs and bots.
How do I check if Googlebot can crawl a page?
Paste your robots.txt, set the User-agent to Googlebot, and enter the URL path; the tester shows whether Googlebot can crawl it.
Why does a URL show as blocked?
A matching Disallow rule may be more specific than your Allow rule, or the URL may be inside a blocked folder. Check the matched rule shown by the tester.
Why test robots.txt before deploying?
Manual rules are easy to get wrong. A tester prevents you from accidentally blocking important pages from Google.
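One way to catch accidental blocks before deploying is a quick pre-deploy script that asserts your critical pages stay crawlable under the draft rules. A sketch with hypothetical paths and rules:

```python
from urllib.robotparser import RobotFileParser

# Draft rules about to be deployed (hypothetical example).
DRAFT = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
"""

# Pages that must never be blocked (hypothetical paths).
MUST_CRAWL = ["/", "/products/item", "/blog/post"]

rp = RobotFileParser()
rp.parse(DRAFT.splitlines())

for path in MUST_CRAWL:
    # Fail loudly if a draft rule accidentally blocks a critical page.
    assert rp.can_fetch("Googlebot", path), f"{path} accidentally blocked!"
print("robots.txt draft keeps all critical pages crawlable")
```

Running a check like this in CI turns a silent SEO regression into a failing build.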
Keep exploring
More developer tools you might like…
JSON Formatter
Format, validate, and minify JSON with syntax highlighting and error detection
Base64 Encoder/Decoder
Encode text or files to Base64 and decode Base64 strings back to readable text
URL Encoder/Decoder
Encode and decode URL components and query-string parameters
UUID Generator
Generate random UUIDs (v1, v4) or create multiple UUIDs in bulk
Hash Generator
Generate MD5, SHA-1, SHA-256, and SHA-512 hashes from text or files
Regex Tester
Test regular expressions with real-time match highlighting
JWT Decoder
Decode and inspect JSON Web Tokens: view headers, payloads, and signatures
HTML Formatter
Format HTML code with proper indentation and syntax highlighting