robots.txt Checker for AI
Analyze your robots.txt file to see which AI bots can access your site. Check Content-Signal directives, sitemap declarations, and get a detailed score with recommendations.
We will fetch /robots.txt from this domain and analyze it for AI bot access.
What is robots.txt for AI?
robots.txt is a plain-text file at the root of a website that tells web crawlers which pages they may access. With the rise of AI crawlers such as GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot (Perplexity), it is increasingly important to set explicit rules for these bots.
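For example, a minimal robots.txt that blocks one AI crawler from a private section while leaving the rest of the site open might look like this (the `/private/` path is a placeholder):

```
# Keep GPTBot out of /private/, allow it elsewhere
User-agent: GPTBot
Disallow: /private/

# All other crawlers may access everything
User-agent: *
Allow: /
```

Each `User-agent` group applies to the bots it names; a crawler uses the most specific group that matches its name, falling back to `*`.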
Content-Signal directives let you specify how AI agents may use your content: search=yes allows search indexing, ai-input=yes allows use as AI context, and ai-train=no blocks use for model training.
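A robots.txt using Content-Signal might combine the three signals on one line, for instance (a sketch of the comma-separated syntax, applied to all bots):

```
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```

This allows search indexing and use as AI context while opting out of model training. Note that Content-Signal is a newer convention and not all crawlers honor it yet.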
This tool checks your robots.txt against 7 criteria, including bot accessibility, Content-Signal presence, sitemap declaration, and policy comments.
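The bot-accessibility part of such a check can be sketched with Python's standard `urllib.robotparser`, which evaluates per-bot Allow/Disallow rules (Content-Signal parsing is not part of the stdlib and would need custom handling; the sample rules and paths below are hypothetical):

```python
from urllib import robotparser

# Hypothetical robots.txt: GPTBot is blocked from /private/, everyone else is allowed
SAMPLE = """\
User-agent: GPTBot
Disallow: /private/

User-agent: *
Allow: /
"""

AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot"]

rp = robotparser.RobotFileParser()
rp.parse(SAMPLE.splitlines())

# For each AI bot, check whether the site root and a private page are fetchable
report = {
    bot: {
        "root": rp.can_fetch(bot, "/"),
        "private": rp.can_fetch(bot, "/private/page"),
    }
    for bot in AI_BOTS
}

for bot, status in report.items():
    print(f"{bot}: root={'allowed' if status['root'] else 'blocked'}, "
          f"private={'allowed' if status['private'] else 'blocked'}")
```

A real checker would fetch `/robots.txt` over HTTP (e.g. with `RobotFileParser.set_url` plus `read`) and layer the remaining criteria, such as sitemap and Content-Signal detection, on top of this rule evaluation.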