robots.txt Generator for AI

Generate a robots.txt file optimized for AI agents. Select which bots to allow or block, configure Content-Signal directives, and download the result.

AI Bots

Choose how each AI bot can access your site. Default means no specific rule is generated for that bot.

GPTBot OpenAI — ChatGPT
ChatGPT-User OpenAI — ChatGPT browsing
ClaudeBot Anthropic — Claude
Claude-Web Anthropic — Claude web search
PerplexityBot Perplexity — Perplexity AI
Google-Extended Google — Gemini
Bytespider ByteDance — TikTok / Doubao
CCBot Common Crawl — Training data
Amazonbot Amazon — Alexa / Rufus
FacebookBot Meta — Meta AI
cohere-ai Cohere — Cohere models
Applebot-Extended Apple — Apple Intelligence
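For each bot set to Allow or Block, the generator emits a user-agent group in the output file. A sketch of what two such groups look like (bot names from the list above; the exact output is illustrative):

```
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Allow: /
```

`Disallow: /` blocks the bot from the entire site; an empty `Disallow:` or `Allow: /` permits it.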

Content-Signal Directives

Control how AI agents may use your content. These directives are added to the robots.txt global section.

Allow search indexing by AI

Allow use as AI input/context

Allow use for AI training
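The three toggles above map onto a Content-Signal line in the global (`User-agent: *`) group. Assuming the Content Signals Policy syntax of comma-separated `signal=yes/no` pairs, allowing search and AI input but not training would look like this (illustrative):

```
User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```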

Additional Settings

Crawl-delay (seconds)

Generated robots.txt

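A complete generated file, combining blocked bots, Content-Signal directives, and a crawl delay, might look like the following (the bot selection and values are illustrative, not a recommendation):

```
# robots.txt — generated for AI agents

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Crawl-delay: 10
Allow: /
```

Note that `Crawl-delay` is a de facto extension rather than part of the Robots Exclusion Protocol: some crawlers (e.g. Bing) honor it, while Google ignores it.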

What is robots.txt for AI?

robots.txt is a standard file that tells compliant web crawlers which parts of a site they may access. It is advisory rather than enforced: well-behaved bots honor it, while others may not. With AI agents now a major source of crawl traffic, it is worth adding explicit rules for bots such as GPTBot, ClaudeBot, and PerplexityBot. Content-Signal directives go a step further, stating how AI agents may use your content: for search indexing, as input/context, or for training.
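To see how a compliant crawler interprets these rules, Python's standard `urllib.robotparser` can evaluate a robots.txt against a given user agent. The file contents below are an assumed example, not this tool's actual output:

```python
from urllib import robotparser

# Example rules: block GPTBot entirely, allow all other bots.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# GPTBot matches its own group and is denied everywhere.
print(rp.can_fetch("GPTBot", "https://example.com/article"))        # False
# Any other agent falls through to the wildcard group and is allowed.
print(rp.can_fetch("SomeOtherBot", "https://example.com/article"))  # True
```

Note that `Content-Signal` lines are not part of the Robots Exclusion Protocol, so standard parsers like `robotparser` simply ignore them; only agents that implement the policy act on them.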