ModelTrace

AI Robots.txt Generator

Generate a customized robots.txt file to control which AI crawlers and search engines can access your website. Essential for AI SEO and Generative Engine Optimization (GEO).

Basic Configuration

Set the default behavior (Allow or Disallow) for all crawlers

Restricted Directories

Specify paths that should be blocked for all crawlers

/admin/
/private/
/tmp/
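In the generated robots.txt, each restricted directory becomes a Disallow rule under the wildcard user agent. A minimal sketch using the default paths above:

```
User-agent: *
Disallow: /admin/
Disallow: /private/
Disallow: /tmp/
```

Trailing slashes matter: "Disallow: /admin/" blocks everything under that directory, while "Disallow: /admin" would also match paths like /administrator.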

AI Crawlers

GPTBot (OpenAI): trains ChatGPT models.
OAI-SearchBot (OpenAI): powers ChatGPT web search.
ChatGPT-User (OpenAI): fetches links shared by users in ChatGPT.
ClaudeBot (Anthropic): crawls the web for Claude.
anthropic-ai (Anthropic): collects Claude training data.
claude-web (Anthropic): fetches fresh web content for Claude.
PerplexityBot (Perplexity): builds the Perplexity AI search index.
Google-Extended (Google): controls use of your content in Gemini AI.
Amazonbot (Amazon): powers Alexa and Amazon recommendations.
Applebot-Extended (Apple): controls use of your content for Apple AI training.
Bytespider (ByteDance): gathers data for TikTok's AI.
DuckAssistBot (DuckDuckGo): powers DuckDuckGo's private AI answers.
cohere-ai (Cohere): collects data for Cohere's enterprise LLMs.
meta-externalagent (Meta): Meta's AI crawler.
MistralAI-User (Mistral): fetches content for Mistral's AI assistant.

Search Engine Crawlers

Googlebot (Google): Google Search.
Googlebot-Image (Google): Google Images.
Googlebot-Mobile (Google): Google mobile search.
Bingbot (Microsoft): Microsoft Bing.
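The most common configuration blocks AI training crawlers while keeping traditional search engines. A sketch of what the generator might emit for that choice (the bots shown are examples; pick the ones that match your policy):

```
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Google-Extended
Disallow: /

# Allow search engine crawlers
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.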

Generated robots.txt

Preview and download your robots.txt file

Quick Tips

1. Test your robots.txt in Google Search Console before deploying it.
2. Monitor AI crawler activity in your server logs to track which bots visit your site.
3. Review your robots.txt quarterly, since new AI crawlers emerge frequently.
4. Balance access and protection: blocking every crawler protects your content but can cost you visibility in search results and AI-generated answers.
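Tip 2 can be automated. A minimal sketch (assuming a standard combined-format access log, where the user-agent string is the last quoted field) that counts hits from the AI crawlers listed above:

```python
from collections import Counter

# User-agent substrings for the AI crawlers listed above
AI_BOTS = [
    "GPTBot", "OAI-SearchBot", "ChatGPT-User", "ClaudeBot",
    "anthropic-ai", "claude-web", "PerplexityBot", "Google-Extended",
    "Amazonbot", "Applebot-Extended", "Bytespider", "DuckAssistBot",
    "cohere-ai", "meta-externalagent", "MistralAI-User",
]

def count_ai_crawler_hits(log_lines):
    """Count requests per AI crawler by matching user-agent substrings."""
    hits = Counter()
    for line in log_lines:
        for bot in AI_BOTS:
            if bot in line:
                hits[bot] += 1
                break  # attribute each request line to one crawler
    return hits

# Hypothetical sample log lines for illustration
sample = [
    '1.2.3.4 - - [01/Jan/2025] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"',
    '5.6.7.8 - - [01/Jan/2025] "GET /p HTTP/1.1" 200 456 "-" "Mozilla/5.0 (compatible; ClaudeBot/1.0)"',
    '9.9.9.9 - - [01/Jan/2025] "GET / HTTP/1.1" 200 789 "-" "Mozilla/5.0"',  # ordinary browser
]
print(count_ai_crawler_hits(sample))
```

Substring matching is deliberate: crawler user-agent strings vary in version and formatting, but the bot token itself is stable.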