A robots.txt generator helps you build the Robots Exclusion Protocol file that declares which pages compliant search engine crawlers may access. Use it to block AI training bots, protect private paths, set crawl delays, and declare your sitemap location — all without writing the format by hand.
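For example, a generated file for a hypothetical site (the paths and sitemap URL here are illustrative) might look like this:

```
# Block an AI training bot from the entire site
User-agent: GPTBot
Disallow: /

# Default rules for all other crawlers
User-agent: *
Disallow: /admin/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by `User-agent`; `Disallow` and `Allow` lines apply to the group above them, while `Sitemap` is a standalone directive that applies to the whole file.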

Preset Templates

User-Agent Rules

Global Directives

Host: specify the canonical hostname (a non-standard directive that only some crawlers honor)

robots.txt Preview

The preview updates in real time as you edit your rules.
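If you want to sanity-check generated rules outside the browser, Python's standard-library `urllib.robotparser` can evaluate them. The rules and user-agent names below are a hypothetical example, not output from this tool:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generator output: block GPTBot everywhere,
# keep /private/ off-limits for everyone else.
rules = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# GPTBot is blocked from everything; other agents only from /private/
print(parser.can_fetch("GPTBot", "https://example.com/article"))       # False
print(parser.can_fetch("MyCrawler", "https://example.com/article"))    # True
print(parser.can_fetch("MyCrawler", "https://example.com/private/x"))  # False
```

This is a quick way to confirm that an Allow/Disallow combination behaves the way you intended before deploying the file.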


Common AI Bot User-Agents

GPTBot (OpenAI)
Claude-Web (Anthropic)
CCBot (Common Crawl)
Google-Extended (Google AI)
anthropic-ai (Anthropic)
Omgilibot (Webz.io)
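To opt out of all of these AI crawlers at once, a generated file can stack the user-agent lines into a single group (assuming these tokens, which vendors can rename or add to over time):

```
User-agent: GPTBot
User-agent: Claude-Web
User-agent: CCBot
User-agent: Google-Extended
User-agent: anthropic-ai
User-agent: Omgilibot
Disallow: /
```

Multiple consecutive `User-agent` lines sharing one rule block is valid under the Robots Exclusion Protocol, so this blocks every listed bot without repeating the `Disallow` directive.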