A robots.txt file is a plain-text document placed at the root of your website that tells search engine crawlers which pages or directories they may or may not crawl. Properly configuring robots.txt helps you manage crawl budget, keep bots out of low-value or private areas, and point them to your XML sitemap. Note that robots.txt controls crawling, not indexing: a disallowed URL can still appear in search results if other pages link to it, so use a noindex directive when you need to keep a page out of the index. Use the builder below to create valid rules for any user-agent, then copy or download the finished file.
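For reference, here is a minimal sketch of the kind of file the builder produces; the directory paths and sitemap URL are placeholders, not recommendations:

```
# Default rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Override for a specific bot (example name only)
User-agent: ExampleBot
Disallow: /

# Location of the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Rules are grouped by `User-agent`: a crawler uses the most specific group that matches its name and falls back to the `*` group otherwise, so a bot-specific block fully replaces the defaults for that bot rather than adding to them.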

CMS Presets

User-Agent Blocks

Global Settings

robots.txt Preview