Robots.txt Generator
Presets for standard SEO, blocking AI crawlers, or custom rules, with an optional sitemap URL.
Adding your sitemap URL helps search engines discover your pages faster.
Save this as robots.txt at the root of your site - e.g. yoursite.com/robots.txt.
Need an SEO strategy, not just config?
We do hands-on programmatic SEO - up to 300 ranked pages built and shipped to your repo, each planned and reviewed by a human. Starting at $300.
Your robots.txt controls which crawlers can access which parts of your site. This tool generates clean rules for standard SEO, blocks the most aggressive AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) if you want to protect content, and lets you customize per-bot directives.
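The standard SEO preset produces output along these lines - a sketch, not the tool's exact output; the paths and sitemap URL are placeholders you would replace with your own:

```
# Allow all crawlers, but keep non-content paths out of the index
User-agent: *
Disallow: /admin/
Disallow: /api/

# Absolute sitemap URL helps crawlers discover your pages
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by `User-agent`; a crawler uses the most specific group that matches its name, and `Sitemap:` lines apply file-wide.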
Frequently asked questions
Should I block AI crawlers?
Depends on your goal. If you want LLM citations and traffic from AI search, allow them. If you're protecting content from being trained on, block them. The tool has presets for both.
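The AI-blocker preset looks roughly like this - an illustrative sketch, since bot names change over time and the tool's exact list may differ:

```
# Block AI crawlers named in the preset
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

# Everyone else (including search engines) crawls normally
User-agent: *
Allow: /
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an enforcement mechanism.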
What goes in the sitemap line?
Your absolute sitemap URL (e.g. https://example.com/sitemap.xml). Crawlers use this to find all your pages.
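The line itself is a single file-wide directive; the URL below is a placeholder for your own sitemap:

```
Sitemap: https://example.com/sitemap.xml
```

You can list multiple `Sitemap:` lines if your site splits its sitemap into several files.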
Long-form guides on the strategy behind the tool.
What is SEO? A Founder's Definition (2026)
SEO explained for founders: what it is, why it matters, and the easiest path to ranking.
Read guide →
SEO for SaaS: A Practical Guide
Practical SEO for SaaS founders - what to do, in what order, with realistic timelines.
Read guide →
SaaS SEO Strategy: The Complete Playbook
Build a SaaS SEO strategy from zero: keyword research, content clusters, on-page tactics, and pSEO that compounds.
Read guide →
More free SEO tools
Free tools to plan, audit, and ship better SEO pages.
Want this done for you?
Book a 15-minute call or request a quote. We build 50+ pSEO pages for clients, starting at $300.