🤖 Robots.txt Generator

Build a valid robots.txt file with visual controls: choose user‑agents, allow/disallow paths, a crawl‑delay, and a sitemap URL.

Use * to target all crawlers, or name a specific one, e.g. Googlebot.

How to Use

  1. Set the User‑agent, and add a Sitemap URL if you have one.
  2. Select Allow/Disallow paths or add custom entries.
  3. Optionally set Crawl‑delay to control bot pacing.
  4. Click Generate, then Copy or Download.
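Following the steps above for a site that blocks its admin area might produce a file like this (the domain and paths are placeholders):

```
User-agent: *
Disallow: /admin/
Allow: /
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Allow and Disallow rules apply to the user‑agent group above them; the Sitemap line is independent of any group and can appear anywhere in the file.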

FAQ

What should my User‑agent be?

Use * for all bots. To target a specific crawler, set a name like Googlebot.
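For example, a file can combine a general group with a stricter group for a single crawler (Googlebot here is just an illustrative name):

```
# Rules for every other crawler
User-agent: *
Disallow: /private/

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /private/
Disallow: /drafts/
```

Note that groups are not merged: a crawler follows only the most specific group that matches its name, so rules meant for everyone must be repeated inside the specific group.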

Is Sitemap required?

Not required, but recommended. Add a full, absolute URL (including the https:// scheme) so crawlers can discover your pages.
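A Sitemap line takes a full absolute URL, and you can list more than one (the domain below is a placeholder):

```
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-news.xml
```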

Does Crawl‑delay work for all bots?

Not all bots honor Crawl‑delay; Googlebot, for example, ignores it, while crawlers such as Bingbot respect it. Use it to reduce server load from bots that comply.
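For bots that support it, the value is interpreted as the number of seconds to wait between requests (the bot name below is an example):

```
User-agent: Bingbot
Crawl-delay: 10
```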