UtilityKit

40+ fast, free tools. Most run in your browser only; Image & PDF tools upload files to the backend when you run them.

Robots.txt Generator

Generate a valid robots.txt with user-agent, allow/disallow, sitemap, and host directives.

About Robots.txt Generator

Robots.txt Generator helps you build crawler directives without manual syntax errors. Add a target user-agent, set allow/disallow paths line by line, include sitemap URLs, and optionally define a host directive. The output is plain text, ready to copy into your site root so you can ship crawl-control updates quickly.
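
For example, a generated file for a typical site might look like this (the domain and paths are placeholders):

  User-agent: *
  Disallow: /admin/
  Disallow: /tmp/
  Allow: /admin/help

  Sitemap: https://example.com/sitemap.xml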

Why use Robots.txt Generator

  • Generate properly formatted robots.txt files faster.
  • Reduce syntax mistakes in allow/disallow directives.
  • Manage multiple sitemap entries in one output.

How to use Robots.txt Generator

  1. Set the user-agent value (use * to target all crawlers).
  2. Add allow and disallow paths line by line.
  3. Add sitemap URLs and an optional host.
  4. Copy the generated robots.txt output (a scripted equivalent is sketched below).
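
If you prefer to script the same four steps, here is a minimal sketch in Python. The build_robots_txt helper and its inputs are hypothetical, not part of UtilityKit; it simply assembles the same directives the tool emits.

  def build_robots_txt(user_agent, allow, disallow, sitemaps, host=None):
      # One group per user-agent; its Allow/Disallow lines follow directly.
      lines = [f"User-agent: {user_agent}"]
      lines += [f"Allow: {path}" for path in allow]
      lines += [f"Disallow: {path}" for path in disallow]
      if host:
          lines.append(f"Host: {host}")  # non-standard, crawler-specific
      lines.append("")  # blank line separates the group from sitemap entries
      lines += [f"Sitemap: {url}" for url in sitemaps]
      return "\n".join(lines) + "\n"

  print(build_robots_txt("*", ["/blog/"], ["/admin/", "/tmp/"],
                         ["https://example.com/sitemap.xml"]))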

When to use Robots.txt Generator

  • Setting crawl directives on new site launches.
  • Adjusting blocked paths after content changes.
  • Adding sitemap references for search engines.

Frequently Asked Questions

What does user-agent control?

It determines which crawler a group of rules applies to. Use * to target every crawler, or name a specific bot such as Googlebot.
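
For example, the following gives Googlebot its own rule while leaving every other crawler unrestricted (the path is illustrative):

  User-agent: Googlebot
  Disallow: /experiments/

  User-agent: *
  Disallow: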

Should I use both allow and disallow?

Use either or both, depending on your crawl policy. Disallow blocks paths; allow carves out exceptions inside a blocked path, as in the example below.
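
For instance, to block a directory while keeping one page inside it crawlable (paths are illustrative):

  User-agent: *
  Disallow: /private/
  Allow: /private/pricing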

Can I include multiple sitemaps?

Yes. Add one URL per line; each becomes its own Sitemap directive.
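
For example (the URLs are placeholders):

  Sitemap: https://example.com/sitemap-pages.xml
  Sitemap: https://example.com/sitemap-posts.xml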

Is host required?

No. Host is a non-standard directive that only some crawlers (historically Yandex) honor; most ignore it.
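
If you do target a crawler that reads it, the directive is a single line, e.g. Host: example.com.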

Where does robots.txt live?

At your domain root, e.g. https://example.com/robots.txt. Crawlers do not look for it in subdirectories.
