Robots.txt Generator & Tester

Robots.txt Generator

The generator builds a robots.txt file from the following inputs (a sample of the generated output follows this list):

- User-agents: one per line. All listed user-agents receive the same rules in this version.
- Allow rules (optional): one path per line.
- Disallow rules (optional): one path or pattern per line.
- Sitemaps (optional): one sitemap URL per line.
- Host (optional): enter only the host name if you want to include the Host directive.
- Crawl-delay (optional): some crawlers support it, but Google ignores this directive.
- Custom lines (optional): appended to the file exactly as written.
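
For illustration, a file generated from typical inputs might look like this (the domain and paths are placeholders):

    User-agent: *
    User-agent: Googlebot
    Allow: /blog/
    Disallow: /admin/
    Disallow: /search
    Crawl-delay: 10
    Host: example.com
    Sitemap: https://example.com/sitemap.xml

Both user-agents receive the same rule group, matching the generator's single-group behavior described above.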

Generated robots.txt

Alongside the generated file, the tool shows summary counts of the user-agents, allow rules, disallow rules, and sitemaps it contains.

Robots.txt Tester

Given a URL or path and a crawler user-agent, the tester reports:

- the normalized path that was tested,
- the matched rule, or "No matching rule" if none applies,
- the matched user-agent group(s), or "No matching user-agent group",
- the number of parsed groups and rules,
- any Host and Sitemap directives found in the file.

A sketch of equivalent logic in Python follows this list.
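
The allowed/blocked check can be approximated with Python's standard library. This is a minimal sketch under assumed inputs, not the tool's actual implementation; the robots.txt content and URLs are placeholders. One caveat: urllib.robotparser applies the first matching rule in file order, while Google's parser prefers the longest (most specific) match, so the Allow line is placed before the broader Disallow line here.

    from urllib.robotparser import RobotFileParser

    # Placeholder robots.txt content; a real tester would fetch
    # https://example.com/robots.txt instead.
    robots_txt = """\
    User-agent: *
    Allow: /admin/public/
    Disallow: /admin/
    Disallow: /search
    Sitemap: https://example.com/sitemap.xml
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # can_fetch() answers the tester's core question:
    # is this URL allowed for this user-agent?
    print(parser.can_fetch("Googlebot", "https://example.com/admin/"))         # False
    print(parser.can_fetch("Googlebot", "https://example.com/admin/public/"))  # True

    # Sitemap URLs parsed from the file (Python 3.8+).
    print(parser.site_maps())  # ['https://example.com/sitemap.xml']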

What is a robots.txt file?

A robots.txt file tells search engine crawlers which parts of your website they may and may not crawl. It is commonly used to keep crawlers out of admin areas, internal search results, temporary folders, and duplicate content paths.

How to use this tool

Enter one or more user-agents, add your allow and disallow rules, and generate a ready-to-use robots.txt file. You can then test any URL or path against a specific crawler user-agent to see whether it is allowed or blocked. Disallow and allow paths may include wildcard patterns, as illustrated below.
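
Per Google's robots.txt specification, patterns may use * to match any sequence of characters and $ to anchor the end of a URL; the most specific (longest) matching rule wins, with Allow winning ties. The paths below are hypothetical:

    User-agent: *
    Disallow: /*.pdf$          # block every URL ending in .pdf
    Disallow: /search?*        # block internal search result pages
    Allow: /search?page=about  # longer match, so this URL stays crawlable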

Why it helps webmasters

This tool is useful when launching a new website, migrating sections, cleaning up crawl waste, or double-checking that important content remains accessible to search engines while private paths stay blocked.

