Robots.txt Generator
Create a custom robots.txt file to guide search engine crawlers on your website.
Generated Robots.txt
User-agent: *
Allow: /
Disallow: /admin
Disallow: /api
Disallow: /private
In-Depth Guide
Everything you need to know
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests. Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism and does not hide pages from determined clients.
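You can check how a given set of rules will be interpreted with Python's standard-library `urllib.robotparser`. This is a minimal sketch; the example.com URLs and the rules themselves are hypothetical stand-ins for your own site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for an example.com site (illustration only).
rules = """\
User-agent: *
Disallow: /admin
Disallow: /private
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Paths under a Disallow prefix are blocked; everything else is crawlable,
# since crawling is allowed by default when no rule matches.
print(rp.can_fetch("*", "https://example.com/blog/post"))      # True
print(rp.can_fetch("*", "https://example.com/admin/settings")) # False
```

One caveat: Python's parser applies rules in file order (first match wins), whereas Google uses longest-path matching, so results can differ when `Allow` and `Disallow` rules overlap.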
Common Directives:
- User-agent: The crawler the rule applies to (e.g., Googlebot). '*' applies to all crawlers.
- Allow: A path that should be crawled.
- Disallow: A path that should NOT be crawled.
- Sitemap: The location of your website's sitemap.
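Putting these directives together, a complete robots.txt might look like the following (the sitemap URL is a placeholder for your own site's):

```
User-agent: *
Allow: /
Disallow: /admin
Disallow: /api
Disallow: /private

Sitemap: https://example.com/sitemap.xml
```

Place the file at the root of your domain (e.g., https://example.com/robots.txt); crawlers only look for it there.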