How to Use
- Select a User-agent (crawler type) such as Googlebot, GPTBot, or Claude-Web.
- Add Allow or Disallow path rules for each crawler.
- (Optional) Enter your Sitemap URL and Crawl-delay settings.
- Click 'Generate', then save the output as robots.txt and upload it to your website's root directory (a sample of the generated file is shown below).
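
As a rough sketch, a generated robots.txt combining the settings above might look like the following (the domain in the Sitemap line and the /private/ path are placeholders for your own values):

    User-agent: Googlebot
    Allow: /

    User-agent: GPTBot
    Crawl-delay: 10
    Disallow: /private/

    Sitemap: https://example.com/sitemap.xml

Note that Crawl-delay is a non-standard directive: some crawlers honor it, but Googlebot ignores it.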
Why Use This Tool?
Robots.txt is the standard file for controlling how search engines and AI crawlers access your website. A proper configuration keeps crawlers away from sensitive paths while ensuring important content remains visible to search engines and AI models. Keep in mind that robots.txt governs crawling rather than indexing: a blocked page can still appear in search results if other sites link to it, so truly private content should also be protected with a noindex directive or authentication.
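
For example, a site that wants to remain fully visible in search while opting out of AI training crawlers could use rules like these (GPTBot and Google-Extended are real crawler user-agent tokens; the /admin/ path is a placeholder):

    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: *
    Disallow: /admin/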