Robots.txt Generator

Create a custom robots.txt file to help search engines crawl your site more effectively.

1. Default - All robots are:

2. Crawl Delay (in seconds):

3. Sitemap (Optional):

4. Restricted Directories (enter one per line, e.g. /cgi-bin/):
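The four fields above map directly onto robots.txt directives: the default policy becomes the `Disallow` rule under `User-agent: *`, the crawl delay becomes a `Crawl-delay` line, the sitemap URL becomes a `Sitemap` line, and each restricted directory becomes its own `Disallow` line. As a minimal sketch of that assembly (the function name, parameters, and example values here are illustrative assumptions, not part of this tool):

```python
def build_robots_txt(allow_all, crawl_delay=None, sitemap_url=None,
                     restricted_dirs=()):
    """Assemble a robots.txt from the generator's four inputs (illustrative)."""
    lines = ["User-agent: *"]
    if not allow_all:
        # "Disallow: /" refuses all robots access to the entire site.
        lines.append("Disallow: /")
    elif not restricted_dirs:
        # An empty Disallow value means every path may be crawled.
        lines.append("Disallow:")
    # Each restricted directory becomes its own Disallow line.
    for directory in restricted_dirs:
        lines.append(f"Disallow: {directory}")
    if crawl_delay is not None:
        # Crawl-delay is non-standard but honored by some crawlers.
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(True, crawl_delay=10,
                       sitemap_url="https://example.com/sitemap.xml",
                       restricted_dirs=["/cgi-bin/", "/tmp/"]))
```

With the example inputs above, the generated file would list `User-agent: *`, one `Disallow` line per restricted directory, then the `Crawl-delay` and `Sitemap` lines; the resulting file is served from the site root as `/robots.txt`.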