Robots.txt Generator
Create a customized robots.txt file to control how search engines crawl your website. A properly configured robots.txt file tells crawlers which parts of your site they may request and which they should skip. Note that robots.txt controls crawling, not indexing: a blocked page can still appear in search results if other sites link to it.
How to use: Fill in the fields below to generate a robots.txt file tailored to your website’s needs. Once generated, copy the code and save it as “robots.txt” in your website’s root directory.
Specify paths you want to block search engines from crawling:
Specify paths you explicitly want to allow (most major crawlers let a more specific Allow rule override a matching Disallow rule):
Specify how many seconds crawlers should wait between requests using the Crawl-delay directive (not supported by all search engines; Googlebot, for example, ignores it):
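To see how these fields combine, here is a minimal sketch of a generated file and how a standards-compliant crawler would interpret it, using Python's standard-library parser. The paths (`/private/`, `/tmp/`) and the allowed page are illustrative placeholders, not required names; note that Python's parser applies the first matching rule, so the Allow line is placed before the Disallow it overrides (Google instead applies the most specific rule regardless of order).

```python
from urllib.robotparser import RobotFileParser

# Example output a generator might produce (placeholder paths).
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
Disallow: /tmp/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "/private/secret.html"))       # False: blocked by Disallow
print(parser.can_fetch("*", "/private/public-page.html"))  # True: permitted by Allow
print(parser.can_fetch("*", "/blog/"))                     # True: no rule matches
print(parser.crawl_delay("*"))                             # 10 (seconds)
```

Once the rules behave as expected, save the file as `robots.txt` in the site's root directory so it is served at `/robots.txt`.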