Robots.txt Generator is an easy-to-use tool for creating proper robots.txt directives for your site: easily copy and tweak robots.txt files from other sites, or create your own. When search engine spiders crawl a website, they typically start by looking for a robots.txt file at the root of the domain. If one is found, the crawler reads its directives to learn which directories and files are blocked from crawling. These blocked entries can be created with the robots.txt generator; in some ways they are the opposite of a website's sitemap, which lists the pages to be included when a search engine crawls the site.
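For reference, a minimal robots.txt file placed at the domain root (e.g. https://example.com/robots.txt) might look like the sketch below; the directory names and sitemap URL are placeholders for illustration:

```
# Apply these rules to all crawlers
User-agent: *
# Block these directories from crawling (placeholder paths)
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /
# Point crawlers at the sitemap, which lists pages to include
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group can target a specific crawler by name (for example, `Googlebot`) instead of the wildcard `*`.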
To use the robots.txt file generator and see a side-by-side comparison of how your site currently handles search bots versus how the proposed new robots.txt will work, type or paste your site's domain URL (or the URL of a page on your site) into the text box, and then click the Create Robots.txt button.