Robots.txt Generator

Online Robots.txt Creator Tool


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must end with a trailing slash "/"
 
 
 
 
 
 
   


Now create a 'robots.txt' file in your root directory. Copy the text above and paste it into that file.

Create a robots.txt file

Free Robots.txt Generator Tool

Robots.txt Generator is an easy-to-use tool for creating proper robots.txt directives for your site: easily copy and tweak robots.txt files from other sites, or create your own. When search engine spiders crawl a website, they typically start by looking for a robots.txt file at the root of the domain. Once found, the crawler reads the file’s directives to learn which directories and files are blocked from crawling. Blocked files can be specified with the robots.txt generator; these entries are, in a sense, the opposite of a website’s sitemap, which lists the pages you want included when a search engine crawls the site. To see a side-by-side comparison of how your site currently handles search bots versus how the proposed new robots.txt will work, type or paste your site’s domain URL (or the URL of a page on your site) into the text box, and then click the Create Robots.txt button.
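For illustration, a generated file might look like the sketch below. The domain, crawl delay, and blocked directories are placeholder values, not output produced by the tool for any specific site:

    User-agent: *
    Crawl-delay: 10
    Disallow: /admin/
    Disallow: /cgi-bin/
    Sitemap: https://www.example.com/sitemap.xml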

Our Free Robots.txt Generator Tool has made the lives of webmasters easy by doing a complicated task on its own and with just a few clicks of the mouse, this tool will generate a Googlebot friendly robots.txt file. robots.txt is a file that can be placed in the root folder of your website to help search engines index your site more appropriately. Search engines such as Google use website crawlers, or robots that review all the content on your website. There may be parts of your website that you do not want them to crawl to include in user search results, such as admin page. You can add these pages to the file to be explicitly ignored. Robots.txt files use something called the Robots Exclusion Protocol. This website will easily generate the file for you with inputs of pages to be excluded.
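As a rough illustration of how a well-behaved crawler applies the Robots Exclusion Protocol, the Python sketch below uses the standard library's urllib.robotparser to check whether a given URL may be fetched. The domain and paths are placeholders, not values tied to this tool:

    from urllib import robotparser

    # Fetch and parse a site's robots.txt, then test whether a crawler
    # identifying itself as "Googlebot" may fetch two sample URLs.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Likely False if /admin/ is listed under Disallow for this agent.
    print(rp.can_fetch("Googlebot", "https://www.example.com/admin/"))

    # True when the path is not blocked by any matching rule.
    print(rp.can_fetch("Googlebot", "https://www.example.com/blog/"))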
