
Robots.txt Generator


The generator lets you set the following options:

Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: The path is relative to the root and must contain a trailing slash "/"
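
For example, with the default set to allow all robots, a crawl delay of 1, a sitemap URL, and /cgi-bin/ listed as a restricted directory, the generated file would look something like this (example.com stands in for your own domain):

User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 1
Sitemap: https://www.example.com/sitemap.xml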
 
 
 
 
 
 
   



Now, create a file named 'robots.txt' in your root directory, copy the text generated above, and paste it into that file.


About Robots.txt Generator

The robots.txt file is a simple text file that tells search engine robots how to crawl and index the pages on your website. It must be placed in your site's root directory; for example, if your site is https://www.example.com, the file must be reachable at https://www.example.com/robots.txt. With it, you can keep web robots from accessing any parts of your website that you want to keep private.


For example, if you do not want search engines to index certain scripts or files on your site that might contain email addresses, phone numbers, or other sensitive data, you can block them with the robots.txt file. Other directories that often contain sensitive data and that you may not want indexed include /cgi-bin/, /wp-admin/, /cart/, and /scripts/.
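
A robots.txt file that blocks all robots from those directories could look like this (adjust the paths to match your own site):

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /cart/
Disallow: /scripts/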

When a search engine crawler visits your site, it looks for the robots.txt file, which tells the crawler which pages of your site should be indexed and which should be ignored.
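
You can check how a compliant crawler reads your rules with Python's built-in urllib.robotparser module; the domain, path, and user agent below are only placeholders:

from urllib.robotparser import RobotFileParser

# A crawler first downloads robots.txt from the site root.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Prints True if the rules allow Googlebot to fetch this page, False if it is disallowed.
print(parser.can_fetch("Googlebot", "https://www.example.com/wp-admin/settings.html"))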

How do I create a robots.txt file?

Use our Robots.txt Generator above to build the file, then save the generated text as 'robots.txt' in your site's root directory. Note: Google does not support the Crawl-delay directive in robots.txt.
