Robots.txt Generator

The generator provides the following options (each is explained in the guide below):

Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: The path is relative to root and must contain a trailing slash "/".

Once the text is generated, create a robots.txt file in your site's root directory, then copy the generated text and paste it into that file.


About Robots.txt Generator

When a search engine's spider crawls a site, robots.txt is the first file it looks for. If the file is found, the spider reads its rules to learn which directories it may index and which it must skip.
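
For example, for a site at https://www.example.com (an illustrative domain), a crawler requests the file from a fixed, well-known location before fetching anything else:

    https://www.example.com/robots.txt

A minimal robots.txt that lets every robot crawl everything uses the two core directives, User-agent and Disallow (an empty Disallow value means nothing is blocked):

    User-agent: *
    Disallow: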

Why is it important?

It is very important to have a robots.txt file in the root folder of your website; without one, you have no way to control how search engines crawl your site. In short, robots.txt is the file where you tell the spider where to crawl and where not to. For example, your website may have pages you don't want a search engine to index, such as an admin page, an under-maintenance page, or duplicate-content pages, and a robots.txt file lets you exclude them.
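
As a sketch of what that looks like, the following rules block all robots from an admin page, an under-maintenance page, and a duplicate-content directory (the paths /admin/, /maintenance/, and /duplicates/ are illustrative names, not requirements):

    User-agent: *
    Disallow: /admin/
    Disallow: /maintenance/
    Disallow: /duplicates/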

How to create a robots.txt file?

You can create a robots.txt file manually, but that is unwise if you are not very technical. That is where our free robots.txt generator comes in: it generates your robots.txt file instantly and for free.

Sitemap vs. Robots.txt

The two look similar, but they serve nearly opposite purposes. A sitemap tells search engines about all the pages on your website that should be crawled, whereas a robots.txt file tells a crawler which pages to visit and which to skip. A sitemap helps any site get indexed by search engines, whereas robots.txt is only needed when you have pages that should not be crawled.
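
The two files also work together: robots.txt can point crawlers at your sitemap with the Sitemap directive. In this illustrative example, the sitemap is assumed to live at the site root:

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap.xml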

How to Use the Free Robots.txt Generator by w3-Tools?

Not all of the options are mandatory, but choose each one carefully. If you're unsure about an option, leave it at its default.

  1. First, there is a default option for all robots. Recommended: Allowed.
  2. Crawl-Delay is used to prevent crawlers from overloading the host server. Recommended: Default.
  3. Sitemap: Very important - make sure you have a sitemap and link it here.
    1. If you don't have one, you can create one with our Free Sitemap Generator.
  4. If you want to stop a specific search robot from crawling your website, use these settings. Recommended: Default.
  5. Restricted Directories: This option blocks crawlers from visiting specific pages. List every path you don't want crawlers to index; each path is relative to root and must end with a trailing slash "/".
  6. Click Create and save the output as robots.txt. A sample of the generated output is shown below.
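
For reference, here is a sketch of what the generated file might look like with the default set to Allowed, one restricted directory, a 10-second crawl-delay, and a linked sitemap (the domain, the /admin/ path, and the delay value are illustrative; note that Crawl-delay is honored by some crawlers such as Bing and Yandex but ignored by Google):

    User-agent: *
    Disallow: /admin/
    Crawl-delay: 10

    Sitemap: https://www.example.com/sitemap.xml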

Browse more free tools by w3academy:

1. 100% Free Backlink Maker

2. 100% Free XML Sitemap Generator

3. Meta Tag Analyzer