Creating a Robots.txt file

By Sumantra Roy

Some people believe that they should create different pages for different search engines, each page optimized for one keyword and for one search engine. Now, while I don't recommend that people create different pages for different search engines, if you do decide to create such pages, there is one issue that you need to be aware of.

These pages, although optimized for different search engines, often turn out to be quite similar to each other. Search engines can now detect when a site has created such near-duplicate pages, and they penalize or even ban such sites. To prevent your site from being penalized for spamming, you need to stop each search engine's spider from indexing the pages that are not meant for it, i.e. you need to prevent AltaVista from indexing pages meant for Google and vice-versa. The best way to do that is to use a robots.txt file.

You should create the robots.txt file using a plain-text editor like Windows Notepad. Don't use a word processor; it may save hidden formatting that makes the file unreadable to spiders.

Here is the basic syntax of the robots.txt file:

User-Agent: [Spider Name]
Disallow: [File Name]

For instance, to tell AltaVista's spider, Scooter, not to spider the file named myfile1.html residing in the root directory of the server, you would write

User-Agent: Scooter
Disallow: /myfile1.html

To tell Google's spider, called Googlebot, not to spider the files myfile2.html and myfile3.html, you would write

User-Agent: Googlebot
Disallow: /myfile2.html
Disallow: /myfile3.html

You can, of course, put multiple User-Agent statements in the same robots.txt file. Hence, to tell AltaVista not to spider the file named myfile1.html, and to tell Google not to spider the files myfile2.html and myfile3.html, you would write

User-Agent: Scooter
Disallow: /myfile1.html

User-Agent: Googlebot
Disallow: /myfile2.html
Disallow: /myfile3.html

If you want to prevent all robots from spidering the file named myfile4.html, you can use the * wildcard character in the User-Agent line, i.e. you would write

User-Agent: *
Disallow: /myfile4.html

However, you cannot use the wildcard character in the Disallow line.
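You can sanity-check this behavior with the robots.txt parser in Python's standard library. A minimal sketch of the wildcard example above (the domain is a placeholder, not from the article):

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to the example above: block every robot from /myfile4.html.
rules = """\
User-Agent: *
Disallow: /myfile4.html
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# /myfile4.html is disallowed for any spider; other files remain allowed.
print(rp.can_fetch("Scooter", "http://www.example.com/myfile4.html"))  # False
print(rp.can_fetch("Scooter", "http://www.example.com/myfile1.html"))  # True
```

Because the `User-Agent` line is `*`, the same answers come back no matter which spider name you pass to `can_fetch`.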

Once you have created the robots.txt file, you should upload it to the root directory of your domain. Uploading it to any sub-directory won't work - the robots.txt file needs to be in the root directory.

I won't discuss the syntax and structure of the robots.txt file any further - you can get the complete specification from the Robots Exclusion Standard documentation.

Now we come to how the robots.txt file can be used to prevent your site from being penalized for spamming in case you are creating different pages for different search engines. What you need to do is to prevent each search engine from spidering pages which are not meant for it.

For simplicity, let's assume that you are targeting only two keywords: "tourism in Australia" and "travel to Australia". Also, let's assume that you are targeting only three of the major search engines: AltaVista, HotBot and Google.

Now, suppose you have followed this naming convention: each page is named by joining the individual words of its target keyword with hyphens, and then appending the first two letters of the name of the search engine for which the page is optimized.

Hence, the files for AltaVista are

tourism-in-australia-al.html
travel-to-australia-al.html

The files for HotBot are

tourism-in-australia-ho.html
travel-to-australia-ho.html

The files for Google are

tourism-in-australia-go.html
travel-to-australia-go.html

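The naming convention is mechanical enough to express as a short helper. A sketch in Python (the function name `page_name` is my own, not from the article):

```python
def page_name(keyword, engine):
    """Join the keyword's words with hyphens, then append the first
    two letters of the search engine's name and the .html extension."""
    return "-".join(keyword.lower().split()) + "-" + engine[:2].lower() + ".html"

print(page_name("tourism in Australia", "AltaVista"))  # tourism-in-australia-al.html
print(page_name("travel to Australia", "Google"))      # travel-to-australia-go.html
```

Generating the names programmatically keeps the page files and the robots.txt entries consistent with each other.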
As I noted earlier, AltaVista's spider is called Scooter and Google's spider is called Googlebot.

A list of the spiders used by the major search engines can be found online.

Now, we know that HotBot uses Inktomi and from this list, we find that Inktomi's spider is called Slurp. Using this knowledge, here's what the robots.txt file should contain:

User-Agent: Scooter
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html

User-Agent: Slurp
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html

User-Agent: Googlebot
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html

When you put the above lines in the robots.txt file, you instruct each search engine not to spider the files meant for the other search engines.

When you have finished creating the robots.txt file, double-check to ensure that you have not made any errors anywhere in it. A small error can have disastrous consequences - a search engine may spider files which are not meant for it, in which case it can penalize your site for spamming, or, it may not spider any files at all, in which case you won't get top rankings in that search engine.
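One way to do that double-checking automatically is to feed the finished file to Python's built-in robots.txt parser and verify that each spider is blocked from exactly the pages meant for the other engines. A sketch, assuming the file from the example above and a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-Agent: Scooter
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html

User-Agent: Slurp
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html

User-Agent: Googlebot
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Which filename suffix belongs to which spider.
SUFFIX = {"Scooter": "al", "Slurp": "ho", "Googlebot": "go"}
KEYWORDS = ["tourism-in-australia", "travel-to-australia"]

for spider, own in SUFFIX.items():
    for kw in KEYWORDS:
        for suffix in SUFFIX.values():
            url = "http://www.example.com/%s-%s.html" % (kw, suffix)
            allowed = rp.can_fetch(spider, url)
            # Each spider may fetch only the pages carrying its own suffix.
            assert allowed == (suffix == own), (spider, url)

print("robots.txt logic checks out")
```

If any rule is missing or misspelled, the assertion pinpoints the spider and URL that slipped through, which is far faster than eyeballing the file.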

A useful tool to check the syntax of your robots.txt file can be found online. While such a tool will help you correct syntactical errors in the robots.txt file, it won't help you correct any logical errors, for which you will still need to go through the robots.txt file thoroughly, as mentioned above.

About the Author
Article by Sumantra Roy. Sumantra is one of the most respected search engine positioning specialists on the Internet. For more advice on how you can take your web site to the top of the search engines, subscribe to his FREE newsletter.
