All about "Robots.txt"
 by Ian Cook

How to tell a Search Engine spider to politely "Go Away" -- and why you might want to.

Believe it or not, there are some circumstances under which you don’t want search engines to index portions of your web site. In such cases, you need a way to tell their "spiders" which sections of your web site are "hands off" and should NOT be included in their index.

For instance, you may have some test pages that you're working on, or perhaps you don't want your competition to find your site's web traffic logs indexed on a search engine. Another very good reason may be that you are "joint venturing" a product or service with another company. Perhaps you are paying them a commission based on sales from a particular web site they are referring customers to. If a Search Engine should happen to index that site, you might find yourself paying a partner an undeserved commission on sales that were in fact generated by the search engine.

Whatever the reason, here are the primary methods web designers use to shield pages from web spiders and thus prevent search engines from indexing them:

Password Protection

The most secure method of restricting access to web pages, whether the visitor is a spider or a person with a browser, is to password protect the directory. This is the same method we use to authenticate you when you access the pages of this newsletter.

Rather than trying to explain in detail how to configure this option here, I'll point you to a good NCSA reference that can help you out: http://hoohoo.ncsa.uiuc.edu/docs/tutorials/user.html. Many sites also let you do this from their "Control Panel". You should check with your web host -- they may already have an easy way for you to p...
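To give you a rough idea of what this looks like on an NCSA or Apache-style server, here is a minimal sketch. The file names and paths below are only examples, and your host's setup may differ, so treat this as an illustration rather than a recipe:

    # .htaccess -- placed in the directory you want to protect
    AuthType Basic
    AuthName "Members Only"
    AuthUserFile /home/yoursite/.htpasswd
    AuthGroupFile /dev/null

    <Limit GET>
    require valid-user
    </Limit>

The password file itself (the hypothetical /home/yoursite/.htpasswd above) is normally created with the htpasswd utility, for example "htpasswd -c /home/yoursite/.htpasswd yourname", which prompts you for the password. The NCSA tutorial linked above walks through these steps in detail.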
