Robots.txt Primer: Get Your Pages Indexed Faster by Controlling Google’s Spider
One of the most critical SEO tasks is controlling the search engine spiders (like Googlebot) that crawl and index your website. Mastering these spiders is essential for preventing duplicate content and for ensuring that search engines focus primarily on your most important pages.
|Spider? Bot? Crawler?|
|The terms spider, crawler, bot and robot all generally refer to the same thing. Technically, a bot is any program that downloads pages off the web, while a spider is a bot that the search engines use to build their index. But you'll often hear one being used to refer to the other, and the distinction isn't especially important.|
Controlling Search Spiders with Robots.txt
Picture your robots.txt file as the tour guide to your site for the search engines. It provides a map that tells search engines where to find the content you want them to crawl, and which areas they should stay out of.
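As a concrete illustration, here is a minimal robots.txt sketch. The paths shown (/admin/, /search/, and the sitemap location) are hypothetical examples, not recommendations for any particular site:

```
# Rules for all spiders (the * wildcard matches any user agent)
User-agent: *
Disallow: /admin/        # keep bots out of the admin area
Disallow: /search/       # avoid crawling internal search result pages

# Rules for a specific spider override the general block for that bot
User-agent: Googlebot
Disallow: /search/

# Point spiders at your XML sitemap (optional, hypothetical path)
Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of your domain (e.g., https://www.example.com/robots.txt); spiders will not look for it anywhere else. Note that a spider reads the most specific User-agent group that matches it, so Googlebot would follow only the rules in its own section here.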