Put Your Robots.txt Files to the Test With Google's Testing Tool


A robots.txt file can make or break your site, and digging into it to find errors can be a complex, daunting task. Google has stripped away a layer of that complexity by releasing an update to its robots.txt testing tool. The tool lets you test URLs against your robots.txt file and highlights the line in the file that is disallowing each page you test. You can then fix the directive as you go, retest to confirm the fix works, and upload the new file for use.
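For context, here is a minimal, hypothetical robots.txt; the /private/ path and the exception below it are made up for illustration. A line like the Disallow directive here is exactly what the tester highlights when you check a blocked URL:

    # Hypothetical robots.txt -- paths are examples only
    User-agent: *
    Disallow: /private/                   # blocks every URL under /private/
    Allow: /private/public-report.html    # exception to the rule above

If you tested a URL under /private/ in the tool, it would point to the Disallow line as the directive responsible for blocking it.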

The tool also lets you view older versions of your robots.txt file, so you can see when the changes that caused crawling errors were made in the first place. The testing tool is located in Webmaster Tools under the Crawl section.

Robots.txt is not something to take on lightly. It's an advanced strategy that anyone serious about SEO needs to get familiar with. However, understand that when it's done incorrectly, your site can end up dropped from Google entirely. Use our featured article to get a clear picture of how to approach your robots.txt file, and note how small a mistake can be, as shown below.
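As a hypothetical illustration of how one character can make the difference between blocking a single directory and blocking the whole site (the /temp/ path is an example only):

    # Blocks only URLs under /temp/
    User-agent: *
    Disallow: /temp/

    # Blocks crawlers from EVERY page on the site
    User-agent: *
    Disallow: /

The second rule is the kind of error that can remove your entire site from Google's index, which is why testing each change before uploading it matters.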

...
