3 strategies to prevent your tracking URLs from creating duplicate content.
by Casey Markee

  • We have several clients who use various web analytics tracking tools to track events such as the number of visitors who click on a banner ad and go to another page on the site.

    In order to track these events, our clients often append a tracking code to the destination URL, which in effect means the same destination page can be seen as duplicate content because the tracking code creates multiple URLs for the same page. Sometimes the tracked URL even gets a higher PageRank than the actual destination page.

    Do you have any thoughts on how these events can be tracked without creating duplicate content?

Answer: We see a lot of sites having problems with URL-based tracking. Most commonly, the link juice that should be flowing to a specific page is, instead, being diluted between the regular URL and a duplicate tracking URL.
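
For example, the same landing page might get indexed under both the clean URL and a tracked variant like these (the "src" parameter name here is just an illustration):

    http://www.example.com/landing-page
    http://www.example.com/landing-page?src=home-banner

To the engines those look like two separate pages with identical content, so the links pointing at that page get split between the two.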

URL-based tracking can also skew your referral data: when a search engine indexes and serves the tracking URL instead of the regular URL, visitors arriving from Google or other search engines get counted as if they came through the tracked link. That can render your stats meaningless and negate the whole reason for using tracking URLs in the first place.

There are three ways you can fix these problems:

First, you can use JavaScript-based tracking, such as with Google Analytics, WebTrends or ClickTracks. These don't require you to modify your URLs, and you can still see where traffic is coming from.
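
As a rough sketch of the JavaScript route, here is what an event hit might look like with Google Analytics' analytics.js library, assuming the standard tracking snippet is already on the page (the element id and the category/action/label values are made-up examples):

    // Fire a GA event when the banner link is clicked. No change to the
    // destination URL is needed, so no duplicate URLs are created.
    document.getElementById('banner-link').addEventListener('click', function () {
      ga('send', 'event', 'Banner', 'click', 'home-sidebar');
    });

WebTrends and ClickTracks offer their own JavaScript tag equivalents; the point is the same, since the event is recorded in the tag rather than in the URL.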

Second, if you want to use URL-based tracking, you can block the URLs that carry your tracking code from being indexed via your robots.txt file. Just make sure you're blocking the version of the URL with the tracking code appended and not the clean URL itself. If you accidentally block the clean URL, that page won't get indexed at all.
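
A minimal robots.txt sketch of that approach, assuming the tracking code is appended as a query parameter and that the parameter is called "src" (a made-up name, so substitute your own):

    User-agent: *
    Disallow: /*?src=
    Disallow: /*&src=

Google and Bing both honor the * wildcard here, so this blocks any URL containing the src tracking parameter while leaving the clean URLs crawlable.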

Third, let the engines sort it out. Google is generally pretty good at determining which URLs are tracking URLs and filtering them out of the search results. Yahoo and Microsoft, however, still have a significant problem with listing these URLs as duplicate content, ...
