Omniture tracking code URLs creating duplicate content
-
My ecommerce company uses Omniture tracking codes for a variety of different tracking parameters, from promotional emails to third-party comparison shopping engines (CSEs). All of these tracking codes create URLs that look like www.domain.com/?s_cid=(tracking parameter), which are identical to the original page, and these dynamic tracking URLs are being indexed. The cached version is still the original page.
For now, the duplicate versions do not appear to be affecting rankings, but as we ramp up for holiday sales, promotions, additional CSEs, etc., there will be more and more tracking URLs that could potentially hurt us.
What is the best solution for this problem?
If we use robots.txt to block the ?s_cid versions, it may affect our listings on CSEs, as their bots will try to crawl the links to find product info/pricing but will be denied. Is this correct?
Or, do CSEs generally use other methods for gathering and verifying product information?
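For reference, the robots.txt blocking I'm talking about would be something like the sketch below (assuming s_cid only ever appears as a query-string parameter; the * wildcard is honored by Google and Bing but isn't part of the original robots.txt standard):

```
User-agent: *
# Block the tracked versions whether s_cid is the first parameter or a later one
Disallow: /*?s_cid=
Disallow: /*&s_cid=
```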
So far the most comprehensive solution I can think of would be to add a rel=canonical tag to every unique static URL on our site, which should solve the duplicate content issues. But we have thousands of pages, and doing this by hand would take an eternity (unless someone knows a good way to do it automagically; I'm not a programmer, so maybe there's a way that I don't know about).
Any help/advice/suggestions will be appreciated. If you have any solutions, please explain why your solution would work, so I can understand the issue on a deeper level in case something like this comes up again in the future.
Thanks!
-
Thanks for the detailed response and confirmation about the canonical being the best solution. This definitely helps.
Some of the tracking URLs are actually being indexed. It doesn't seem to be negatively affecting anything right now, but I'd prefer to prevent any potential future problems if possible.
Thanks again.
-
I think the canonical is probably your best bet here. You can solve it with a 301-redirect, too, but that's a lot trickier. If you're really running into trouble, parameter blocking in GWT is OK here. Again, it's not my first choice, but it's not a black-and-white issue (just ideal vs. not-so-ideal).
If your pages were truly static, you'd have to write a canonical tag for each one, but most sites at least have a shared header and some dynamic components. In other words, your thousands of pages may actually be only a handful of physical templates. In that case, you may be able to add the canonical tags in as little as one template (with some code). Unfortunately, this is completely dependent on the platform you're on; there's no universal answer (and the code depends entirely on your URL structure). You'll probably need some quality time with your coders on that one.
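Just to make the idea concrete, here's a rough sketch of what that template-level approach could look like. It assumes a Python/Flask stack purely for illustration (your actual platform is almost certainly different), but the same logic ports to any server-side language:

```python
# Hypothetical sketch: generate a self-referencing canonical URL once,
# in a shared template context, rather than hand-editing thousands of pages.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

from flask import Flask, request

app = Flask(__name__)

@app.context_processor
def inject_canonical():
    parts = urlsplit(request.url)
    # Drop only the tracking parameter; keep legitimate parameters
    # (pagination, sorting, etc.) so those pages aren't collapsed together.
    kept_params = [(k, v) for k, v in parse_qsl(parts.query) if k != "s_cid"]
    canonical_url = urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(kept_params), "")
    )
    return {"canonical_url": canonical_url}

# The shared header template then needs a single line to cover every page:
#   <link rel="canonical" href="{{ canonical_url }}" />
```

The key design decision is whether to strip just the known tracking parameters or the entire query string; stripping everything is simpler, but it can accidentally canonicalize pages (like paginated category pages) that you may want treated separately.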
The first thing I'd do, though, is to monitor your site with the "site:" operator in Google, along with "inurl:s_cid". In some cases, Google doesn't crawl these tracking URLs (or knows they're common to an analytics package). If they aren't being indexed, you may not have a problem here.
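For example, a query along these lines (with your real domain) will show whether any of the tracking URLs have made it into the index:

```
site:www.domain.com inurl:s_cid
```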
-
Thanks for the response.
The article doesn't deal with my specific issue exactly, but it does suggest using a rel=canonical in similar cases (affiliate tracking).
Using GWT to block parameters is a useful suggestion too, but isn't "recommended as a first line of defense" according to that article. I'll definitely use it in addition to whatever is best though.
Right now, the canonical tag seems like the best solution. Does anyone have any ideas on implementing these tags dynamically across the site's unique pages using code? Is this even possible?
Thanks!
-
I think a previous article deals with this pretty well. I would read the whole article, but also take a look at using GWT to keep particular URL parameters from being indexed. Here is the link; I hope it helps.
http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world