Duplicate page titles in SEOMoz
-
My on-page reports are showing a good number of duplicate title tags, but they are all caused by a URL tracking parameter that tells us which link the visitor clicked on. For example, http://www.example.com/example-product.htm?ref=navside and http://www.example.com/example-product.htm are the same page, but they are treated as two different URLs in SEOmoz. This inflates the number of duplicate page titles in my reports.
This hasn't been a problem with Google, but SEOmoz treats them as separate pages, and it's muddying my data. Is there a way to flag this as a tracking-only URL parameter in the Moz software?
Or does anybody have another suggestion? Should I specify this in Google Webmaster Tools (GWT) and Bing Webmaster Tools (BWT)?
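For illustration, the problem is purely one of URL normalization: the two URLs differ only in a tracking-only query parameter. Here is a minimal sketch of how such URLs can be normalized so they compare equal (this is not how any particular crawler actually works, and the parameter list is a made-up example):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical set of tracking-only parameters; adjust for your own site.
TRACKING_PARAMS = {"ref", "utm_source", "utm_medium", "utm_campaign"}

def normalize(url):
    """Strip tracking-only query parameters so equivalent URLs compare equal."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(normalize("http://www.example.com/example-product.htm?ref=navside"))
# → http://www.example.com/example-product.htm
```

A crawler that applied this kind of normalization would count the two URLs from the question as one page; a crawler that doesn't will report them as duplicates.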
-
The best way to handle this, for all crawlers including Google, Yahoo, and Moz, is to make sure those URLs carry proper canonical tags pointing to the non-parameterized URL.
So http://www.example.com/example-product.htm?ref=navside should have a canonical tag that points to http://www.example.com/example-product.htm
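Concretely, that means the `<head>` of the parameterized page would include something like:

```html
<!-- In the <head> of http://www.example.com/example-product.htm?ref=navside -->
<link rel="canonical" href="http://www.example.com/example-product.htm" />
```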
-
My understanding is that the Moz crawler does check the canonical tag, in which case it will ignore the duplicate content and duplicate title issues. If you find this is not the case with your crawl, please let our help team know at help@moz.com.
-
No, in my experience Moz's tool doesn't check the canonical tag, so it won't ignore these duplicates.
-
Do you think that setting a canonical URL tag might help fix this?
-
Hi Robert,
Yes, I'm sorry, but you're overlooking something. Ignoring the parameters is something you do for SEO purposes only; it won't stop Google Analytics from tracking them.
-
I'm not sure why I'd want to ask Google to ignore those parameters... we're explicitly adding the ones that they suggested we use from here:
The issue I'm having is that Moz Analytics is flagging duplicate pages as an issue to resolve when the only difference is that these parameters exist.
Am I overlooking something here?
-
Hi Robert,
I would only handle this via robots.txt if you want to do it for Rogerbot alone. The best way to tell Google to ignore your UTM parameters is via Google Webmaster Tools: under Crawl > URL Parameters you have the option to add parameters that don't change any of the content and are used solely for tracking.
-
I'm having a similar issue. Is there an example of how to add this to the robots.txt file so RogerBot ignores the UTM parameters?
Our scenario is that we send out PDFs with links to pages on our site, those links include UTM parameters... and the pages are showing up as duplicate content.
Thanks in advance,
Robby
-
I think this is the answer I was looking for... Yeah, GWT already has a bunch of our parameters added and hasn't had a problem with this one. It isn't showing these pages as duplicates the way SEOmoz does.
Thanks guys!
-
Hi,
What you might want to do to get rid of this issue within SEOmoz is add the parameters to your robots.txt file and specifically target SEOmoz's user agent, Rogerbot. That way SEOmoz won't crawl links with this parameter and therefore won't warn you about these duplicate titles.
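As a rough sketch, using the `ref` parameter from the original question (this assumes Rogerbot honors Google-style `*` wildcards in `Disallow` rules; check Moz's Rogerbot documentation to confirm):

```text
User-agent: rogerbot
Disallow: /*?ref=
Disallow: /*&ref=
```

Because the block targets the `rogerbot` user agent only, Google and other crawlers are unaffected.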
Hope this helps!
By the way, as James already mentioned, I'd also recommend configuring these parameters within Google Webmaster Tools.
-
You could set it up in GWT, but it sounds like you're using UTM tags on internal links so you can see which physical links on a page are driving clicks. If that's the case, a cleaner solution is to upgrade your Google Analytics code for Enhanced Link Attribution. I'm assuming you're using GA; if so, this will let you see which links are driving which clicks without creating tons of duplicate page titles in SEOmoz.
See link: http://support.google.com/analytics/bin/answer.py?hl=en&answer=2558867
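For reference, enabling Enhanced Link Attribution in the classic ga.js tracker was a small addition to the standard snippet, roughly like this (`UA-XXXXX-Y` is a placeholder for your own property ID; check the linked support article for the current, exact snippet for your tracker version):

```html
<script type="text/javascript">
  var _gaq = _gaq || [];
  // Load the in-page link attribution plugin before other commands.
  var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
  _gaq.push(['_require', 'inpage_linkid', pluginUrl]);
  _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
  _gaq.push(['_trackPageview']);
</script>
```

You also need to turn on Enhanced Link Attribution in the property settings of the GA admin interface for the in-page analytics reports to use it.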
Let me know if you have questions,
JS