Duplicate page titles in SEOmoz
-
My on-page reports are showing a good number of duplicate title tags, but they are all caused by a URL tracking parameter that tells us which link the visitor clicked. For example, http://www.example.com/example-product.htm?ref=navside and http://www.example.com/example-product.htm are the same page, but they are treated as two different URLs in SEOmoz. This is inflating my reports with "fake" duplicate page titles.
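To illustrate what I mean, here's a hypothetical sketch (the `TRACKING_PARAMS` set is just our own list of parameters, not anything Moz actually uses) of the normalization I'd expect a crawler to apply before comparing pages:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters we add for click attribution; hypothetical list for illustration
TRACKING_PARAMS = {"ref", "utm_source", "utm_medium",
                   "utm_campaign", "utm_term", "utm_content"}

def normalize(url):
    """Strip known tracking parameters so tracked and untracked URLs compare equal."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

print(normalize("http://www.example.com/example-product.htm?ref=navside"))
# → http://www.example.com/example-product.htm
```

If both URLs normalized to the same string, the crawler would report one page instead of two.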
This has not been a problem with Google, but SEOmoz treats them as separate pages and it's muddying my data. Is there a way to flag this as a URL parameter in the Moz software?
Or does anybody have another suggestion? Should I specify this in Google Webmaster Tools (GWT) and Bing Webmaster Tools (BWT)?
-
The best way to handle this for all crawlers, including Google, Yahoo, and Moz, is to make sure you have proper canonical tags on those URLs pointing to the non-parameterized URL.
So http://www.example.com/example-product.htm?ref=navside should have a canonical that points to http://www.example.com/example-product.htm
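In markup, that's a single tag in the head of the product page (served on both the clean and the parameterized URL):

```html
<!-- In the <head> of example-product.htm, with or without ?ref= -->
<link rel="canonical" href="http://www.example.com/example-product.htm" />
```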
-
My understanding is that the Moz crawler should be checking the canonical, in which case it will ignore the duplicate content and title tag issues. If you find this is not the case with your crawl, please let our help team know at help@moz.com
-
No, Moz's tool isn't checking the canonical here, so it doesn't ignore these duplicates on that basis.
-
Do you think that setting a canonical url tag might help fix this?
-
Hi Robert,
Yes, I'm sorry, but you're overlooking something: ignoring parameters is something you do for SEO. It won't stop Google Analytics from tracking those parameters.
-
I'm not sure why I'd want to ask Google to ignore those parameters... we're explicitly adding the ones they suggested we use from here:
The issue I'm having is that Moz Analytics is flagging duplicate pages as an issue to resolve when the only difference is that these parameters exist.
Am I overlooking something here?
-
Hi Robert,
I wouldn't handle this via robots.txt unless you only want it to apply to Rogerbot. The best way to tell Google to ignore your UTM parameters is via Google Webmaster Tools: under Crawl > URL Parameters you have the option to add parameters that don't change any of the content and are used solely for tracking.
-
I'm having a similar issue. Is there an example of how to add this to the robots.txt file to ignore the utm stuff for RogerBot?
Our scenario is that we send out PDFs with links to pages on our site and those links have utm parameters included... and are showing up as duplicate content.
Thanks in advance,
Robby
-
I think this is the answer I was looking for... yeah, GWT already has a bunch of our parameters added and hasn't had a problem with this one; it's not flagging these pages as duplicates the way SEOmoz does.
Thanks guys!
-
Hi,
What you might want to do to get rid of these issues within SEOmoz is add the parameters to your robots.txt file and specifically target SEOmoz's user agent, Rogerbot. That way SEOmoz won't crawl the links with this parameter and, as a result, won't warn you about these duplicate titles.
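A sketch of what that rule could look like, assuming Rogerbot honors Googlebot-style wildcard patterns (worth confirming with Moz support before relying on it):

```
# Apply only to Moz's crawler; other bots keep crawling these URLs
User-agent: rogerbot
Disallow: /*?ref=
Disallow: /*&ref=
Disallow: /*?utm_
Disallow: /*&utm_
```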
Hope this helps!
By the way, as James already mentioned, I would also recommend configuring these parameters within Google Webmaster Tools.
-
You could set it up in GWT, but it sounds like you are using UTM tags on internal links so you can see which physical links on a page are driving clicks. If that's the case, a cleaner solution is to upgrade your Google Analytics code to Enhanced Link Attribution. I'm assuming you are using GA; if so, this will let you see which links are driving which clicks without creating tons of duplicate page titles in SEOmoz.
See link: http://support.google.com/analytics/bin/answer.py?hl=en&answer=2558867
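For reference, the classic ga.js form of that upgrade is a one-line plugin require ahead of the usual tracking calls. The snippet below is a rough sketch: UA-XXXXXX-Y is a placeholder for your own property ID, and newer analytics.js setups use ga('require', 'linkid') instead, so check Google's current docs for your tracking code version:

```javascript
var _gaq = _gaq || [];
// Load the Enhanced Link Attribution plugin before the pageview fires
var pluginUrl = '//www.google-analytics.com/plugins/ga/inpage_linkid.js';
_gaq.push(['_require', 'inpage_linkid', pluginUrl]);
_gaq.push(['_setAccount', 'UA-XXXXXX-Y']); // placeholder property ID
_gaq.push(['_trackPageview']);
```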
Let me know if you have questions,
JS