Why are my pages getting duplicate content errors?
-
Studying the Duplicate Page Content report reveals that all (or many) of my pages are getting flagged as having duplicate content because the crawler thinks there are two versions of the same page:
http://www.mapsalive.com/Features/audio.aspx
http://www.mapsalive.com/Features/Audio.aspx
The only difference is the capitalization. We don't have two versions of the page so I don't understand what I'm missing or how to correct this. Anyone have any thoughts for what to look for?
-
Dr. Pete doesn't cover case (though it's mentioned in the comments), but just about everything else you might want to know about duplicate content is covered at http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world, including ways to remedy it. It sounds like you've got a plan here, but I'm also adding it for the benefit of others looking at this thread.
-
I think this is one of the most overlooked duplicate content issues. Not sure why it isn't talked about more. I've quite often used upper and lowercase intermittently, e.g. mysite.com/Las-Vegas/ and mysite.com/las-vegas/, not knowing it made any difference.
I guess a .htaccess rewrite to all lowercase is in order. Thanks SEOMoz. You guys rock.
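For anyone on Apache wanting to try that rewrite, here's a rough sketch of a site-wide lowercase redirect (assuming mod_rewrite is enabled; note that `RewriteMap` cannot be declared in .htaccess itself, only in the main server config or a VirtualHost):

```apache
# In httpd.conf or a <VirtualHost> block -- defines a "lc" map
# that lowercases whatever is passed to it:
RewriteMap lc int:tolower

# Then in .htaccess (or the same VirtualHost):
RewriteEngine On
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the lowercased version of the same path.
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Because the rule runs on every request, new pages are covered automatically with no per-page rewrites.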
-
Glad to be of help, Janice.
From a readability perspective, I'd also suggest going with all lowercase.
-
Well, it is a Windows server and my understanding is that it is case-insensitive, but I'll verify this with our hosting provider. In any case, would it be preferable to set up the rewrite from the mixed-case names to all-lowercase names, or vice versa? Or perhaps it doesn't matter.
Thanks for your help with this - lots to learn and work through with these tools.
-
If the server distinguishes upper case and lower case, then from a technical perspective the two URLs could be different files. It's like www.domain.com and domain.com both pointing at the home page - they may show the same content, but technically they could be two different places.
The solution should be set up to not require having to do a rewrite every time a new page is created. It should be automatic.
-
I understand your answer and about setting up rewrites, but what I really want to know is why there are two pages listed (one uppercase, one lowercase) when there is only one physical page on the site. All links within the site point to the page using the uppercase name.
I don't want to have to add a rewrite for the lowercase name every time I add a page to the site - this doesn't seem right which is why I'm wondering if there is something else wrong.
-
Janice,
The proper solution is to configure the server to automatically rewrite URLs into one consistent pattern (typically all lowercase), and to make sure all internal links pointing to other pages on the site use that preferred capitalization. Canonical tags can help alleviate the problem, but they're not a best-practice standalone solution. So speak with the site administrator or programmer to get the rewrite functionality implemented.
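Since the original site runs on Windows/IIS (.aspx pages), the server-level fix would use the IIS URL Rewrite module rather than .htaccess. A sketch of the web.config rule (assuming the URL Rewrite 2.x module is installed; the rule name is arbitrary):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LowercaseUrls" stopProcessing="true">
        <!-- Match any URL containing an uppercase letter
             (ignoreCase must be false for [A-Z] to mean uppercase only) -->
        <match url="[A-Z]" ignoreCase="false" />
        <!-- 301-redirect to the all-lowercase version of the URL -->
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

This applies to every request, so newly added pages are normalized without any per-page work.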