Duplicate Page Content
-
Hi there,
We keep getting duplicate page content issues, but it's not actually the same page.
E.g. there might be 5 pages in, say, a Media Release section of the website, and each URL says page 1, 2, etc. However, it still comes up as duplicate. How can this be fixed so Moz knows it's actually different content? -
Thanks, all - will give those options a try and see which works best for us.
-
Hi!
I suggested the noindex in order to deindex pages that may already be indexed. But, yes, the rel="canonical" should be doing the same (the problem is that Google may not respect it).
The nofollow is there to keep the crawler from wasting crawl budget following the links on those (many) pages.
-
Gianluca,
Wouldn't it be much more work to identify whether the parameter is set and then add the noindex meta? Wouldn't it be easier to just set the canonical? I'm sure that's a dynamic site, so it's just one canonical call without any extra code (PHP or whatever).
And why the nofollow? If I'm just preventing that page from being indexed because it would constitute a duplicate content issue, noindex should be enough in this case.
We recently fixed a similar issue with our blog tags, which were showing duplicate content on about 400 pages. We fixed it by adding the noindex (they already had the canonical, but it wasn't enough, as the canonical couldn't point to a definitive version, since that changed depending on whether or not the tag gained another post). Within a few days all those pages were deindexed; we noticed a loss in search traffic, so I decided to run a small test removing the noindex tag. Result: 2 weeks later, none of those pages had returned to the index (I added the noindex tag back, as it was just a test to see if we could regain that traffic, but ultimately I decided it wouldn't be worth taking on a duplicate content issue for that lost traffic).
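Not saying this is how anyone here actually implemented it, but a minimal sketch (in Python, purely for illustration) of the "identify if the parameter is set and then add the noindex meta" logic could look like this; the checkin/checkout parameter names are assumptions taken from the example URL in this thread:

```python
from urllib.parse import urlsplit, parse_qsl

# Hypothetical parameter names, assumed from the example URL in this
# thread; substitute whatever your site actually uses.
DATE_PARAMS = {"checkin", "checkout"}

def robots_meta(url):
    """Return a noindex,nofollow robots meta tag for date-parametered
    variants, and nothing for the clean, indexable URL."""
    params = {name for name, _ in parse_qsl(urlsplit(url).query)}
    if params & DATE_PARAMS:
        return '<meta name="robots" content="noindex,nofollow">'
    return ""

print(robots_meta("http://www.hihh.com.au/property-details"
                  "?hihhpropertyId=HCP006&checkin=2013-08-06"))
# <meta name="robots" content="noindex,nofollow">
```

The template would echo whatever that function returns into the page head, so only the dated variants carry the tag.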
-
Federico is right.
Your duplicate content issue is due to the date parameters; hence you are potentially duplicating every page that has that calendar, for every possible combination of dates... and that is a huge issue.
You should implement the rel="canonical" so that all these kinds of URLs have the parameter-free URL as their canonical.
Or, even better, you should implement the meta robots "noindex,nofollow" on every date-parametered URL.
That said, the most logical thing to do would have been to block these URLs via robots.txt when launching the site. Unfortunately, blocking these URLs now is not enough, as they are already indexed (even if they don't appear in the results because Google filters them out).
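For the record, a launch-time robots.txt rule along those lines might have looked something like this (wildcard patterns, with the parameter names assumed from the example URL in this thread):

```
User-agent: *
Disallow: /*checkin=
Disallow: /*checkout=
```

Google supports the `*` wildcard in robots.txt paths, so either pattern matches any URL containing that parameter anywhere in the query string.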
-
Ah, you mean that if the dates of the reservation change, it creates duplicate page content?
If that's the case, you should use the rel="canonical" pointing to the definitive page: no dates selected, just the page that shows the property.
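A minimal sketch of how that canonical URL could be computed server-side (in Python, purely for illustration; the checkin/checkout/search parameter names are assumptions taken from the example URL in this thread):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only create dated variants of the same property page
# (names assumed from the example URL in this thread).
DUPLICATE_PARAMS = {"checkin", "checkout", "search"}

def canonical_url(url):
    """Strip the duplicate-generating parameters, keeping identifying
    ones (e.g. the property ID), so the canonical is the dateless page."""
    parts = urlsplit(url)
    kept = [(name, value) for name, value in parse_qsl(parts.query)
            if name not in DUPLICATE_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url(
    "http://www.hihh.com.au/property-details"
    "?hihhpropertyId=HCP006&checkin=2013-08-06&checkout=2013-08-09"))
# http://www.hihh.com.au/property-details?hihhpropertyId=HCP006
```

Every dated variant would then emit a rel="canonical" link pointing at that dateless URL.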
-
Did you try adding the rel="canonical" tag to the pages?
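For reference, the tag itself would look something like this in the head of each dated variant (the href is illustrative, based on the example URL in this thread, and would point at the parameter-free version of the page):

```html
<!-- In the <head> of every dated variant of the property page;
     href is illustrative, pointing at the parameter-free URL -->
<link rel="canonical" href="http://www.hihh.com.au/property-details?hihhpropertyId=HCP006">
```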
-
So they might look at this page: http://www.hihh.com.au/property-details?hihhpropertyId=HCP006&checkin=2013-08-06&checkout=2013-08-09&search=checkindate%3D2013-08-06%26checkoutdate%3D2013-08-09
Then the same page would come up on the error list but with different dates.
-
Can you provide us with some examples? It would make our job easier.
-
It's basically all separate pages/URLs with different information on each. However, each page seems to be crawled for every possible range, e.g. of check-in/check-out dates. The crawler goes through a range of dates and thinks each page has different information, when it's all exactly the same.
-
Is the issue with the pagination? Sometimes pages from categories/tags/etc. can show the same content as an exact page elsewhere on the site.
If that's the issue, I would recommend you add a noindex meta to the least important pages (tags, for example).
Hope that helps.