Duplicate Page Content
-
I've got several pages of similar products that Google has listed as duplicate content. I have them all set up with rel="prev" and rel="next" tags telling Google that they are part of a group, but it still has them listed as duplicates. Is there something else I should do for these pages, or is that just a shortcoming of Google's Webmaster Tools?
One of the pages: http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
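For reference, a typical rel=prev/next implementation on a middle page of a paginated series looks like the sketch below. The walnut-countertop-10.html URL is assumed for illustration; the first page of a series would omit the rel="prev" link and the last page would omit rel="next".

```html
<!-- In the <head> of walnut-countertop-9.html (a middle page of the series) -->
<link rel="prev" href="http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-8.html">
<link rel="next" href="http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-10.html">
```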
-
Oh, sorry - didn't catch that some were duplicated. Given the scope, I think I'd put the time into creating unique titles and single-paragraph descriptions. There's a fair shot these pages could rank for longer-tail terms, and the content certainly has value to visitors.
-
You're right. A few pages have unique titles already, but several are duplicates.
-
I'm wondering if we're looking at two different things - I was looking at the pages like:
http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-8.html
http://www.jaaronwoodcountertops.com/wood-countertop-gallery/walnut-countertop-9.html
These already seem to have unique titles.
-
Thanks. We try to make them nice. I'm going to work on adding some content to each page, but it does get difficult when they're so similar. I may just build out a few pages and have them indexed, and noindex the others.
-
The duplicate content is showing up as duplicate titles and description tags. Do you think that if I added titles like "Photographs of J. Aaron Wood Countertops and Butcher Block | Image One" to all the pages and then changed just the image number, that would be enough to eliminate the duplicate-title issues? Would that make any difference if the content is the same?
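To illustrate the idea, the head of each page would differ only in the image number, e.g. (filenames assumed from the gallery pattern above):

```html
<!-- walnut-countertop-8.html -->
<title>Photographs of J. Aaron Wood Countertops and Butcher Block | Image Eight</title>

<!-- walnut-countertop-9.html -->
<title>Photographs of J. Aaron Wood Countertops and Butcher Block | Image Nine</title>
```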
-
It sounds like you're all pretty much saying the same thing as far as the options go. I was so happy when I learned about the rel=prev/next tags.
Do you guys think I should add noindex to all the pages now and remove the noindex as I add content, or should I just leave them as they are and start adding content as I get time? Which is worse for overall site rankings, losing content or having duplicate content?
Dr. Meyers: The duplicate content is showing up as duplicate titles and descriptions. Do you think that if I added titles like "Photographs of J. Aaron Wood Countertops and Butcher Block - Image One" to all the pages and then changed just the image number, that would be enough to eliminate the duplicate-title issues? Would that make any difference if the content is the same?
Thanks guys.
-
This isn't a typical application of rel=prev/next, and I'm finding Google's treatment of those tags is inconsistent, but the logic of what you're doing makes sense, and the tags seem to be properly implemented. Google is showing all of the pages indexed, but rel=prev/next doesn't generally de-index paginated content (like a canonical tag can).
Where is GWT showing them as duplicates (e.g., title, META description)?
Long-term, there are two viable solutions:
(1) Only index the main gallery (NOINDEX the rest). This will focus your ranking power, but you'll lose long-tail content.
(2) Put in the time to write at least a paragraph for each gallery page. It'll take some time, but it's doable.
Given the scope (you're talking dozens of pages, not 1000s), I'd lean toward (2). These pages are somewhat unique and do potentially have value, but you need to translate more of that uniqueness into copy Google can index.
-
On the duplicate pages, use a meta robots noindex, follow tag.
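That tag would go in the head of each duplicate gallery page; a minimal sketch (the main gallery page itself would stay indexable):

```html
<head>
  <!-- Keep the page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

The "follow" directive matters here: it lets link equity continue to flow through the gallery pages even while they are excluded from the index, and the tag can simply be removed later as unique content is added to each page.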
-
Hi
It seems that you have created pages just for the pictures you wanted to display, and Google possibly does not understand the content because there isn't much content. In a nutshell, pages 9 and 10 have almost the same content, just with a different picture.
For your own sake, a unique title on each page will help you get better results, and since you already have a page for each picture, why not add some details to it? Google will like that more. :-)
You might look into a proper CMS in the future, as pages will change!
I like your products. :-)
Regards,
Jim Cetin