Will duplicate product information paragraphs negatively impact our site?
-
We are selling paint and have separate pages for different colour cans, each with their own unique description.
We would like to include a few additional paragraphs of product information below each description, but these will be identical across all the products. Do you think this will be a problem as duplicate content?
-
I wouldn't say there's a big chance of a penalty here. That said, it's an area where you could be 'adding value' and uniqueness to your pages and you're not doing it. So your pages may be 'less competitive' and you may be missing out on an opportunity. It's more of a competitive missed opportunity than an 'error' per se.
In reality you should have one product page for each product and then just have 'product variants' for things like quantity, size, colour etc. On the modern web people find this easier to navigate, and since many sites do offer that, they might seem like more competitive places to shop for paint than your site. Price does matter, but it's not the sole arbiter of how products are ranked in Google's search results; other factors matter too. Unless you have a virtual monopoly on the product (only you can sell it, or only you can sell it at a greatly discounted price due to a special relationship with the supplier), I would consider the UX and design of your site. No one wants an 'arse-ache' of a browsing experience.
Many tools will flag what you are about to do as duplicate content, and they're technically right. But instead of going on some crazy copywriting crusade, think about the architecture of your site. You can still have separate URLs for different product variations if you want, even via URL parameters (though that's a fairly 'basic' implementation). If you make it clear to Google, through a new, more streamlined architecture, that they're all actually the same product, the duplicate description(s) won't matter 'as much' (though they'll still be a missed opportunity for more diverse rankings IMO).
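To sketch the difference (all URLs are hypothetical), the parameter-based approach versus a single product page with on-page variant selection might look like:

```text
# Parameter-based variant URLs (the 'basic' implementation):
https://www.example.com/paint/interior-matt-emulsion?colour=duck-egg-blue
https://www.example.com/paint/interior-matt-emulsion?colour=sage-green

# Single product page, colour chosen as a variant on the page itself:
https://www.example.com/paint/interior-matt-emulsion
```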
You can make it even more apparent to Google that all the different variations are actually the 'same product' by utilising Product schema and some of the deeper stuff like ProductModel, which will bind it all together. Whatever you implement, test it with Google's structured data testing tool. If the tool throws errors and warnings, keep working away until they're all fixed.
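As a rough sketch of the idea (product name, SKU and price are all hypothetical), each colour variant page could mark itself up as a Product that is a variant of one shared ProductModel:

```html
<!-- JSON-LD on a single colour-variant page; values are illustrative only -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Interior Matt Emulsion - Duck Egg Blue",
  "sku": "PAINT-DEB-2-5L",
  "color": "Duck Egg Blue",
  "isVariantOf": {
    "@type": "ProductModel",
    "name": "Interior Matt Emulsion"
  },
  "offers": {
    "@type": "Offer",
    "price": "24.99",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Every colour page referencing the same ProductModel is what 'binds it all together' for Google.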
Canonical tags are another option, but they will decrease your ranking 'footprint', and in this case I wouldn't recommend them despite the 'slight' content duplication risk (which in reality is mostly negligible).
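For completeness, if you did go the canonical route, each colour-variant page would point at the main product page with a tag like this (URL hypothetical):

```html
<!-- In the <head> of each colour-variant page -->
<link rel="canonical" href="https://www.example.com/paint/interior-matt-emulsion" />
```

That consolidates ranking signals onto one URL, which is exactly why it shrinks your ranking 'footprint'.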
Final note: you say you have 'unique' descriptions, but remember if they're used elsewhere online they're not unique. If they're unique internally that's great, but if you got them all from a supplier then... obviously loads of other sites are probably using them, which could easily be a big issue for you
-
Hi Justin,
Great question. To help answer it, I will use a quote from Google's support document on duplicate content:
https://support.google.com/webmasters/answer/66359?hl=en
"Examples of non-malicious duplicate content could include:
- Discussion forums that can generate both regular and stripped-down pages targeted at mobile devices
- Store items shown or linked via multiple distinct URLs
- Printer-only versions of web pages
"
I think your situation would likely fall into a similar "acceptable" category as the store items example I highlighted. Keep in mind that although duplicate content should really be avoided when possible, Google does NOT actually penalize sites for having it.
Although I would try to keep the overall amount of duplicate content to a minimum, it shouldn't be too big of an issue. Utilize the unique descriptions; in that case, you likely won't have to worry too much about the duplicate content.
I hope that helps!
Best,
Alex Ratynski
-
Hi Joe,
Thanks for your help, it would probably be about 50%, but we could look to make this more like 80% unique content if you think this will help.
-
Hello,
How much of the copy is unique per page?
Regarding content originality, the general rule I've worked to is 80% unique content per page.
Related Questions
-
Regional sites built on different platforms - will this solution for international targeting work?
We are working with our dev team on a few upcoming user stories to improve store.hp.com. We came across a question which isn't clear in the international targeting documentation. Within http://store.hp.com, we have a number of regional stores, but those are often built on separate platforms. Therefore a story developed on the US infrastructure doesn't carry over to Canada and so forth. The Canada Store is managed by a different team, so that story needs to get scoped, prioritized, etc. independently. In regards to helping Google understand page equivalence, will Google accept the page relationship if we include hreflang tags exclusively in the sitemap for the US site and exclusively as page-level markup for the Canada site? For example:
http://store.hp.com/CanadaStore (hreflang notation at page-level):
<link rel="alternate" hreflang="en-us" href="http://store.hp.com/us/en" />
<link rel="alternate" hreflang="en-ca" href="http://store.hp.com/CanadaStore" />
http://store.hp.com/us/en (hreflang notation within sitemap file):
<loc>http://store.hp.com/us/en</loc>
<xhtml:link rel="alternate" hreflang="en-ca" href="http://store.hp.com/CanadaStore" />
<xhtml:link rel="alternate" hreflang="en-us" href="http://store.hp.com/us/en" />
Appreciate the help anyone can give! Zach
Technical SEO | ZachKline0
-
Site hacked in Jan. Redeveloped new site. Still not ranking. Should we change domain?
Our top-ranking site in the UK was hacked at the end of 2014: http://www.ultimatefloorsanding.co.uk/ The site was the subject of a manual spam action from Google. After several unsuccessful attempts to clean it up, using Sucuri.net, reinstating old versions of the site, changing passwords etc., we took the decision to redevelop the site. We also changed hosting provider, as we had received absolutely no support from them whatsoever in resolving the issue. So far we have:
- Removed the old website files from the server
- Developed a new website, having implemented 301s for all the old URLs (except the spam ones)
- Submitted a reconsideration request for the manual spam action, which was accepted
- Disavowed all the spammy inbound links through Webmaster Tools
- Implemented custom URL parameters through Google to not index the spam URLs (which were using parameters)
Our organic traffic is down by 63% compared to last year, and we are not ranking for most of our target keywords any longer. Is there anything that I am missing in the actions I have taken so far? We were advised that at this stage changing domain and starting again might be the way to go. However, the current domain has been used by us since 2007, so it would be a big call. Any advice is appreciated, thanks. Sue - http://www.ultimatefloorsanding.co.uk/
Technical SEO | galwaygirl0
-
How does Google deal with licensed content when it's placed on both the vendor's and the client's websites? Will Google penalize the client's site for this?
One of my clients bought licensed content from a top vendor in the health industry. This same content is on the vendor's website and on my client's site too, but on my client's site there is a link back to the vendor which clearly tells anyone that this is licensed content we bought from this vendor. My client paid for top-quality content from the best source in the industry, but at the same time it is also published on the vendor's website. Will Google penalize my client's website for this? The niche is HEALTH.
Technical SEO | sourabhrana1
-
301'd site, but the new site is not getting picked up in Google. Should we change domain?
Hi, I'm having big issues! Any help would be greatly appreciated. This is the 3rd time this has happened. Every time I switch my old site greatcleanjokes.com to the new design of chokeonajoke.com, traffic goes almost completely down (I even tried out the new design on greatcleanjokes [to see if it was a 301 issue] and traffic also went down). What can possibly be wrong with this new site that Google just doesn't like?! I was ranking highly for many big phrases like joke of the day, corny jokes, clean jokes, short jokes. Now it's all gone. I also think it's strange that when I search for site:chokeonajoke.com the post pages show up before the category pages!? Here is the old site: http://web.archive.org/web/20140406214615/http://www.greatcleanjokes.com/ Here is the new one: http://chokeonajoke.com/ If you can't figure anything out, do you know of anyone I can hire who may be able to figure it out?
Technical SEO | Nickys22111
-
Joomla creating duplicate pages, then the duplicate page's canonical points to itself - help!
Using Joomla, every time I create an article a duplicate page is subsequently created, such as: /latest-news/218-image-stabilization-task-used-to-develop-robot-brain-interface and /component/content/article?id=218:image-stabilization-task-used-to-develop-robot-brain-interface, the latter being the duplicate. This wouldn't be too much of a problem, but the canonical tag on the duplicate is pointing to itself, creating mayhem in Moz and Webmaster Tools. We have hundreds of duplicates across our website and I'm very concerned about the impact this is having on our SEO! I've tried plugins such as sh404SEF and Styleware extensions, however to no avail. Can anyone help, or does anyone know of any plugins to fix the canonicals?
Technical SEO | JamesPearce0
-
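As an illustration for the Joomla question above, the usual shape of the fix is for the duplicate component URL to canonicalise to the clean SEF URL rather than to itself (domain hypothetical, paths taken from the question):

```html
<!-- On /component/content/article?id=218:image-stabilization-task-used-to-develop-robot-brain-interface -->
<!-- the canonical should reference the clean SEF URL, not the component URL itself: -->
<link rel="canonical" href="https://www.example.com/latest-news/218-image-stabilization-task-used-to-develop-robot-brain-interface" />
```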
Are aggregate sites penalised for duplicate page content?
Hi all, We're running a used car search engine (http://autouncle.dk/en/) in Denmark, Sweden and soon Germany. The site works in a conventional search engine way, with a search form and pages of search results (car adverts). The nature of car searching entails that the same advert exists on a large number of different URLs (because of the many different search criteria and pagination). From my understanding this is problematic because Google will penalize the site for having duplicated content. Since the order of search results is mixed, I assume SEOmoz cannot always identify almost-identical pages, so the problem is perhaps bigger than what SEOmoz can tell us. In your opinion, what is the best strategy to solve this? We currently use a very simple canonical solution. For the record, besides collecting car adverts, AutoUncle provides a lot of value to our large user base (including valuations on all cars). We're not just another leech AdWords site. In fact, we don't have a single banner. Thanks in advance!
Technical SEO | JonasNielsen0
-
Is 100% duplicate content always duplicate?
Bit of a strange question here that I would be keen to get others' opinions on. Let's say we have a web page which is 1000 lines long, pulling content from 5 websites (the content itself is duplicate, say RSS headlines, for example). Obviously any of that content on its own will be viewed by Google as duplicate and so will suffer for it. However, one of the ways duplicate content is assessed is a page being x% the same as another page, be it on your own site or someone else's. In the case of our page, while 100% of the content is duplicated from elsewhere, the page is no more than 20% identical to any other single page, so would it technically be picked up as duplicate? Hope that makes sense? My reason for asking is that I want to pull the latest tweets, news and RSS from leading sites onto a site I am developing. Obviously the site will have its own content too, but I also want to pull in external content.
Technical SEO | Grumpy_Carl0