Percentage of duplicate content allowable
-
Can you have ANY duplicate content on a page or will the page get penalized by Google?
For example, if you used a paragraph of Wikipedia content as the definition/description of a medical term, but wrapped it in unique content, is that OK, or will that land you in the Google/Panda doghouse?
If some level of duplicate content is allowable, is there a general rule-of-thumb ratio of unique to duplicate content?
thanks!
-
I don't believe you have a problem if you have a bit of duplicate content. Google doesn't penalize you for duplicate content; it just doesn't award you points for it.
-
That sounds like something Google will hate by default. Your problem there is the ratio of page quantity to quality and uniqueness.
-
It's difficult to give exact figures, as Google's algorithm is Google's hidden treasure. It's safer to create completely unique content. Referring to your Wikipedia example, you can add something like "According to Wikipedia..." when copying a definition, or add reference links whenever you quote content from another source.
Remember that Google values not only unique content but also high quality. The article should be innovative, like a completely new thing, and well researched, so it shouldn't be 200 words or fewer. Google will compare the quality of the whole article against the copied portion and then decide whether to treat it as a duplicate-content article.
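As a sketch of what visible attribution can look like in markup, a quoted definition might use a blockquote with a `cite` attribute plus a plain link (the URL and wording here are only illustrative, not from the original post):

```html
<p>According to
  <a href="https://en.wikipedia.org/wiki/Hypertension">Wikipedia</a>:</p>

<!-- The cite attribute records the source URL in a machine-readable way -->
<blockquote cite="https://en.wikipedia.org/wiki/Hypertension">
  Hypertension, also known as high blood pressure, is a long-term
  medical condition in which the blood pressure in the arteries is
  persistently elevated.
</blockquote>

<!-- Your own unique commentary and analysis surrounds the quoted block -->
<p>In practice, most of our patients first notice this condition when...</p>
```

The quoted block stays clearly delimited from your own writing, which is the pattern the advice above is describing.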
-
"We recently launched a large 3500 page website that auto generates a sentence after we plug in statistical data in our database."
So the only unique content is a single sentence?
Within that sentence many of the words would need to be common as well. Consider a simple site that offered the population for any given location. "The population of [California] is [13 million] people."
In the above example only three words are unique. Maybe your pages are a bit more elaborate, but it seems to me those pages are simply not indexable. What you can do is index the main page, where users can enter the location they wish to learn about, but not each possible result (e.g. California).
Either add significantly more content, or only index the main page.
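One common way to implement "index the main page only" is a robots meta tag in the template of each auto-generated result page; this is a sketch, and the example URL path is hypothetical:

```html
<!-- On each auto-generated result page, e.g. /population/california -->
<head>
  <!-- noindex: keep this thin page out of the index
       follow: still let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>

<!-- The main entry page carries no such tag, so it remains indexable. -->
```

The same effect can be achieved with an `X-Robots-Tag: noindex` HTTP header if editing the HTML template is awkward.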
-
We recently launched a large 3,500-page website that auto-generates a sentence after we plug statistical data into our database. All pages are relevant to users and provide more value than other results in the SERPs, but I think the Farmer update detected something and applied a sort of automatic penalty against us.
I sent in a reconsideration request last week; the whole project is on hold until we get a response. I'm expecting a generic answer from them.
We are debating whether to write more unique content for every page or to enter more statistical data and run some cool correlations. I feel the statistical data would be three times more beneficial to the user, but unique content is what Google seeks, and it's the safer bet just to get us indexed properly.
-
We're currently observing a crumbling empire of websites with auto-generated content. Google is somehow able to judge how substantial your content is and devalue the page, or even the whole site, if it does not meet their criteria. This is especially damaging for sites that have, say, 10% great unique content while 90% of their pages are generated via tagging, browsable search, and variable-driven paragraphs of text.
Citing sources is perfectly normal, but I would include a reference section just in case.
-
You can have some duplicate content in the manner you mentioned above. It is a natural and expected part of the internet that existing sources of information will be utilized.
There is no magic number that says "30% duplication is OK, but 31% is not". Google's algorithms are private and constantly changing. Use good sense to judge whether your page is unique and offers value to users.