How different does content need to be to avoid a duplicate content penalty?
-
I'm implementing landing pages that are optimized for specific keywords. Some of them are substantially the same as another page (perhaps 10-15 words different). Are the landing pages likely to be identified by search engines as duplicate content? How different do two pages need to be to avoid the duplicate penalty?
-
Thanks, everyone, for the responses. They were very helpful.
-
First of all, Google does not "penalize" you for duplicate content unless you're doing it on a massive scale (that's what Panda targets). If Google detects duplicate content on your site, it will simply display only one version of that content in the SERPs. Still not ideal, but not quite a penalty.
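Nobody outside Google knows its exact detection method, but near-duplicate detection is commonly described in the search literature in terms of overlapping word "shingles" compared with Jaccard similarity. A minimal illustrative sketch (the sample page texts are made up, and this is not Google's actual algorithm):

```python
def shingles(text, k=3):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two texts' shingle sets: |A & B| / |A | B|."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Two hypothetical landing pages differing by a single word.
page_a = "charter fishing trips departing daily from the dania beach marina"
page_b = "charter fishing trips departing daily from the hollywood beach marina"
print(round(jaccard(page_a, page_b), 2))
```

The takeaway is that pages sharing most of their shingle sets cluster together as near-duplicates, at which point the engine picks one version to show.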
Perhaps more importantly, why do you have multiple landing pages that differ by only 10 words? Aside from the duplicate content issue, how are you "optimizing" each page for different keywords? If you are just changing the title and the URL, then it's probably not worth it from an SEO or user perspective.
If you want to rank for multiple keywords, write rich content that is relevant for multiple keywords, or create multiple pages that are substantially different and specifically aimed at your target keywords. Changing 10-15 words isn't optimizing for anything.
-
Yes, those landing pages will likely be viewed as duplicate content with only 10 or so words different... unless you only have 25 words on each page (which would then be incredibly thin content). I've heard people say that a page should be a minimum of 60% different to avoid duplicate content issues (no idea how that number was determined, though). At that point it's usually simpler to write completely new content for every page and avoid the issue entirely.
-
TextMarketing is spot on!
Re-writing the content from memory and having someone else write it based on a generic layout are two ways around having duplicate content.
And just for some additional info, this is what Google considers duplicate content: "Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar." and "...content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic. Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results."
Duplicate Content = BAD SEO and BAD User Experience.
Mike
-
I would worry that Google may find these pages to be duplicate content if there's only a 10 to 15 word difference. I would recommend re-writing each page from memory, without looking at the other, to help differentiate the content.