How to publish duplicate content legitimately without Panda problems
-
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists.
Your visitors love these articles and columns but the search engines see them as duplicate content.
You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty.
So, you decide to continue publishing the content and use...
<meta name="robots" content="noindex, follow">
This allows you to display the content for your visitors, but it should stop the search engines from indexing any page that carries this tag. It should also allow robots to crawl those pages and pass link value through them.
I have two questions.....
-
If you use "noindex", will that be enough to prevent your site from being considered a content farm?
-
Is there a better way to continue publication of syndicated content but protect the site from duplicate content problems?
-
-
Good idea about attributing with rel=canonical.
Thanks!
-
Noindexing the syndicated articles should, in theory, minimize the likelihood of having a Panda problem, but Panda is constantly evolving. You will probably see some kind of drop in rankings as the number of indexed pages for your site decreases. If you have, say, 1,000 pages total on the site and suddenly 900 are taken out of the index, that might be a problem. If it is a much smaller percentage of the site, you might not have a problem at all. Beyond the drop in indexed pages, I don't think you will have a problem once the syndicated stuff is noindexed.
It will probably take Google a while to re-index/un-index the pages, so hopefully any drop will be gradual rather than sudden. In the long run, it is probably better to at least have the appearance of trying to do the right thing. Linking to the source, and perhaps using rel=canonical tags pointing to the original article, would also be good practice.
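As a rough sketch, the head of each syndicated article page might combine those two suggestions like this (the URL is a placeholder, not a real address):

```html
<head>
  <!-- Keep this copy out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
  <!-- Hypothetical cross-domain canonical pointing at the original publisher's copy -->
  <link rel="canonical" href="https://www.original-publisher.example/story-title">
</head>
```

Note that Google treats cross-domain rel=canonical as a hint rather than a directive, so the noindex is the stronger of the two signals here.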
Thank you, Nick.
We will be using the "noindex" only on the pages with syndicated content. This is a Dreamweaver site, so it is easy to place the code on specific pages, and it does not use excerpts.
Do you still see a potential problem?
The question really is... "Could a site that contains a lot of syndicated content have a Panda problem if the pages that contain that content are noindexed?"
-
I am assuming you intend to use noindex only on the duplicate-content articles. Using noindex on everything would also prevent your original content from being indexed and found through Google.
If you are using WordPress or something else that can show excerpts, you could make the article pages noindex and show only excerpts on the main page and category pages, which would remain indexed and followed. That way the articles themselves would not appear in searches, avoiding duplicate content penalties, while the pages showing the excerpts could still be indexed and rank OK.
The idea here is that the pages showing the excerpts would have enough text to help the home and category pages rank for the subject matter, and hopefully not be seen for what it is: copied content.
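A minimal sketch of that layout (the file names and text are hypothetical): the category page stays indexable and shows only the excerpt, while the full article page carries the noindex tag.

```html
<!-- /category/finance/ : indexable listing page, shows only excerpts -->
<!-- No robots meta tag needed here; the default is index, follow -->
<article>
  <h2><a href="/articles/syndicated-story.html">Syndicated story headline</a></h2>
  <p>The first paragraph or two of the article, shown as an excerpt...</p>
</article>

<!-- /articles/syndicated-story.html : full syndicated article, kept out of the index -->
<head>
  <meta name="robots" content="noindex, follow">
</head>
```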
You will probably eventually get caught by the Panda, but this may work as a temporary solution until you can get some original content mixed in.