Duplicate content question
-
Hi,
I have a site that runs off one CMS but has three different web addresses: one is a comic shop, one a toy shop, and one a game shop. Due to the nature of what we sell, some products are listed on two or all three of the sites.
I was wondering whether this would affect my ability to rank in Google, and whether I would be penalised for duplicate content?
Thanks in advance
-
One trick is to dilute the text and make some small changes so the pages compete with each other, instead of one copy cancelling the other two out the way straight duplicate content will. Something like the following should be easy to implement if you are using a template-based cart system.
Shop 1 - Just keep the normal product description on the page and all of the standard text from your menu, etc., changing nothing.
Shop 2 - If your site uses rich snippets, change them at the template level. Most work with a variable like $template_variable. On the second site, remove a little of the rich snippet data, say the condition and the manufacturer name. At the same time, add a 200-400 word shipping and return policy to every product page. As the current standards are, that should dilute everything enough to look like a totally different page.
Shop 3 - Basically the same as shop 2: change the rich snippet data that is sent a little. That could mean adding manufacturer data that is not in the other templates, or taking something else non-essential out of the template. Then add a shipping policy to the product pages that says the same thing in different words, 100-200 words. At the same time, either add an "about" section for the manufacturer or, if you have too many manufacturers for that to be practical, an "about" for your company on every product page.
By doing this you make the pages compete with each other instead of letting one page dominate. If you are using a template-based CMS, the changes should be easy and should only take an hour or two, not counting the time to write the content.
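The per-store template tweak described above can be sketched in Python. This is only an illustration of the idea, not anyone's actual template code; the product data, store labels, and field choices are made up for the example.

```python
# Sketch: one product record, three per-store rich-snippet variants.
# BASE_SNIPPET and the store names are illustrative placeholders.

BASE_SNIPPET = {
    "@type": "Product",
    "name": "Catan Board Game",
    "condition": "NewCondition",
    "manufacturer": "Catan Studio",
    "offers": {"@type": "Offer", "price": "39.99"},
}


def snippet_for_store(store: str) -> dict:
    """Return a store-specific copy of the rich-snippet data."""
    # Shallow-copy the record (one level deep) so stores never share state.
    data = {k: (dict(v) if isinstance(v, dict) else v)
            for k, v in BASE_SNIPPET.items()}
    if store == "shop2":
        # Shop 2: drop condition and manufacturer, as suggested above.
        data.pop("condition", None)
        data.pop("manufacturer", None)
    elif store == "shop3":
        # Shop 3: instead send extra manufacturer data the others lack.
        data["brand"] = {"@type": "Brand", "name": data["manufacturer"]}
    return data
```

Combined with a different block of on-page copy per store (the shipping policy, the "about" section), each product page ends up with a distinct footprint while sharing the same underlying record.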
-
Highland has a very good answer to this, but I'd like to add something. We are Magento specialists and work with a lot of multi-store environments, and I'd suggest the following:
- Best case: write unique copy per website, keeping the same product and title. As long as the content is truly unique, you have three ranking opportunities across your three sites instead of one.
- Else: if that costs too much time or effort and doesn't give you enough ROI, use Highland's method of setting a canonical tag pointing at your most powerful domain, the one most likely to rank.
- If even that is too much time, just forget it and let Google fight it out; it won't cost you much devaluation.
-
It's not a penalty per se; it just means that Google is going to pick one copy and devalue the others. Does it affect your ability to rank? Not really. It just means one site is going to be the winner. Remember, a penalty is when Google devalues your whole site for bad behavior, and duplicate content is not bad behavior.
If you want to pick the rankings winner yourself you can add a canonical tag to the other pages pointing to the one you want to rank.
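As a concrete illustration of that canonical tag (the domain and path below are placeholders, not from the question): if the comic shop's copy of a product page should win, the toy shop and game shop versions of that page would each carry this in their `<head>`:

```html
<!-- On each duplicate product page (toy shop, game shop), inside <head>: -->
<link rel="canonical" href="https://www.comicshop.example/products/deck-box-red" />
```

Google then treats the tagged pages as duplicates of the referenced URL and consolidates ranking signals there.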
Related Questions
-
Thin Content pages
I have a couple of pages that are thin content. One is essentially a page with the icons of our customers and a link out to their website. The other is a summary portfolio page that has some images of some of the client work we have done with links to internal pages that have more details about each client situation, approach, etc. These deeper pages are just fine. What is the recommendation for handling these thin content pages? We could add content, but then it wouldn't really help the user very much.
On-Page Optimization | ExploreConsulting
-
Duplicate Content on Event Pages
My client has a pretty popular service of event listings and, in hope of gathering more events, they opened up the platform to allow users to add events. This works really well for them and they are able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners will take the copy from their website and copy and paste it, duplicating a lot of the content. We have editor picks that contain a lot of unique content but the duplicate content scares me. It hasn't hurt our page ranking (we have a page ranking of 7) but I'm wondering if this is something that we should address. We don't have the manpower to eliminate all the duplication but if we cut down the duplication would we experience a significant advantage over people posting the same event?
On-Page Optimization | mattdinbrooklyn
-
Boat broker - issues with duplicate content and indexing search results
Hello, I have read a lot about optimising product pages and not indexing search results or category pages, as ideally a person should be directed straight to a product page. I am interested in how best to approach a site that lists second-hand products for sale - essentially a marketplace of second-hand goods (in my case, www.boatshed.com - international boat brokers). For example, we currently have 5 Colvic Sailer 26 boats for sale across the world - that is 5 boats of the same make and model but differing years, locations, sellers and prices. My concern is with search results and 'category' pages. Unlike typical e-commerce sites, when someone searches for a 'Colvic Sailer 26 for sale' I want them to land on a search-results-style page, as it is more useful for them to see a list of boats than one random boat that Google decides is most important (or can match by location). Currently we have 3 different URL types that show search-results-style pages (i.e. paginated lists of boats that include name, image and short description):
manufacturer URLs, e.g. http://www.boatshed.com/colvic-manufacturer-145.html
category URLs, e.g. barges: http://www.boatshed.com/barges-category-55.html
and normal search results, e.g. dosearch.php?form_boattype_textbox=&.... I have noindexed the search results pages, but our category and manufacturer URLs show up in search results, and ultimately these are pages I want people to land on. I am, however, getting duplicate content warnings in Moz. Most boats are in several categories, and every boat appears on one manufacturer page and one manufacturer-and-model page. Both sets of URLs are, in my opinion, needed: lots of users search for exact makes/models and lots just search for the type of boat, e.g. 'barge for sale', so both sets of landing pages are useful. Any suggestions or thoughts greatly appreciated. Thanks, Ben
On-Page Optimization | pbscreative
-
What Should I Do With Low Quality Content?
As my site has definitely been hit by Panda, I am in the process of cleaning my website of low-quality content. Needless to say, the shoddy articles are being removed completely, but I think a lot of this content is low quality simply because it is obsolete and dated. So what should I do with that content? Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Should I delete the old posts and 301 redirect them to the new posts? Or should I rewrite the content of these articles in place so I can keep the old URLs and backlinks? One consideration: I have a lot more followers than I used to, so publishing a new post gets many more views, likes and shares from social networks.
On-Page Optimization | sbrault74
-
Duplicate Content Issues with Forum
Hi Everyone, I just signed up last night and received the crawl stats for my site (ShapeFit.com). Since April of 2011, my site has been severely impacted by Google's Panda and Penguin algorithm updates and we have lost about 80% of our traffic during that time. I have been trying to follow the guidelines provided by Google to fix the issues and help recover but nothing seems to be working. The majority of my time has been invested in trying to add content to "thin" pages on the site and filing DMCA notices for copyright infringement issues. Since this work has not produced any noticeable recovery, I decided to focus my attention on removing bad backlinks and this is how I found SEOmoz. My question is about duplicate content. The crawl diagnostics showed 6,000 errors for duplicate page content and the same for duplicate page title. After reviewing the details, it looks like almost every page is from the forum (shapefit.com/forum). What's the best way to resolve these issues? Should I completely block the "forum" folder from being indexed by Google or is there something I can do within the forum software to fix this (I use phpBB)? I really appreciate any feedback that would help fix these issues so the site can hopefully start recovering from Panda/Penguin. Thank you, Kris
On-Page Optimization | shapefit
-
Exponentially Increasing Duplicate Content On Blogs
Most of the clients that I pick up are either new to SEO best practices or have worked with sketchy SEO providers in the past who did little more than build spammy links. Most of them have deployed little if any on-site SEO, and early on I spend a lot of time fixing canonical and duplicate content issues via 301 redirects. Using SEOMOZ, however, I see a lot of duplicate content issues with the blogs that live on the sites I work on. With every new blog article we publish, more duplicate content builds up. I feel like duplicate content on blogs grows exponentially, because every time you write a blog article, it appears, at least provisionally, on the blog homepage, the article URL, a category page, maybe a tag page, and an author page. I have a two-part question: Is duplicate content like this a problem for a blog, and for the website the blog lives on? Are search engines able to work out that this isn't really duplicate content? If it is a problem, how would you go about solving it? Thanks in advance!
On-Page Optimization | RCNOnlineMarketing
-
Duplicate content and the Moz bot
Hi Does our little friend at SEOmoz follow the same rules as the search engine bots when he crawls my site? He has sent thousands of errors back to me with duplicate content issues, but I thought I had removed these with nofollow etc. Can you advise please.
On-Page Optimization | JamieHibbert