Am I doing enough to get rid of duplicate content?
-
I'm in the middle of a massive cleanup effort of old duplicate content on my site, but trying to make sure I'm doing enough.
My main concern now is a large group of landing pages. For example:
http://www.boxerproperty.com/lease-office-space/office-space/dallas
http://www.boxerproperty.com/lease-office-space/executive-suites/dallas
http://www.boxerproperty.com/lease-office-space/medical-space/dallas
And these are just the tip of the iceberg. For now, I've put canonical tags on each sub-page pointing to the main market page (the second two URLs above both canonicalize to the first, http://www.boxerproperty.com/lease-office-space/office-space/dallas). However, this situation exists in many other cities as well, and each city has a main page like the first one above. For instance:
http://www.boxerproperty.com/lease-office-space/office-space/atlanta
http://www.boxerproperty.com/lease-office-space/office-space/chicago
http://www.boxerproperty.com/lease-office-space/office-space/houston
Obviously the previous SEO was pretty heavy-handed with all of these, but my question for now is: should I even bother adding canonical tags from all of the sub-pages to the main pages (medical-space or executive-suites to office-space), or is the presence of all these pages problematic in itself? In other words, should http://www.boxerproperty.com/lease-office-space/office-space/chicago and http://www.boxerproperty.com/lease-office-space/office-space/houston and all the others have canonical tags pointing to just one page, or should a lot of these simply be deleted?
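(For reference, the canonical tags described here sit in the `<head>` of each sub-page; this is just a sketch using the Dallas URLs from the examples above:)

```html
<!-- In the <head> of /lease-office-space/executive-suites/dallas -->
<link rel="canonical" href="http://www.boxerproperty.com/lease-office-space/office-space/dallas" />
```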
I'm continually finding more and more sub-pages that use the same template, so I'm just not sure of the best way to handle them all. Looking back historically in Analytics, it appears many of these did drive significant organic traffic in the past, so I'm going to have a tough time justifying deleting a lot of them.
Any advice?
-
Heather,
I'm confused as to what the duplicate content is. The three Dallas pages you mentioned have different content. Sure there's a decent amount that's the same from the site-wide content (nav menus, etc.), but each has different text and information about different locations that are available. How is it duplicate?
Kurt Steinbrueck
OurChurch.Com
-
Heather,
First things first: 1. Are they still driving traffic? 2. Rel=canonical is supposed to be used on identical pages, or on a page whose content is a subset of the canonical version's.
Those pages are very thin on content, and I certainly wouldn't leave them as they are. If they're still driving traffic, I'd keep them, but for fear of Panda I'd 302 them to the main pages while working steadily on putting real content on them, then remove the redirects as the content goes on.
If they're not still driving traffic, it seems to me that it wouldn't be very hard to justify their removal (or 301 redirection to their main pages). Panda is a tough penalty, and you don't want to get caught in it.
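As a sketch of that redirect approach (assuming an Apache server with mod_alias enabled; the paths are taken from the Dallas examples above), the temporary redirect could go in .htaccess like this:

```apache
# Temporary (302) redirect while real content is written for the sub-page;
# change 302 to 301 if the removal becomes permanent
Redirect 302 /lease-office-space/executive-suites/dallas http://www.boxerproperty.com/lease-office-space/office-space/dallas
```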
Related Questions
-
Does changing text content on a site affect SEO?
Hi, I have changed some H1s and H2s, changed and added paragraphs, fixed plagiarism and grammar, and added some pics with alt text. I just did this today, and I am ranking on the second page. Question 1: Is this going to affect my two months of SEO effort? Question 2: Do I have to submit the sitemap to Google again? Question 3: Does changing content on the site frequently hurt SEO?
Algorithm Updates | Sam09schulz0 -
Ranking For Synonyms Without Creating Duplicate Content.
We have 2 keywords that are synonyms we really need to rank for as they are pretty much interchangeable terms. We will refer to the terms as Synonym A and Synonym B. Our site ranks very well for Synonym A but not for Synonym B. Both of these terms carry the same meaning, but the search results are very different. We actively optimize for Synonym A because it has the higher search volume of the 2 terms. We had hoped that Synonym B would get similar rankings due to the fact that the terms are so similar, but that did not pan out for us. We have lots of content that uses Synonym A predominantly and some that uses Synonym B. We know that good content around Synonym B would help, but we fear that it may be seen as duplicate if we create a piece that’s “Top 10 Synonym B” because we already have that piece for Synonym A. We also don’t want to make too many changes to our existing content in fear we may lose our great ranking for Synonym A. Has anyone run into this issue before, or does anyone have any ideas of things we can do to increase our position for Synonym B?
Algorithm Updates | Fuel0 -
How to retain those rankings gained from fresh content...
Something tells me I know the answer to this question already, but I'd always appreciate the advice of fellow professionals. So... fresh content is big now in Google, and I've seen some great examples of this. When launching a new product or unleashing (yes, unleashing) a new blog post, I see our content launch itself into the rankings for some fairly competitive terms. However, after 1-2 weeks these newly claimed rankings begin to fade from the limelight. So the question is: what do I need to do to retain these rankings? We're active on social media, tweeting, liking, sharing, and +1ing our content, as well as working to create exciting and relevant content via external sources. So far all this seems to have done is slow the fall from grace. Perhaps this is natural, but I'd love to hear your thoughts, even if it is just "keep up the hard work."
Algorithm Updates | RobertChapman1 -
Google and Content at Top of Page Change?
We always hear about how Google made this change or that change to their algorithm this month. Sometimes it's true and other times it's just a rumor. So this week I was speaking with someone in the SEO field who said that a change occurred at Google this week and is going to become more prevalent: merchant sites that place content above the fold with products beneath it are going to get better placement than sites that put products at the top with some content at the bottom of the page. Any comments on this?
Algorithm Updates | applesofgold0 -
SEOmoz suddenly reporting duplicate content with no changes???
I am told the crawler has been updated and wanted to know if anyone else is seeing the same thing I am. SEOmoz reports show many months of no duplicate content problems. As of last week though, I get a little over a thousand pages reported as dupe content errors. Checking these pages I find there is similar content (hasn't changed) with keywords that are definitely different. Many of these pages rank well in Google, but SEOmoz is calling them out as duplicate content. Is SEOmoz attempting to closely imitate Google's perspective in this matter and therefore telling me that I need to seriously change the similar content? Anyone else seeing something like this?
Algorithm Updates | Corp0 -
What is considered duplicate content in an ecommerce website that offers the same product for retail and wholesale purchasing?
I have an ecommerce website that offers retail and wholesale products which are identical, of course with the exception of pricing. My concern is duplicate content. If the same product is offered under both the retail and wholesale category, and described identically, with the exception of price, metadata and a few words, is that considered duplicate content and would both pages be disregarded by the robots? Is it best to avoid the same description for that one product under the two separate categories? Thanks for all your help!
Algorithm Updates | flaca0 -
Is this the best way to get rid of low quality content?
Hi there, after getting hit by the Panda bear (a 30% loss in traffic), I've been researching ways to get rid of low-quality content. The best advice I could find seemed to be a recommendation to use Google Analytics to find your worst-performing pages (go to Traffic Sources - Google Organic - view by landing page). Any page that hasn't been viewed more than 100 times in 18 months should be a candidate for deletion. Out of over 5,000 pages, using this report we identified over 3,000 low-quality pages, which I've begun exporting to Excel for further examination. However, starting with the worst pages (according to Analytics), I'm noticing some of our most popular pages are showing up here. For example: /countries/Panama is showing up as zero views, but the correct version (with the trailing slash), /countries/Panama/, is showing up as having over 600 views. I'm not sure how Google even found the former version of the link, but I'm even less sure how to proceed now (the webmaster was going to put a nofollow on any crap pages, but this is now making him nervous about the whole process). Some advice on how to proceed from here would be fantastico and danke.
Algorithm Updates | BrianYork-AIM0 -
Will google punish us for using formulaic keyword-rich content on different pages on our site?
We have 100 to 150 words of SEO text per page on www.storitz.com. Our challenge is that we are a storage property aggregator with hundreds of metros, and we have to distinguish each city with relevant and unique text. If we use a modular approach, mixing and matching pre-written (by us) content and demographic- and location-oriented text to create relevant and unique text for hundreds of pages on our site, will we be devalued by Google?
Algorithm Updates | Storitz0