New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the cities being targeted. So far the site has not been affected by Penguin or Panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short-term to avoid losing it in the long-term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400-page site or a 40,000-page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
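For clarity on the mechanics, the temporary NOINDEX is just a robots meta tag in the head of each low-traffic city page - a minimal sketch, with a hypothetical URL and page title:

    <!-- In the <head> of each low-traffic city page being temporarily -->
    <!-- pulled from the index, e.g. example.com/plumbing/springfield/ -->
    <!-- (URL and title here are hypothetical) -->
    <head>
      <title>Plumbing Services in Springfield | Example Co.</title>
      <!-- Keeps the page out of the index but lets crawlers follow its links -->
      <meta name="robots" content="noindex, follow">
    </head>

Using "noindex, follow" rather than "noindex, nofollow" keeps link equity flowing through those pages while they're out of the index; once a page gets real unique content, drop the tag and let it be recrawled.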
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do your analytics show city-specific search terms landing on those city-specific pages, or going to the home page (or the canonical version of the duplicated page)?
If it is the latter, then you should certainly move those city-specific keyword terms into the single version of the duplicated content in some creative fashion.
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because they are certainly low-value pages!
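If some of the duplicate pages need to stay live for users while this gets sorted out, a rel=canonical on each duplicate pointing at the surviving version tells Google which page to index - a sketch, with hypothetical URLs:

    <!-- In the <head> of each near-duplicate city page, pointing at the -->
    <!-- single consolidated version (both URLs are hypothetical) -->
    <link rel="canonical" href="https://www.example.com/services/plumbing/">

If you're removing the pages outright, a 301 redirect to the consolidated page is the stronger signal, since it also passes along whatever link equity the city pages have earned.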
-
I agree with Ricky - I would slowly make all of those pages unique in some way. I still find it beneficial to rank different city pages as long as they have quality content. Otherwise, Google will eventually sift through the site and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply rank one page as the canonical version. If they are seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.