New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the cities being targeted. So far the site has not been affected by Penguin or Panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short term to avoid losing it in the long term. Again, I think the long-term risk is serious (and it's not easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
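As a rough illustration of that triage, here's a minimal sketch in Python (the page paths and traffic numbers are hypothetical, and the 80% cutoff is just the heuristic described above — in practice you'd pull these figures from your analytics export):

```python
# Hypothetical traffic figures per city page; in practice these would
# come from an analytics export for the 300 city pages.
traffic = {
    "/cities/austin": 1200,
    "/cities/boston": 950,
    "/cities/denver": 400,
    "/cities/fargo": 12,
    "/cities/gary": 8,
    "/cities/helena": 5,
}

def split_pages(traffic, share=0.8):
    """Keep the smallest set of pages covering `share` of total traffic;
    everything else is a candidate for a temporary NOINDEX until it
    earns unique content."""
    total = sum(traffic.values())
    keep, covered = [], 0
    for page, visits in sorted(traffic.items(), key=lambda kv: -kv[1]):
        if covered >= total * share:
            break
        keep.append(page)
        covered += visits
    noindex = [p for p in traffic if p not in keep]
    return keep, noindex

keep, noindex = split_pages(traffic)
```

With these made-up numbers, the two strongest pages already cover over 80% of traffic, so the remaining four would be NOINDEXed while you build out unique content for the keepers.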
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do the analytics show city-specific search terms landing on those city-specific pages, or going to the home page (or the canonical version of the duplicate content page)?
If it is the latter, then you should certainly work those city-specific keyword terms into the single canonical version of the content in some creative fashion.
Regardless, you should still remove the duplicate content, sooner rather than later, because these are almost certainly low-value pages!
-
I agree with Ricky - I would slowly make all those pages unique in some way. Ranking separate city pages is still worthwhile as long as they have strong, unique content. Otherwise, Google will eventually work its way through and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply have 1 page ranking as the canonical page. If they are seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.