Would Panda target this?
-
Hi guys,
We suffered a massive rankings drop in September 2012, the same date as Panda 20, and we've been trying to fix the issues since then with little to no success.
I think these Q&As work best if I ask a specific question instead of just screaming for help, so hopefully we're at least looking in the right place.
One area I've been looking into is, of course, content (this being a Panda penalty). However, I'm not sure what about our content is causing a problem. We provide a phone unlocking service and have over 6,000 handsets that we can unlock. We only allow search engines to index 5 of them, as these are the ones with unique product descriptions (there are over 100 more, but we want to start getting our rankings back a bit at a time). We also let them index our manufacturer, news, and support pages, approximately 160 in total.
On our handset and manufacturer pages, much of the content is the same, with a few words changed to alter the price or the name of the manufacturer/phone. We also change the delivery times for some, as they can vary, and each handset page has an identical "Why use us" section.
In my mind there is no point making these areas unique to each page, as they clearly describe our service and what we offer. Rewriting each one for every page, especially if we wanted to start adding our remaining 5,995 handsets, would be ridiculous. It would also clearly be manipulative if we're just rewriting the same thing in a slightly different way to benefit a search engine and not the users.
Does anyone know if this type of content would be seen as duplicate content and would result in a penalty? And is there anything we can do about it?
Thanks,
Darren. -
Hey Chris and Kurt,
Please be aware that we've noindexed the majority of those pages, even around 100 that have unique content. Our aim is to concentrate the site on a handful of select pages to see if that works first (as it should in theory).
We've certainly paid a lot of attention to our content, which isn't a new thing for us. We've included videos of people unlocking the handset where we can - I've created some myself to broaden our reputation on YouTube, and they also make their way onto our site. We have instructions for a large number of handsets on those pages, along with unique images and descriptions. I've even worked on numerous other "big" pieces for publication on our news pages, which have earned links from the Washington Post, Huffington Post, TUAW, IntoMobile, etc., and even phone networks. To manage it all, we noindex the pages with "thin" content, and previously we blocked Googlebot from crawling them in the first place.
Having unique content on every page shouldn't be necessary, and I don't believe it is (there are various examples to support that). We've even stuck close to the rules by incorporating noindex like a boss for content we don't think should be indexed.
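For anyone reading along, the mechanics here matter: a robots meta tag (or an X-Robots-Tag response header) only works if Googlebot is allowed to crawl the page. A page blocked in robots.txt can't be fetched, so a noindex on it is never seen, and the URL can still end up indexed from external links. A minimal sketch of the meta-tag approach (placement is illustrative):

```html
<!-- On each thin handset page you want kept out of the index: -->
<head>
  <!-- Keep the page crawlable, but ask engines not to index it.
       "follow" lets crawlers still follow the links on the page. -->
  <meta name="robots" content="noindex, follow">
</head>
```

The same directive can be sent for non-HTML resources as an HTTP header (`X-Robots-Tag: noindex`). Crucially, the page must not also be disallowed in robots.txt, or the directive will never be read.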
However, one of the kickers is that I've been reviewing our industry and our competitors, noticing time and time again that a lot of the ranking sites have even less unique content than we do. They stick solidly to a script, never creating a hint of unique content, even posting unlocking videos for unrelated phones on certain pages. Yet they rank. They pay no attention to robots.txt or noindexing crap content, and yet away they go, blissfully into the SERPs sunset.
These competitors certainly don't have the same links as us or comparably useful content. I'd consider myself pretty knowledgeable about SEO, yet this makes no sense.
It's difficult to include all the information on what we've done in these posts without making them stupidly long, but we've done a lot of the "standard" stuff. We've been working on getting this right for over a year, even tossing aside an extremely powerful previous domain (with some success).
If Google were a person, I'd put a bag over their head and see how they like it.
-
Great response, Chris.
Darren,
The issue for Google, especially when dealing solely with the algorithm, is that they don't want to display duplicate content, and they don't always get the subtlety of a situation like a phone unlocking process. From your perspective (and the user's), you know that each page is necessary because there are subtle differences in the unlocking process for different phones - maybe just one different step or a different menu title. It's a small change, but for a user who's never done it before, it's significant.
For Google, though, if the content on the 6,000 pages is essentially the same with just a few words changed in each, it looks like spam. There have just been too many low-quality sites that put up a page for every keyword in their industry but use essentially the same content on all of them to try to rank for everything. Google has to play the percentages. While it may make sense in your situation to have different pages with only subtle differences in content, it doesn't for most niches, so they penalize everyone who does it. That's just the world of Google that we live in.
So, the question becomes, "How do I create unique content for each of the pages I want indexed, to appease Google?" Chris has some great suggestions for this. I know it seems daunting when you're talking about 6,000 pages, but that's just the reality. Also, keep in mind that, as you're doing now, you don't need to deal with all your pages at once. This is what has to be done for each page you want indexed and ranked in Google, so you can take it a bit at a time if you need to... or outsource some or all of the work to someone on oDesk, or find a way to get your customers to create the content for you.
Kurt Steinbrueck
OurChurch.Com -
purple,
5 pages like that isn't enough to get you a penalty, although 6,000 thin/duplicate pages like that would be enough to put your site in a class with spam sites. So the question is: how do you deal with those other 5,995 pages when the time comes to put them back up, if it comes? I'd think about breaking them down first by manufacturer (with a strong page on your unlocking service for each one), then by model type (with a strong page for each one), and then you might start breaking them down by specific model number. With so many new model numbers coming out each year, it seems like you'd have your hands full creating content for just the new ones, let alone the old ones, but you could work your way back, prioritizing those that give you the most business.
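To make that breakdown concrete, the hierarchy might look something like the sketch below. The paths and names are hypothetical, just to illustrate the manufacturer, then model type, then specific model structure, with each level carrying its own substantial page:

```
/unlock/                      service overview (strong, unique content)
/unlock/samsung/              manufacturer page: unlocking Samsung phones
/unlock/samsung/galaxy/       model-type page: quirks common to the Galaxy line
/unlock/samsung/galaxy-s3/    specific model: exact steps, video, unique description
```

Each deeper level only gets indexed once it has content that wouldn't make sense on the level above it, which keeps the page count proportional to the amount of genuinely unique material.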
As far as the content itself, it could be videos about how consumers can unlock each specific model themselves, interviews with owners of unlocked phones, or information about the phones themselves (development history, carriers that sell them, sales specs, technical specs, OSes used...). There's a whole ocean of information you could be giving your audience that pertains to cell phones, unlocking, carriers, mobile devices, manufacturers, OSes, etc. Standardize on a number of specific data points from those areas that you think best give the audience a picture of your brand's philosophy, and include them in the content for each new page you create. Remember, you've got to think like a publisher if you're going to pull yourself out of that penalty.