Panda'd - and I think I know how to fix it...
-
Hi,
I have a non-core site that seems to have been affected by a Panda refresh in late December http://www.seomoz.org/google-algorithm-change#2012
Anyway, I couldn't figure out for the longest time why this site, which is full of high-quality, expert-level content, would get dinged. I made several moves to try to eliminate duplicate content, even though I couldn't find evidence of it -- but it's a WordPress site, so there are lots of opportunities to accidentally introduce duplication through archives, tags, and whatnot.
The classic SEO mistake I was making was forgetting about a type of post we were doing to facilitate one of our email campaigns. On most sites there's always something you aren't optimizing, and that's the stuff that can really create unintended issues in Google, because the decisions made on those pieces are often more operational to the other campaigns than strategic to search.
These posts are thin little articles, written by humans, but the text is actually submitted to another external site, published there, and then recreated as content that the email campaign links to. These posts are segregated from the normal feed on the WordPress site, and the last time I reviewed this content, we were not yet using a creation method that involved publishing it to Facebook first.
But OK, so I'm going to stop indexing this content -- that's a given. I believe that is the Panda issue. I could be wrong, but it makes sense, since otherwise this is maybe the least likely site to be affected by Panda that I've ever been involved with.
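Side note for anyone following along: once those posts are flipped to noindex, it's worth spot-checking that the directive is actually being served, since WordPress plugins can silently fail to emit it. Here's a rough Python sketch of such a check (the function name is my own, and it only matches double-quoted attributes -- a real audit would fetch live pages and use an HTML parser):

```python
import re

def has_noindex(html, headers=None):
    """Rough check: does a page opt out of indexing via an X-Robots-Tag
    response header or a <meta name="robots"> tag? Only matches
    double-quoted attributes; a real audit would use an HTML parser."""
    headers = headers or {}
    # Header check: X-Robots-Tag can carry noindex without any markup
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # Meta tag check: <meta name="robots" content="...noindex...">
    for match in re.finditer(r"<meta\s[^>]*>", html, re.IGNORECASE):
        tag = match.group(0).lower()
        if 'name="robots"' in tag and "noindex" in tag:
            return True
    return False

# A page carrying the meta tag is flagged, a plain page is not
print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
print(has_noindex('<p>regular article</p>'))                         # False
```

You'd run something like this over the URLs of the segregated posts after deploying the change, rather than trusting the plugin settings screen.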
Do I do anything else after fixing a Panda issue? Is there a reconsideration request for this or something? Should I send a singing telegram to Cutts?
I researched a few articles, and there wasn't much on what to do after you've fixed it other than wait. Just wondering if anyone else who fixed a Panda thing used any communication channel to let Google know. Thanks!
-
I would be interested to hear of the sites that have resolved their problem with Panda. I can hardly find any examples of Panda recovery.
-
Previously, you would wait until the next Panda refresh to see if your website was unflagged from the algorithmic penalty. A few months ago they made Panda a rolling update, so you should see whether your site has recovered after its next crawl/reindex.
No need for a reconsideration request, but a singing telegram would probably get Cutts's attention if your site doesn't rebound in a week or so.
Cheers, Oleg
-
Well, skipping ahead to the "what to do now?" question, I'd say you are correct in that there is nothing to do but wait. If it were a manual penalty, you would have a notice in your Webmaster Tools indicating as much.
If it is algorithmic (and it sure sounds like it is from what you've described), then all you can do is wait. This process can take anywhere from a week to two months and seems to be a bit of a crapshoot. From my experience, these types of fixes tend to get resolved in more like the 2-3 week range, but I've heard of it taking longer. A singing telegram might be nice, but I think Matt would most likely just cross his arms, give off a simple fake smile, thank them, and close the door.
This is oddly symbolic of the way the Google Webspam Team punks us SEO guys on the reg.