Should We Pull The Plug On This Site?
-
I am helping a retailer out with their site. They were hit hard with the Penguin update, and traffic has dropped by about 75%. Here are the stats:
-
It is fairly new; it has been up for about 3 years.
-
Has partial match domain name
-
Is nearly fully indexed with over 4K pages
-
Has NOT received an unnatural link message from Google, so no manual penalty.
-
Has had most keywords BURIED in the search results.
-
Link profile: Has done about 50-100 blog comments, 500 directory submissions, 800 social bookmarks, 5-6 press releases, 300 article submissions (most removed), about 30-50 guest blog posts.
I am thinking it may have just been hit because of aggressive use of anchor text as opposed to massive spamming. Then again, the site has never really added great content and the product pages have no unique content.
Any thoughts?
-
-
Thanks. I've watched the video before, but it's worth reviewing. It still seems a bit strange that someone can violate terms of service that G never bothered to enforce for years and get slammed with "Double Secret Probation," while a malicious site can clean up and eventually get the penalty lifted. No doubt a malicious site's manual penalty should result in a long time in the penalty box, but at least it's obvious what to fix. There doesn't seem to be a reliable consensus, or even many case studies, on garden-variety Penguin recoveries yet. Not knowing what Dean Wormer wants me to change is irritating.
-
InHouseSEO - It's not an e-commerce site. (It's a blog with a couple of hundred posts, many of which need pruning, but many of which are highly informative and written by someone with substantial experience in the subject.)
Sounds like you're telling me the best gamble is to put in the work on this blog to grow the legit links so that the bad ones dip below the "tipping point" that triggers the Penguin hit. Have you had success with this tactic?
The home page appears to be penalized because of keyword-rich text from relevant blog comments on mostly relevant blogs/pages. (It's also quite possible it's just a rather severe devaluation of 30 or so spots in the SERPs for the EMD keyword.) Other pages are hit or miss, but the stronger pages (high bounce but very high time on page) are beginning to return to some of their former strength (probably 50% of peak traffic).
Site traffic declined just before the 25th (the date associated with Panda 3.5), resulting in a 20% hit. After Panda 3.5, the G traffic dove steadily (which I assume is Penguin added to the mix). Traffic is now off by around 2/3 without excluding the Bing traffic. (Have probably seen a 15-20% improvement recently with no new posts and only one added authoritative directory link; a Nat'l Trade Assoc. picked up the blog.)
I just reread all of the comments in the thread you linked to. (Never received a warning in WMT so I assume the penalty is algo.)
Reading your comments, it sounds like you recommend attempting to remove any blog comments that I created. (I don't expect much success based on what people are sharing.)
If my pet Penguin is algorithmic and isn't scheduled to lift anytime in the next several months, should I try to guest blog my way out of the penalty? (Assume I have access to decent, relevant indie blogs that are low authority but extremely legit.)
Thanks for the reminder to re-read the thread with you and Egol.
-
Do you have an e-commerce site? Is the site as a whole hit, or is it certain keywords/pages?
I would be careful with removing links, unless they are really spammy. You might do more harm than good.
I wrote about this here:
http://www.seomoz.org/q/using-dripable-to-build-url-links-too-dilute-link-profile
Anyways, good luck.
-
InHouseSEO - this is a GREAT question. I wish there were more discussion of realistic case studies like this one rather than so much "focus" on negative SEO and a handful of high authority sites that were probably hit by mistake.
The consensus seems to be that you can file for lifting a penalty IF you can show you removed bad links AND document the efforts you made to remove the bad links that remain despite your efforts.
Matt Cutts appears to say you're more screwed if the penalty is algorithmic. Huh? Buy BMR links, remove them, and escape the penalty, while G hits your site for 50-100 presumably manual and relevant blog comments? Gimme a break!
The 50-100 blog comments are probably going to be the worst of the lot to attempt to remove. Have you had any success removing the trash directories? You might be able to outgrow the penalty by developing new links so that the number of suspicious (or bad) links falls below the tipping point. On a recent WBF, Danny Sullivan opined that Penguin is just a devaluation of the bad links. (Not my opinion, but it's an interesting one.) No one has shared results, but some people have suggested combining link removal with developing strong new links.
Penguin is bizarre. Some of my pages are (very) slowly returning to their former top positions even while some of the bad links still point to them. New pages with extensive content (think 2,000 words of unique/expert content) were among the first 2-3 to cover the event but now rank around 120. (Ouch.)
I share your suspicion that, for many of our sites, it's aggressive use of anchor text. Developing non-aggressive links may dig us out. Would love to hear from anyone who has tried this and what results they achieved.
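Since the suspicion above centers on aggressive anchor text, here is a minimal sketch of how one might quantify it from an exported anchor-text list (e.g. a CSV from a link tool). This is my own illustrative metric, not anything Google publishes; the function name, input format, and any threshold you'd apply to the result are all assumptions.

```python
# Illustrative sketch: what share of inbound anchors exactly match a "money" keyword?
# A high exact-match percentage is what people mean by "aggressive anchor text."
from collections import Counter

def anchor_text_report(anchors, money_terms):
    """Summarize an anchor-text list: total links, exact-match count, and percentage."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    exact = sum(n for a, n in counts.items() if a in money_terms)
    return {
        "total_links": total,
        "exact_match": exact,
        "exact_match_pct": round(100.0 * exact / total, 1) if total else 0.0,
    }

if __name__ == "__main__":
    # Hypothetical export of anchor texts pointing at one page
    anchors = ["blue widgets", "Blue Widgets", "click here",
               "example.com", "blue widgets", "great read"]
    print(anchor_text_report(anchors, {"blue widgets"}))
```

Comparing that percentage across your penalized and unpenalized pages is one rough way to test the anchor-text theory on your own data.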
-
If it was an algorithmic hit, check out this video.
Related Questions
-
Leveraging A Second Site
Hi, A client of mine has an opportunity to buy/control another site in the same niche. The client's site is the top-ranked site for the niche. The second site is also often in the top half of page one. The second site has a 15-year-old design, a really bad, almost non-functional user experience, and thin content. The client's site (site 1) has the best link profile and dominates organic search, but the second site's link profile is as good as our nearest competitor's. Both sites have been around forever. Both sites operate in the affiliate marketing space. The client's site is a multi-million-dollar enterprise. If the objective were to wring the most ROI out of the second site, would you: A) Make the second site not much more than a link slave to the first, going through the trouble to keep everything separate (owner, hosting, G/A, log-on IPs) so as not to devalue the links to the 1st site? Or... B) Develop the second site and not worry about hiding that both have the same owner. Or... C) Develop the second site and still keep it all hidden from Google. Or... D) Buy the second site and forward the whole thing to site 1. I know the white hat answer is "B," but I would like to hear considerations for these options and any others. Thanks! P.S., My pet peeve is folks who slam a fast/insufficient answer into an unanswered question, just to be the first. So, please don't.
White Hat / Black Hat SEO | 94501
Exchange link from sites in same google account
Hi everyone, Does anybody have experience with websites that are verified in the same Google Webmaster Tools account and exchange links between each other? Is this OK for the sites? We are hosted on different servers. Thank you so much.
White Hat / Black Hat SEO | Jeepster
Preventing CNAME Site Duplications
Hello fellow mozzers! Let me see if I can explain this properly. First, our server admin is out of contact at the moment, so we are having to take this project on somewhat blind (forgive the ignorance of terms). We have a client that needs a CNAME record set up, as they need sales.DOMAIN.com to go to a different provider of data. They have a "store" platform that is hosted elsewhere, and they require a CNAME pointed to a custom subdomain they set up on their end. My question is, how do we prevent the CNAME subdomain from being indexed along with the main domain? If we process a redirect for the subdomain, then the site will not be able to go out and grab the other provider's info and display it. Currently, if you type in sales.DOMAIN.com, it shows the main site's homepage. That cannot be allowed to take place since, as we all know, having more than one domain with the exact same content is very bad for SEO. I'd rather not rely on Google to figure it out. Should we just have the CNAME host (where it's pointing) add a robots rule and have it set not to index the subdomain? The store does not need to be indexed, as the items are changed almost daily. Lastly, is an A record required for this type of situation in any way? Forgive my ignorance of subdomains, CNAME records, and related terms. Our server admin being unavailable is not helping this project move along. Any advice on the best way to handle this would be very helpful!
White Hat / Black Hat SEO | David-Kley
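A sketch of the "robots rule" idea raised in the question above, assuming the store provider lets you control files and headers on the subdomain (the hostname sales.DOMAIN.com is the asker's placeholder). A robots.txt served from the subdomain itself blocks crawling:

```text
# robots.txt served at sales.DOMAIN.com/robots.txt (hypothetical)
User-agent: *
Disallow: /
```

Note that robots.txt only blocks crawling; a blocked URL can still appear in the index if other pages link to it. To keep the subdomain out of the index entirely, the store host can instead send a noindex header, e.g. with Apache's mod_headers:

```text
# Hypothetical Apache snippet on the store provider's server
Header set X-Robots-Tag "noindex"
```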
Is one of my competitors trying to get my site penalized?
Hi guys, I have been ranking #2 for a popular search term for several months now, and today I noticed a drop to #5, so I went to check my backlink profile, and I'm seeing thousands of no-follow, exact-keyword-match backlinks, all from spammy-looking websites. I looked at some of the links and they do link to me, but I didn't generate these links, and I have never paid anybody externally to build links for me. What is the best course of action for me here? The link disavow tool?
White Hat / Black Hat SEO | davegill
Why is this site performing so well in the SERPs and getting high traffic volume for no apparent reason!
The site is https://virtualaccountant.ie/ It's a really small site. They have only about 7 backlinks, they don't blog, they don't have a PPC campaign, and they don't stand out from the crowd in terms of products or services offered. So why are they succeeding in topping the SERPs for difficult-to-rank-for accounting keywords such as "accountant" and "online accounts"? What are they doing better than everyone else, or have they discovered a way to cheat Google, and worse still - ME!
White Hat / Black Hat SEO | PeterConnor
Disavow tool for blocking 4 to 5 sites for Article Republishing
I am finding some very low authority sites that recently picked up our articles (written over a year back) from Ezine and other article sites and pasted them onto their own sites. The number of copied articles is not 1 or 2, but more than 10-12 across these domains. This has also resulted in anchor-based URL backlinks to us from them (part of the article). I have written to them asking to remove my author profile and articles, but there has been no response from the webmasters of these sites. Is disavow the right approach? The number of such sites is 4 or 5!
White Hat / Black Hat SEO | Modi
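If disavow turns out to be the right approach for the question above, Google's disavow file is plain text with one entry per line: `domain:` entries to cover a whole site, bare URLs for individual pages, and `#` for comments. The domains below are placeholders, not the actual scraper sites:

```text
# disavow.txt - placeholders for the 4-5 article-scraper domains
# Removal was requested first; the webmasters never responded.
domain:scraper-example-one.com
domain:scraper-example-two.com
# Individual URLs can also be listed:
http://scraper-example-three.com/copied-article.html
```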
Google SEVERE drop as of last week (oct 10) on long standing .org site
Hello Experts, I wanted some input if possible. I own a .org informational site that has been #1 in its category on Google, Yahoo, and Bing under a major keyword for years. The site dates back to 2005, and all of a sudden it dropped on August 10 (Google only; Yahoo and Bing still #1), but it remained atop the primary keyword that it is namesaked for (xxxxyyyzzz.org). Then on Oct 9-10 it dropped from the page 1 top ranking it had held for years on that primary keyword to page 13. I don't know where to begin to look. Any ideas how something like this could happen and what "stones" I should turn? We purchased the website and are not SEO gurus, so we're just not sure. Any help would be appreciated.
White Hat / Black Hat SEO | TBKO
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth over at SearchEngineJournal. When I initially read this article, I was happy. It was confirming something that I believed, and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, but not through any sort of penalization, appears in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names I recognized taking the opposite side of the issue. There seem to be a few different opinions: The SEOmoz opinion that bad links can't hurt except when they get devalued. The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility. The idea that both manual and algorithmic penalties are a factor. Now, I know that SEOmoz preaches a link building strategy that targets high-quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here - they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Could we set up some kind of test - rank a keyword for an arbitrary term, then go to town blasting low-quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | AnthonyMangia