Google Manual Penalty - Dilemma?
-
Hi Guys,
A while back, my company had a 'partial match' manual penalty from Google for 'unnatural links' pointing to our site.
This glorious feat was accomplished by our previous SEO agency, which quite heavily spammed links (directories and all kinds of low-quality sites).
That being said, when the penalty hit we really didn't see any drop in traffic. In fact, not long after the penalty we launched a new website, and since then our traffic has grown quite significantly. We've doubled our total visits from before the penalty to now.
This previous SEO agency also submitted a couple of reconsideration requests (both done loosely, trying to fool Google by removing only a small number of links, then a bit more the next time when it failed - this was obviously never going to work). Since then, I myself have submitted a very thorough reconsideration request, disavowing 85 domains (every single one at the domain level rather than at the individual URLs, as I didn't want to take any chances), as well as getting a fair few links removed when webmasters responded. I documented all of this and made multiple contacts to the webmasters so I could show it to Google.
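For reference, a domain-level disavow file is just a plain-text list of `domain:` lines (with optional `#` comments), uploaded through Google's disavow tool. The domains below are made up for illustration:

```
# Links built by previous SEO agency - low-quality directories
domain:spammy-directory-example.com
domain:low-quality-links-example.net
domain:article-farm-example.org
```

Using `domain:` rather than listing individual URLs covers every page on that host, which is exactly the "not taking any chances" approach described above.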
This reconsideration request was not successful - Google made some new backlinks magically appear that I had not seen previously. But really, my main point is: am I going to do more damage by removing more and more links in order to lift the penalty, given that as it stands we haven't actually noticed any negative effects from it? Perhaps those effects went unnoticed because, not long after the penalty, we launched a much-improved new site that would naturally get more traffic than the old one, but overall the penalty has not been noticeably felt.
What do you guys think - is it worth risking a drop in rankings to remove the penalty so we don't face any future issues, or should I go lighter on the link removal in order to preserve current rankings? (I'm really interested to see people's views on this, so please leave a comment if you can help!)
-
That's the problem...it's often hard to tell whether a link is natural or not. For example, a local directory listing might be ok, but it could be unnatural. If it helps, I wrote a Moz article that describes different kinds of unnatural links: http://moz.com/ugc/what-is-an-unnatural-link-an-in-depth-look-at-the-google-quality-guidelines
-
Thanks for your response, you've clarified a lot for me here.
Essentially, as long as only the unnatural links are removed, I should not harm my site's rankings? That is, as long as Google agrees on which links are the unnatural ones!
I'd better get to work auditing all of these links - see you again in a few years! Haha.
-
"Google made some new backlinks magically appear that I had not seen previously."
This made me chuckle. Google is a strange animal. John Mueller has said many times that looking at your links in Webmaster Tools is enough, but I will often get back example unnatural links that are not in Webmaster Tools. This is one of the reasons why, when I do a backlink audit, I combine links from a number of different sources, including OSE, Ahrefs, and Majestic.
Now, I have seen sites lift penalties by going on their Webmaster Tools links alone, but really it's best to get them from multiple sources.
BUT...even when I combine every possible source I can find I will quite often get example links back from Google that don't exist on ANY backlink checkers. These are tough. But usually they are clues that can help you to find more links. For example, often when this happens it's a scraped version of a press release that is given. What I'll do is take a chunk of text in quotes and search for it on Google and often I'll find 3-4 additional links that weren't in my audit list.
Another thing you can do is periodically re-download your links from GWT, as new ones will often pop up even if they are years old.
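Mechanically, the combine-and-dedupe step can be sketched in a few lines of Python. The tool names, URLs, and domains below are all hypothetical; the point is simply normalizing every export to bare domains, taking the union, and flagging Google's example links that no tool reported:

```python
# Sketch: merge backlink exports from several tools (e.g. OSE,
# Ahrefs, Majestic, Webmaster Tools), dedupe at the domain level,
# and flag example domains Google cited that no tool reported.
from urllib.parse import urlparse

def to_domain(url: str) -> str:
    """Normalize a backlink URL to its bare host (strip scheme and www)."""
    host = urlparse(url if "//" in url else "//" + url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def combine_sources(*url_lists):
    """Union of linking domains found across every export."""
    return {to_domain(u) for urls in url_lists for u in urls}

# Hypothetical exports from three different backlink tools:
ose      = ["http://www.spammy-directory.com/listing/42"]
ahrefs   = ["http://spammy-directory.com/other-page",
            "http://blog.example.net/post"]
majestic = ["https://article-farm.org/press-release-copy"]

all_domains = combine_sources(ose, ahrefs, majestic)

# Example links Google returned that appear in no export:
google_examples = {"scraper-site.biz", "blog.example.net"}
missing = google_examples - all_domains  # -> {"scraper-site.biz"}
```

Domains left in `missing` are the ones worth hunting down manually, e.g. with the quoted-text search trick described above for scraped press releases.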
Are you going to do more harm to your site than good? That depends on how good you are at auditing links. If you're only getting rid of unnatural links then you won't hurt your site and you may even see an improvement in rankings either immediately, a few weeks after the penalty is lifted, or when Penguin refreshes. But, if you're guessing at your disavow decisions then yes, if you disavow good links you're going to do harm to your site.
Best of luck!
-
Keep doing what you're doing. As long as you know how to properly identify if a site/link is good or bad, you shouldn't hurt your site. Better to do this work now and prevent another penalty in the future than to put it off.
RE: total backlinks - I recommend combining and deduping Open Site Explorer, Webmaster Tools, Majestic, and Ahrefs for the most thorough picture.
-
It will often take multiple requests before Google removes a manual penalty, to ensure you have put enough effort into cleaning up your link profile.
What tools did you use to find your links? It's best to use a combination of tools to find all of the possible links to your site. The number of links you remove/disavow is relative to the size of your link profile; some sites have had to remove or disavow thousands of domains.
Ensure the links that you remove are exact-match anchor text links or those from directories, guest blogging, etc.
It's better to remove too many links than not enough, as even having a few poor links can result in Google marking you down, and if you're not thorough enough there's every chance you could get penalized again in the future. Also make sure your reconsideration request is clear and simple and clearly demonstrates the work you have done to remove or disavow the offending links.