When to remove bad links?
-
Hi everyone.
We were hit on the 5th of October with a manual penalty. After building some good links and good content, we saw some gains in our SERPs; not back to where they were, but they are definitely improving for some low-competition keywords.
In this case would people recommend still trying to remove bad links?
We have audited our links and identified ones which seem spammy.
We were going to go through a step by step process, emailing bad link providers where possible, and then sending a disavow for any links we were not able to remove.
If we have started to see gains through other means, is it wise, in people's opinion, to start contacting Google?
We watched Matt Cutts' video on disavow usage, and he states not to use it except in extreme situations, so we don't want to 'wake the beast'.
Many thanks.
James.
-
Our links were from an SEO company who always vowed their methods totally adhered to Google's guidelines, but that was before Penguin.
I have heard this exact statement countless times. I hate to be harsh on my own industry but things are quite bad for clients. They do not know who to trust, with good reason.
- Many "SEO agencies" have little to no SEO knowledge. They skipped everything else and built links, which worked too well in the past, and now many site owners are paying the price.
- Many of these same agencies outsourced 100% of their work to other countries, where the work was performed in the lowest-quality manner, despite assurances to the contrary.
- Many sites appear to be US or UK companies, but a quick inspection shows the veil is very thin; these are actually companies from India or other countries who pay for a virtual office or a single small office in order to funnel business.
Companies and site owners need to know how to navigate the shark infested waters of SEO and work with quality service providers.
Regarding your Penguin issue, based on the information provided your efforts are not even close to what is required to resolve the issue.
1. A comprehensive backlink report is necessary to capture all known links to your site. I use data from Bing, Google, OSE, Majestic and AHREFS. Once combined, this report is the most comprehensive list in the industry. There is no single source, nor any two sources, which can be used to properly capture all the links to your site.
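Combining the exports by hand is tedious; a minimal sketch of the merge step might look like the following. The file names and URL column headers are hypothetical, since each tool names its columns differently in its CSV export.

```python
import csv
from urllib.parse import urlsplit

def load_links(path, url_column):
    """Read one tool's CSV export and return the set of linking URLs."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[url_column].strip() for row in csv.DictReader(f) if row.get(url_column)}

def merge_backlinks(exports):
    """Combine exports from several tools into one de-duplicated, sorted list.

    `exports` is a list of (path, url_column) pairs; the column name for the
    linking URL must be supplied per tool.
    """
    combined = set()
    for path, url_column in exports:
        combined |= load_links(path, url_column)
    return sorted(combined)

def linking_domains(urls):
    """Collapse a URL list to unique linking hosts for per-domain review."""
    return sorted({urlsplit(u).hostname for u in urls if urlsplit(u).hostname})
```

Reviewing by linking domain rather than by individual URL usually cuts the manual classification work dramatically, since spammy links tend to cluster on a handful of domains.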
2. The links need to be properly identified. Most site owners and even SEOs struggle in this regard. It cannot be done by any automated tool as there are far too many errors.
3. A comprehensive Webmaster Outreach Campaign needs to be conducted, and it needs to be successful. On a bad campaign the success rate should be about 25%. On a good one, the success rate exceeds 50%. There are numerous factors involved.
I know you are probably thinking "no way! I only get 1 out of 100 site owners to respond". The problem I see is most site owners chose the easy way out when they built manipulative links, and they similarly choose the easy way out when attempting to remove them. That is why forums are full of site owners sharing "I have turned in 10 Reconsideration Requests and all of them were declined".
You need to eliminate a "significant" number of links before using the Disavow Tool. My recommendation is to seek out a quality SEO provider with experience in resolving Penguin issues. If you cannot afford the cost of cleaning up the manipulative links, you can also change domains. The cost of losing all your good links and changing domains is very high in the long term, but in the short term the expenses are quite minimal.
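For the links you cannot get removed, the disavow file itself is just a plain text upload: one entry per line, where a `domain:` prefix disavows an entire linking domain and a bare URL disavows a single page, with `#` lines as comments. The domains below are made-up examples, not a recommendation of what to disavow:

```text
# Outreach completed; the site owners below did not respond.
# Disavow an entire linking domain:
domain:spammy-directory.example
# Disavow a single page:
http://article-farm.example/seo-links/page1.html
```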
-
-
Hi Ryan.
I guess I should assume this is a Penguin issue now; perhaps thinking it was a manual penalty was incorrect and a little ignorant of me.
I think it is caused by bad links; in my opinion the content is written normally, there are very few issues with it, and it is quite varied and updated. Our links were from an SEO company who always vowed their methods totally adhered to Google's guidelines, but that was before Penguin.
Over the last month or so the SERPs have started to go up, after some natural link building with related sites in the same language (French), and some extra additions to the content.
We have been contacting the websites deemed 'spammy' to ask them to remove the links; one out of a few hundred has complied so far.
(Is 'disavow' still a tool we could eventually use in your opinion?)
I guess we are a little in the dark as to whether the site is penalized, or whether the link juice from the spammy sites simply disappeared after Penguin, which I guess would be the better reason for SERP loss for our site.
-
Hi James,
I am pleased to hear no manual actions have been taken on your site. You are correct in stating you should not submit any further Reconsideration Requests.
As I look back to your original Q&A, you stated you were impacted by a manual penalty on October 5th. What led you to make that statement?
If your site suffered a ranking drop, you can analyze your analytic data to determine exactly when that drop occurred, and what segment(s) were impacted. Did the drop only impact Google organic? If so, that would indicate an algorithm issue. If the drop impacted other traffic sources, it may be a downturn for your business or industry. In summary, a traffic drop analysis is needed.
If you know your site acquired spammy links (i.e. you hired link builders or "SEOs") then you may be impacted by Penguin. If you have low quality content, which includes thin and duplicate content, then you may have a Panda issue. There are numerous other algorithm changes besides those two. There could be a new issue on your site as well. It's time to dive into your analytics to gain all the data possible surrounding this drop in traffic.
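The traffic drop analysis above can be partly automated. As a minimal sketch, assuming a hypothetical daily export from your analytics package as (date, visits) pairs, the steepest day-over-day decline dates the event, which you can then match against known algorithm update dates:

```python
def largest_drop(rows):
    """Given (date_string, visits) pairs sorted by date, return the date
    with the steepest day-over-day decline and the fraction of traffic lost."""
    worst_day, worst_pct = None, 0.0
    for (d1, v1), (d2, v2) in zip(rows, rows[1:]):
        if v1 > 0:
            pct = (v1 - v2) / v1  # positive means a decline
            if pct > worst_pct:
                worst_day, worst_pct = d2, pct
    return worst_day, worst_pct
```

Run it once per segment (Google organic, other search engines, direct, referral): a drop isolated to Google organic points to an algorithm or penalty issue, while a drop across all segments points to the business or industry.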
-
Hi Ryan,
Just to follow up...
We got our response from Google today; they confirm no manual penalties have been applied.
'We've reviewed your site and found no manual actions by the web spam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site because any ranking issues that you may be experiencing are not related to a manual action taken by the web spam team.' (Google)
Would this indicate just an algorithm change? In this case, would you still recommend the disavow and removing links? They say we should not send another reconsideration request, so we are not really sure where to take it from here.
Many thanks,
James.
-
Thanks Marcus, I know it is solid advice, we have taken it on board and plan to use it.
-
James, this is real solid advice, and you have to look at the long-term picture. Whether or not you are currently penalised due to spammy links, if they exist and you know about them, there is a noose there ready for your site to slip its neck into.
If you have the resources and care about the long-term game, get everything cleaned up, and you can push forward in a positive way without having to worry about potential problems rearing their head, or the positive value of solid links being diminished by historical issues.
Great advice as ever from Ryan.
-
You said your website is making progress on some less competitive keywords. If this is the case, I think this is not a severe penalty. But since this is a manual penalty, you have to (and I mean it) send a reconsideration request and wait for the response. And yes, no such beast exists here. You've got a problem and you have to fix it.
-
That would be just fine.
-
Sorry to disturb you again, would this be a good first contact message on the reconsideration form?
'Our rankings dropped for this site. We are trying to do everything possible to make it compliant with Google's guidelines - please can you tell us if there is any manual action taken on the site that we can fix.'
James.
-
Exactly!
-
Thanks for the comprehensive answer! It is really appreciated. So even if no warning message was received by us, you recommend first sending a reconsideration request, just asking them if the site has been penalised, at the very beginning, while we are still in the process of removing links?
And then, if the answer is 'yes', sending another recon request when we have done our best at removing any spammy links?
-
Time and money is less of an issue, we just want to do what is best for the site
That is a fantastic position. SEO is a long-term proposition. This thinking should guide your entire decision making process.
Some people have mentioned in the past that sending a reconsideration request could do more harm than good
I cannot comment on what "some people" have shared. I read a lot of SEO related articles and there is a high percentage of questionable and outright incorrect information shared. I would ask you exactly who shared the advice and in what context.
Here is what Matt Cutts has shared on this topic: http://www.youtube.com/watch?v=5rsWc78dits
My strong opinion on the matter is as follows:
1. If no manual actions have been taken on your site, Google auto-replies. Accordingly, there is no harm in asking.
2. Matt shared in a different video (sorry, I was unable to easily locate the link) that his team does not go looking for problems as a result of a Reconsideration Request. Based on my knowledge and experience, if you have a penalty on your site and you submit a Reconsideration Request asking if you have a penalty, a member of the spam team will likely just push a button and share the canned response Google offers for penalties of that type. A Google employee would not go searching your site looking for issues.
3. For 100% of clients, I submit a Reconsideration Request upon accepting them as clients. It has never once been a cause for concern on any level.
Running a website is the act of a business. You cannot run your business in fear, and there is no reason to fear any aspect of the Google Reconsideration Request process as a white hat SEO or site operator.
Is it worth just removing links with no reconsideration request? Or is that essential?
It is essential to submit a Reconsideration Request if you are manually penalized.
One final note. There are legitimate other opinions on this topic. I have tremendous respect for Dr. Pete and agree with his approach 99.9% of the time, but I do recall him sharing a different viewpoint on this topic, suggesting site owners not submit a Reconsideration Request unless they had reason to believe they were penalized. Even if that were the case, in your instance there is strong reason to believe a manual penalty may exist on your site. Submit the Reconsideration Request and find out. Knowing is better than not knowing.
-
Hi, I agree with what you're saying. One other reason not to address the penalty is that we have not received any warning in Webmaster Tools. The ranks are now, let's say, 10-20 positions further down in the SERPs than they were originally, but they have gained perhaps 20-30 positions in the last few months.
Some people have mentioned in the past that sending a reconsideration request could do more harm than good, (I don't know if that is true, just something I read).
Is it worth just removing links with no reconsideration request? Or is that essential?
Time and money is less of an issue, we just want to do what is best for the site.
-
Hi James,
We were previously hit with a manual penalty and did 3 re-submissions before the manual penalty was removed.
Google just released a disavow tool in Webmaster Tools where you can effectively tell Google which links you don't want. Check it out here. I'd read up on it first; there are lots of pros and cons.
My advice: show Google you are trying to do good. Highlight the links you don't like and have had removed or asked to have removed; keep it all in a spreadsheet / Google Docs, noting which ones have been removed, asked to be removed, or aren't your fault.
Then, once you're confident you've cut out the bad, resubmit with the evidence, close your eyes, cross your fingers, and wait roughly 2-4 weeks in the hope they will remove the manual penalty.
But be warned you may be in for the long haul, I mean months.
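The spreadsheet-to-disavow step described above can be sketched in a few lines. This assumes a hypothetical tracking CSV with `url` and `status` columns (e.g. removed / requested / no-response); anything not marked removed goes into the disavow file:

```python
import csv

def build_disavow(tracking_csv, out_path):
    """From a removal-tracking CSV with `url` and `status` columns, write a
    disavow file containing only the links outreach failed to remove."""
    with open(tracking_csv, newline="", encoding="utf-8") as f:
        remaining = [r["url"] for r in csv.DictReader(f)
                     if r["status"].strip().lower() != "removed"]
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("# Links we asked to have removed but could not.\n")
        for url in sorted(remaining):
            f.write(url + "\n")
    return remaining
```

Keeping the outreach log and the disavow file in sync this way also gives you ready-made evidence of your clean-up effort for a reconsideration request.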
-
after building some good links and building good content we saw some gains in our SERPs, not to where they were, but they are definitely improving for some low competition keywords.
The degree of penalization for manipulative links varies greatly from site to site. In the worst case, your site does not rank for anything except your domain name entered with the TLD (i.e. mysite.com). It sounds like in your case you are penalized, but not severely.
You can create new pages and rank for those new terms, but your penalization will remain a problem until you deal with it. You are asking if you can ignore the penalty. I would suggest that would be unwise. Why?
1. Most sites built links to their most important pages / keywords. For small to medium businesses, a group of a few keywords typically produces a large chunk of their traffic. For example "Los Angeles Auto Insurance" may provide 40% of the traffic to a website whereas those other pages you are building do not even provide 1% of the traffic of the core keyword.
2. It is hard enough for a non-penalized site to compete for traffic in search results. To move up a single position in ranking can make a huge difference in sales. It is likely at some point you will want to improve back to your pre-penalized ranking. The first step you need to take is removing the penalty.
3. You are presuming you will not be further penalized. In August Matt Cutts shared future Penguin changes were coming and the effects would be "jarring and jolting". I suspect the sites which are currently penalized and ignored the penalty will be further penalized.
The sole reason not to address the penalty is the cost (time / money). I would suggest you do whatever it takes to remove the penalty, then deal with the costs later. Sure, that's easy for me to say but the question is, how committed are you to this business? If you had the website up 5 years ago and intend to be in business 5 years from now, then it is an easy call. Remove the penalty and distribute the costs over time.