Do you have to wait after disavowing before submitting a reconsideration request?
-
Hi all
It seems we have a link penalty at the moment. I went through 40k links in various phases and have disavowed over a thousand domains dating back to old SEO work. I was barely able to get any links removed, as the majority are on directories that no one maintains any more, and/or are spammy and scraped anyway.
According to Link Research Tools' Link Detox tool, we now have a very low-risk profile (I loaded the disavowed links into the tool for it to take into consideration when assessing our profile). I then submitted a reconsideration request on the same day as uploading the new disavow file (26th of April). However, today (7th May) we got a message in Webmaster Central that says our link profile is still unnatural. Aaargh.
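For reference, this is the shape of the disavow file I uploaded (example entries only, not our real domains - it's plain text, with # for comments, domain: lines for whole domains, or bare URLs for individual pages):

```text
# Uploaded 26th of April (example entries)
# A domain: line disavows every link from that domain
domain:spammy-directory-example.com
domain:scraped-content-example.net
# Individual URLs can also be listed on their own
http://random-blog-example.com/old-seo-post.html
```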
My question: is the disavow file taken into consideration when the reconsideration request is reviewed (i.e. is that information immediately available to the reviewer)? Or do we have to wait for the disavow file to flow through in the crawl stats? If so, how long do we have to wait?
I've checked a link that I disavowed last time and it's still showing up in the links I pull down from Webmaster Central; indeed, links I disavowed at the start of April are still showing up in the downloadable list.
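For anyone wanting to run the same check, here's roughly how I cross-referenced the downloaded links against the disavow file, sketched in Python (nothing here is tool-specific; adapt the file reading to your own exports):

```python
from urllib.parse import urlparse

def disavowed_domains(disavow_lines):
    """Collect the domains covered by 'domain:' entries in a disavow file,
    ignoring comment lines and individual-URL entries."""
    domains = set()
    for line in disavow_lines:
        line = line.strip()
        if line.startswith("domain:"):
            domains.add(line[len("domain:"):].lower())
    return domains

def host_of(url):
    """Normalise a link URL down to its bare host (no leading 'www.')."""
    host = (urlparse(url).hostname or "").lower()
    return host[4:] if host.startswith("www.") else host

def still_showing(link_urls, disavow_lines):
    """Links whose domain is already disavowed but which still appear in
    the latest links download from Webmaster Central."""
    domains = disavowed_domains(disavow_lines)
    return [u for u in link_urls if host_of(u) in domains]
```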
Any help gratefully received. I'm pulling my hair out here, trying to undo the dodgy work of a few random people many months ago!
Cheers,
Will
-
You seem to have a good handle on the issue, but you might consider getting an experienced SEO in for at least a second opinion. We can only give very general help here on the Q&A, as we don't have access to your data.
They do say to wait at least a few weeks for results.
Cheers
S
-
Hi Stephen
I've been using the links downloaded from Webmaster Central (as directed by Matt Cutts in one of his videos, IIRC) plus the data set from Link Research Tools. Is that insufficient? I've only got so many hours in the day, as my day job is running this company... I figured taking the links that Google gave me would surely be enough... but these days, who knows. G seems to want to make people jump through a lot of hoops...
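For what it's worth, merging the various exports into one de-duplicated list of linking domains is the mechanical bit; a rough sketch in Python (the URL column name is an assumption - every tool labels its export differently, so check each file's actual header):

```python
import csv
from urllib.parse import urlparse

def load_urls(csv_path, url_column="URL"):
    """Read link URLs out of one tool's CSV export (GWT, Ahrefs, Majestic...).
    The column name is an assumption - adjust per tool."""
    with open(csv_path, newline="") as f:
        return [row[url_column] for row in csv.DictReader(f) if row.get(url_column)]

def merged_link_domains(url_lists):
    """Merge several exports into one de-duplicated, sorted list of linking
    domains, so the clean-up starts from the fullest picture available."""
    domains = set()
    for urls in url_lists:
        for u in urls:
            host = (urlparse(u).hostname or "").lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return sorted(domains)
```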
-
Hey Marcus
Thanks for your input. Yeah, we have a lot of links, but then we've been around for 7 years, and weirdo scrapers and random replicants of DMOZ alone contribute a zillion links without us even having done anything. Not saying we didn't do link building back in the day (we did, just like everyone else, in what was at the time a white-hat fashion but apparently no longer is), but we've had no permanent marketing team at all for the last two years as we've focused on some B2B parts of our business. So frustrating that bad links just kept growing and we're supposed to be responsible for them!
Anyway, as you say, I'll need to go in a bit harder, I guess. E.g. I didn't previously remove a site just because it was PR0, as some random person with a no-marks blog who used our birthday balloon picture didn't deserve to be disavowed, as far as I thought. But, well, I can't take any chances now, so I'll just have to bin anything under PR1 and take another look at links from themed websites (e.g. should I disavow other blogs that have added us to their blogroll unsolicited, even if they're in our vertical? It's hard to tell. What about genuine flower directories? Who knows?).
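In case it's useful to anyone, the PR cut-off pass is easy enough to script; a rough sketch (the 'domain' and 'pagerank' column names are assumptions about the export, not a standard - adjust to whatever Link Detox or your tool of choice actually produces):

```python
import csv

def candidates_below_pr(csv_path, pr_threshold=1):
    """Pick out linking domains below a PageRank cut-off from a link-audit
    CSV export. Rows without usable PR data are left for manual review."""
    picks = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                pr = int(row["pagerank"])
            except (KeyError, ValueError):
                continue  # no PR data: don't auto-bin, look at it by hand
            if pr < pr_threshold:
                picks.append(row["domain"])
    return picks

def to_disavow_lines(domains):
    """Format a de-duplicated list of domains as disavow-file entries."""
    return ["domain:%s" % d for d in sorted(set(domains))]
```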
What's really frustrating is that the whole message from Matt Cutts is "you really shouldn't use this tool" (re: disavow) as you could damage your site, but 1. barely anyone takes links down when requested, as far as I can tell, and 2. given the amount of junk that's been pointed at our site that we're not responsible for (though we are responsible for some), I think the contention that very few people would need to use it is a bit optimistic, and there's therefore a danger of people like me totally shooting themselves in the foot, given there are no clear rules on the grey areas I mention above.
PS: understood that it's not some magic solution after which we'll rank #1 for everything. I just want to get it cleared up and be able to get back to my day job. God knows how a smaller business than us would cope with something like this. Seems to me it pushes the advantage even further in the direction of bigger companies with the resources to manage a screw-up like this.
Anyway, blah blah. Time to get the machete out.
-
In my experience, if you have this message again, you still have links they don't like. 35% of linking domains is not a great deal, and as Stephen said, whilst Link Detox gives you a good starting place, you really do have to audit these links in a brutal fashion.
You have 15,000 external links from 2,000 sites - that's a hell of a lot of links for a semi-popular blog, let alone a site that does not really publish any content that would attract links.
If you are holding onto links as you think they are 'ok' or because they 'don't look too bad' then you may need to get a whole lot more aggressive with what you remove.
Also, even once you get the manual penalty removed, don't expect things to be amazing afterwards.
An alternative approach to finding the bad links and getting them removed is to identify the good ones and consider getting them repointed to a new URL, starting again with a rebrand / new URL. It can be easier to get a response from the good sites than from the bad ones.
Failing that get a whole lot more aggressive with what you remove.
Hope that helps!
Marcus
-
How sure are you that you have a full dataset of links? What did you use as your database of links to start cleaning from? (I would expect Ahrefs, GWT, SEOmoz, Majestic, etc.)
S
-
Well, I also went through all the links manually which was the world's most boring task, then followed up with a healthcheck. Gah.
We've disavowed about 35% of all linking domains now...
-
I doubt it's a time thing; it's more likely that they still see dirty links that you have not disavowed.
That's the problem with these jump-on-the-bandwagon tools like Link Detox et al - they give you a nice score, but that doesn't mean anything.
404ing burnt pages and starting again may be a much quicker process than messing around with link disavowal.
How many domains were linking, and how many domains did you disavow?