On the use of the Disavow tool / Have I done it correctly, or what's wrong with my perception?
-
On one of my sites I used GSA Search Engine Ranker. I got some good links out of it, but I also got 4,900 links from a single domain. According to Ahrefs, one link from a domain is worth about the same as 4,900 links from that same domain, so I downloaded those 4,900 links and added 4,899 of them to the disavow tool, to keep my site stable in the rankings and safe from any future penalty. Is that a correct way to use the disavow tool? The site's rankings are unchanged so far.
-
Highland's analysis is correct. You should only use the disavow tool carefully, with a measured approach, and usually only when dealing with penalties.
-
Wait, you just saw a bunch of links and disavowed them because... you saw a bunch of links? Did you have any penalties or rank drops? Anything that would lead you to believe these links are actively harming your rankings?
Disavow is a cleanup tool, not a preventative tool. There's a really good reason why they tell you NOT to use this tool lightly. If I could put this on a giant neon sign I would, but here's what Google says on their help page (emphasis mine):
This is an advanced feature and should only be used with caution. If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.
If I were you, I'd yank that disavow file out right now.
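For anyone untangling an upload like the one described above, it helps to remember the disavow file is just a plain-text list, one rule per line. A minimal example (the domains here are hypothetical):

```text
# Disavow file uploaded via Google Search Console.
# Lines starting with "#" are comments and are ignored.

# Disavow every link from an entire domain (what the OP should
# have done instead of listing 4,899 individual URLs):
domain:spammy-directory.example.com

# Or disavow a single URL:
http://spam.example.com/stuff/comments.html
```

A single `domain:` line covers all 4,900 links from that one domain at once, which is also far easier to reverse later if you decide to yank the file.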
Related Questions
-
One guy using some Alexa rank tricks to gain high PR etc.?
Hi! One Finnish guy is getting a pretty nice Alexa ranking for his sites, even though the real traffic is nowhere near what would justify such a cool Alexa rank. I am a bit suspicious that he is using some low-bounce-rate, high-traffic boosters on his sites. I will give you some examples here to look into: Vihjepaikka(dot)com - created 2013-03-13 - Alexa rank 129k!!! - PR3 - backlinks not many quality ones. Casinolla(dot)net - created 2014-10-15 - Alexa rank 351k!!! - PR0 - backlinks 0!!! Cashadvance777(dot)com - created 2014-09-04 - Alexa rank 772k!!! - PR3 - backlinks 0!!! Let me know your thoughts on these. Cheers!
White Hat / Black Hat SEO | Kononen
Recovering from Google Penguin/algorithm penalty?
Anyone think recovery is possible? My site has been in Google limbo for the past eight months to a year or so. Like a lot of sites, we had SEO work done a while ago and ended up with tons of links that Google now looks down on. I've worked with an SEO company for a few months now and they seem to agree Penguin is the likely culprit; we are on pages 8-10 for keywords we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; some sites are hard to find contact information for, and some want money, so I paid a few a couple of bucks in hopes it might help the process. Anyway, we now have around 600 domains on the disavow file we put up in March-April, with another 100-200 added recently. If need be, a new site could be an option as well, but I'll wait and see if the site can improve on Google with a refresh. Anyone think recovery is possible in a situation like this? Thanks
White Hat / Black Hat SEO | xelaetaks
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized above bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. So I came up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from the total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about the experts' opinions...
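A minimal sketch of the dynamic-503 idea, assuming a POSIX host; the bot signatures, load threshold, and `handle_request` hook are hypothetical placeholders, not the poster's actual CMS code:

```python
import os

# Hypothetical tuning values -- adjust for your own server.
LOAD_THRESHOLD = 4.0  # 1-minute load average above which bots get throttled
BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot", "semrushbot")

def should_throttle(user_agent, load_avg):
    """Return True if this request should get a 503 instead of content.

    Only bot traffic is ever throttled; human visitors always get content,
    which matches the poster's goal of prioritizing user traffic.
    """
    is_bot = any(sig in user_agent.lower() for sig in BOT_SIGNATURES)
    return is_bot and load_avg > LOAD_THRESHOLD

def handle_request(user_agent, load_avg=None):
    """Central per-request hook: one rule for all sites on the server."""
    if load_avg is None:
        load_avg = os.getloadavg()[0]  # 1-minute system load (POSIX only)
    if should_throttle(user_agent, load_avg):
        # Retry-After tells well-behaved crawlers to back off and retry later.
        return 503, {"Retry-After": "3600"}
    return 200, {}
```

Because the decision reads the whole machine's load average rather than one site's traffic, it addresses points 2) and 3) centrally. One caution: Google documents 503 as a temporary signal, so the throttling should only ever kick in during load spikes, not persist for days.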
White Hat / Black Hat SEO | internetwerkNU
Do industry partner links violate Google's policies?
We're in the process of _The Great Inquisition_: piecing together a reconsideration request. In doing so, we reached out to an agency to filter and flag our backlinks as safe, should be no-followed, or should be removed. The problem is, they flagged several of our earned, industry-partner links (like those pointing to us, HireAHelper, from 1-800-Pack-Rat and PODS, for example) as either should be no-followed or should be removed. I have a hard time believing Google would penalize such a natural source of earned links, but then again, this is our second attempt at a reconsideration request, and I want to cover all my bases. What say you, Moz community? No-follow? Remove? Leave alone?
White Hat / Black Hat SEO | DanielH
Negative SEO and when to use the Disavow tool?
Hi guys, I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've been in it about six months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche. So our backlink profile is very specific, with few spammy links. Over the last week, however, we have received a huge increase in backlinks which has almost doubled our total of linking domains. I've checked the links in Webmaster Tools and they are mainly directories or web-stats sites like siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com and stuffgate.com. We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links point to that page, which looks very suspicious to me. Does this look like negative SEO to anyone? I've read a lot about the disavow tool and people's opinions seem split on when to use it, so I was wondering if anyone had advice on whether to use it or not. It's easy for me to identify these new links, yet some of them have decent DA, so will they do any harm anyway? I've also checked the referring anchors on Ahrefs, and over 50% of my anchor-term cloud is now terms totally unrelated to my site; this has happened over the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile. So would it be wise to disavow these links as they come through, or wait to see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they would be ignored or whether I will be penalised. Any ideas? Thanks in advance, Richard
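One practical way to stay on top of a spike like this is to diff successive backlink exports and review only the newly appearing referring domains. A rough sketch, where the example URLs are placeholders rather than real export data:

```python
from urllib.parse import urlparse

def referring_domains(backlink_urls):
    """Collapse a list of backlink URLs to the set of referring hosts."""
    return {urlparse(u).hostname for u in backlink_urls
            if urlparse(u).hostname}

def new_domains(old_urls, new_urls):
    """Domains present in the latest export but not the previous one."""
    return sorted(referring_domains(new_urls) - referring_domains(old_urls))

# Example: compare last month's link export against this week's.
last_month = ["https://goodblog.example/post",
              "https://niche-directory.example/listing"]
this_week = last_month + ["http://stuffgate.com/mysite",
                          "http://alestat.com/mysite"]

# Only the fresh domains need a manual look before any disavow decision.
print(new_domains(last_month, this_week))
```

Running this weekly against Webmaster Tools or Ahrefs CSV exports turns "a huge spike in links" into a short, reviewable list, so you can disavow selectively instead of reacting to the raw totals.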
White Hat / Black Hat SEO | Rich_995
Preparing for Penguin: Remove, Disavow, or change to branded
For someone who has 80 root domains pointing to their domain, 10 of which are sitewide backlinks from 10 PR4+ sites, all paid for, all with the same main-keyword anchor text: should I advise him to remove the links, disavow the links, disavow then remove, or just change the 10 sitewide links to branded anchor text? Another option is to keep just one link (preferably editorial) from each site. The only reason not to pull them off right away is that the client could not sustain his business with a drop in sales; these are by far his strongest 10 root domains. Eventually, when he has enough good backlinks, these are all coming off. There was a huge drop in sales for this site last fall, but it recovered almost completely after fixing keyword stuffing and adding e-commerce content. I'm looking to keep his sales and also prepare for this year's updates.
White Hat / Black Hat SEO | BobGW
Google 'most successful online businesses'
How come this guy has all but one of the top ten results? (UK results; I'm guessing it's the same in the USA?) With thin content on a spammed keyword across multiple subdomains? How can we 'white hat' guys compete if stuff like this is winning?
White Hat / Black Hat SEO | TheInternetWorks
Why doesn't Google find different domains - same content?
I have been slowly working to remove near-duplicate content from my own website for different locales. Google seems to be doing nothing to combat the duplicate content of one of my competitors, which is showing up all over southern California. For example:

Your Local #1 Rancho Bernardo Pest Control Experts | 858-352... (www.pestcontrolranchobernardo.com) - "Pest Control Rancho Bernardo Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 858-352-7728."

Your Local #1 Oceanside Pest Control Experts | 760-486-2807... (www.pestcontrol-oceanside.info) - "Pest Control Oceanside Pros specializes in the eradication of all household pests including ants, roaches, etc. Call Today @ 760-486-2807."

The competitor is getting high page-1 listings for massively duplicated content across web domains. Will Google catch this black-hat workmanship? Meanwhile, he's sucking up my business. Do the competitor's results also speak to the possibility that Google does in fact rank based on the name of the URL, something that gets debated all the time? Thanks for your insights. Gerry
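As a side note, near-duplicate detection of the kind Gerry expects Google to run can be sketched with word-shingle Jaccard similarity. This is only an illustration of the general technique, not how Google actually scores pages; the two sample texts are adapted from the SERP snippets above:

```python
def shingles(text, k=3):
    """Set of k-word shingles from the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b, k=3):
    """Jaccard similarity of the two texts' shingle sets, in [0.0, 1.0]."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = ("Pest Control Rancho Bernardo Pros specializes in the eradication "
          "of all household pests including ants, roaches, etc.")
page_b = ("Pest Control Oceanside Pros specializes in the eradication "
          "of all household pests including ants, roaches, etc.")

# Swapping only the city name leaves most shingles shared, so the
# score lands well above what two unrelated pages would produce.
print(round(jaccard(page_a, page_b), 2))
```

Templated city-swap pages like the competitor's score high on measures like this, which is why they are usually assumed to be detectable; whether and when Google acts on it is a separate question.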
White Hat / Black Hat SEO | GerryWeitz