Best to Leave Toxic Links or Remove/Disavow on a Site with a Low Number of Linking Domains?
-
Our site has only 87 referring domains (with at least 7,100 incoming links). LinkDetox has identified 29% of our backlinks as toxic and 14% as questionable. Virtually all of these links come from spammy sites.
We never received a manual penalty, but ever since the first Penguin update in 2012 our search traffic and rankings have dropped, with some uneven recovery over the last three years.
By removing/disavowing toxic links, are we risking that over-optimized anchor text will be removed and that rankings will suffer as a result? Are we potentially shooting ourselves in the foot? Would we be better off spending a few months building quality links from reputable domains before removing/disavowing the bad ones? Or are toxic links (as defined by LinkDetox) so bad that removing them should be a priority before taking any other step?
Thanks, Alan
-
If your site has in fact been negatively impacted by Penguin, you'll have to wait for the next Penguin refresh for your changes to have an impact. You may see fluctuations due to other algorithm changes though.
-
That is reassuring!!! My concern is that the number of domains that link to our site is so low (less than 100) that even removing low-quality links could be a negative.
In terms of seeing a ranking/traffic improvement, do you think we could expect it once the links are removed/disavowed or do you think it would occur only upon a Penguin update?
Thanks!!! Alan
-
Toxic, spammy links can only hurt you, and they are even more harmful when they make up a large percentage of your total backlinks. Remove/disavow them as soon as possible and work on earning legitimate, high-value backlinks. Over-optimized anchor text can be harmful as well.
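For anyone following along: the disavow file you upload through Google Search Console is just a plain UTF-8 text file, one entry per line. A minimal sketch (the domains and URLs below are placeholders, not real sites):

```text
# Disavow file — lines starting with "#" are comments and are ignored.

# Disavow every link from an entire domain (usually the safer choice
# for spammy sites, since it covers all pages on that domain):
domain:spammy-directory.example

# Disavow a single specific URL:
http://link-farm.example/links/page37.html
```

Domain-level entries (`domain:`) are generally preferred over individual URLs when the whole linking site is spammy, since new links from the same domain stay covered.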