Using Moz to weed out bad backlinks
-
How do you use Open Site Explorer to weed out bad backlinks in your profile, and then how do you remove them if you cannot contact the various webmasters?
-
Just using OSE? I would sort the links you export by Page Authority (PA). Start with the lowest number and work your way up, removing the good-looking links so that you are left with the bad links.
If you want to use other tools, you can use Scrapebox, or even SEO Tools for Excel (http://nielsbosma.se/projects/seotools/) to check their PageRank first (or, better yet, check whether the link is indexed in Google before checking PR).
If it's deindexed, then you don't have to waste your time on it. Put it on a separate list and request removal once you are done sorting the links. If the link is no longer present but the site is bad, disavow it.
PS: I would also export links from Google Webmaster Tools.
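The sorting step above can be scripted instead of done by hand in a spreadsheet. A minimal sketch, assuming the export is a CSV with `URL` and `Page Authority` columns (the column names and sample data here are assumptions; adjust them to match whatever your tool actually exports):

```python
import csv
import io

# Hypothetical sample standing in for an OSE (or Webmaster Tools) CSV export;
# real exports will have different columns depending on the tool and version.
SAMPLE_EXPORT = """URL,Page Authority
http://site-a.example/page,41
http://site-b.example/page,8
http://site-c.example/page,23
"""

def sort_links_by_pa(rows):
    """Sort backlink rows by Page Authority, lowest first, so the
    weakest links (the likeliest removal/disavow candidates) come first."""
    return sorted(rows, key=lambda row: float(row["Page Authority"]))

rows = list(csv.DictReader(io.StringIO(SAMPLE_EXPORT)))
for row in sort_links_by_pa(rows):
    print(row["Page Authority"], row["URL"])
```

From there you can work down the list from the top, moving anything that checks out onto a "good" list and keeping the remainder as removal/disavow candidates.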
-
If you can't contact them and they are links that you don't want to count for (or against!) your website, then you should disavow them.
Interestingly, Googler John Mueller did a Google Hangout this week where the difference between disavowal and removal was discussed. If you haven't received a manual penalty warning, there is apparently "virtually no difference". That makes disavowal a much more efficient way of dealing with the issue anyway.
Video of the hangout here, at approx. 14 mins 40 secs.
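For reference, the disavow file Google's tool accepts is plain text with one entry per line: a full URL to disavow a single page, a `domain:` prefix to disavow a whole domain, and `#` for comments. A sketch with placeholder domains:

```text
# Links reviewed; webmasters could not be reached
domain:spammy-directory.example
http://another-bad-site.example/links/page-42.html
```

Disavowing the whole domain is usually safer for genuinely spammy sites, since the same site often links from multiple pages.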
Related Questions
-
Use 302 redirect when site crashes
My company has switched to a new ecommerce platform that we are not totally familiar with yet. As we've worked with it, we've had a couple of situations where both the front and back ends of our site crashed simultaneously (always after installing a third-party module). The platform's built-in backup solution hasn't been an option in those situations, so we've been coming up with alternatives. We now have a duplicate of the site on our server for such emergencies. The plan is to have pages on the broken site point to the backup site using 302 redirects until the broken site is fixed. Is this correct usage of the 302 redirect? I often see people recommend never using 302 redirects, but I thought this might be the kind of situation where they'd be appropriate. If so, are there other SEO considerations we should keep in mind? For example, I'm wondering if we should put canonical tags on the temporary site that point to the broken site, so the broken site stays in the search engine indexes.
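A 302 is the temporary-redirect status code, so it does fit this scenario. One way to wire it up, sketched with Apache mod_rewrite and hypothetical hostnames (swap in your real domains):

```apache
# Emergency only: temporarily send all traffic to the backup copy of the site.
# 302 (temporary) tells search engines to keep the original URLs indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://backup.example.com/$1 [R=302,L]
```

Because the redirect is temporary, search engines should continue to index the original URLs rather than the backup's.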
Technical SEO | | Kyle_M1 -
How to use robots.txt to block areas on a page?
Hi, Across the category/product pages on our site there are archive/shipping-info sections, and the text is always the same. Would this be treated as duplicate content and be harmful for SEO? How can I alter robots.txt to tell Google not to crawl that particular text? Thanks for any advice!
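Worth noting: robots.txt operates on URL paths, not on sections within a page, so it can only help here if the repeated text lives at its own URL. A sketch with a hypothetical path:

```text
# robots.txt — blocks crawling of a standalone shipping-info URL,
# but cannot block a section embedded within a product page
User-agent: *
Disallow: /shipping-info/
```

Repeated boilerplate embedded in each page would need a different approach, such as consolidating it or loading it in a way crawlers don't index.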
Technical SEO | | LauraHT0 -
Using canonical for duplicate content outside of my domain
I have two domains for the same company, example.com and example.sg. Sometimes we have to post the same content or event on both websites, so to protect my websites from a duplicate content penalty I use a canonical tag pointing to either .com or .sg, depending on the page. Any idea if this is the right decision? Thanks
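Google does support cross-domain canonicals, so this approach is reasonable. On the duplicate copy (say, the .sg page), the tag would look like this (the URL is a placeholder):

```html
<!-- In the <head> of the example.sg duplicate, pointing at the .com original -->
<link rel="canonical" href="http://www.example.com/event-page" />
```

The page chosen as canonical should be consistent for each piece of content, so each duplicate always points at the same original.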
Technical SEO | | MohammadSabbagh0 -
Which address do I use for citations
Hello, When I created my Google Places listing, I entered my address, and when the listing was activated I noticed that Google Places was displaying a shortened abbreviation of my address. So my question is: when it comes to creating citations for my listing, do I use the address Google Places generated, or the long version of my address? I've heard that when creating citations, you need to make sure the address is identical across the board. I hope this makes sense. Thanks!
Technical SEO | | fbbcseo0 -
Has anyone used a company to help promote their site
Hi, I receive around ten emails a day claiming they can get your site into the top ten in Google. Now, I know most are a load of rubbish, but I am just wondering if anyone has used any of these companies for a new or an old site. I am about to launch a new site after Xmas, and I am wondering if any of these companies are worth looking at to help promote the new site instead of doing all the groundwork myself. Would love to know your thoughts.
Technical SEO | | ClaireH-1848860 -
Is there a tool to figure out bad backlinks?
With the recent changes to the Google algorithm, I'm trying to figure out which links Google may think are hurting my site. Any thoughts? Thanks
Technical SEO | | MQMORAN23230 -
Remove a directory using .htaccess
Hi, Can someone tell me if there's a way, using .htaccess, to say that everything in a particular directory (let's call it "A") is gone (HTTP 410 code), i.e. that all the links should be de-indexed? Right now I'm using the robots.txt file to deny access. I'm not sure that's the right thing to do, since Google Webmaster Tools is still showing the links as indexed, along with a 403 error code. Thanks.
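In Apache, mod_alias can return a 410 for everything under a directory. A sketch, assuming the directory really is named "A":

```apache
# .htaccess: mark everything under /A/ as gone (HTTP 410)
RedirectMatch 410 ^/A/
```

Note that a robots.txt block would work against this: if Google can't crawl the URLs, it never sees the 410, which is consistent with the links still showing as indexed. Removing the robots.txt rule and letting Google crawl the 410 responses lets them drop out of the index.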
Technical SEO | | webtarget0 -
When does it make sense to use no-follow on your own domain?
Hey guys, I'm not sure if I'm over-thinking this, but I've seen nofollow being used on SEOmoz and I'm looking to implement it myself. Most of my links point to my root domain (yes, I'm working on building links to deep pages), so would it make sense to 'limit' or nofollow links on my root domain so that only the most important pages are passed link juice? Thanks
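For reference, a nofollow on an internal link is just the `rel` attribute on the anchor; a minimal example with a placeholder path:

```html
<!-- A link you don't want to pass link equity through -->
<a href="/login" rel="nofollow">Log in</a>
```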
Technical SEO | | reegs0