Is there a tool to find out if a URL has been deemed "spam" by Google?
-
I am currently doing a link audit on one of my sites and I am coming across some links that appear to be spam. Is there a tool that I can plug a URL into to see if it has been deemed spam by Google?
-
A few things you can try:
- Google the URL; if it doesn't come up, there's a strong possibility it was de-indexed and has a penalty.
- Use a service like Link Detox, which gives a rough indication of which links it considers nasty.
- Majestic SEO has a neat tool for finding out what it thinks of sites - https://www.majesticseo.com/reports/neighbourhood-checker
- Similar to the above, http://spyonweb.com/ can be handy for working out link wheels.
- Look into the site's stats, e.g. Trust Flow, authority, etc. I recommend tools like Open Site Explorer, Majestic SEO, or Ahrefs.
Research is the key, and you could dig pretty deep. Hope some of those help, but as to what Google thinks, you're still going to have to figure that out on your own.
Good luck!
-
No, there isn't a tool that does this conclusively. Welcome to Shades of Certainty, I will be your host.
You could try the cache: operator in conjunction with the URL. If the page isn't cached, there are either really big crawl problems or it's the worst kind of spam. This isn't a silver bullet, but it's one step in the diagnosis.
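If you're auditing a long list of links, the manual checks from both answers (a plain search for the URL, a site: query, the cache: operator) can at least be batch-generated so you're not typing each one by hand. A minimal sketch; the function name and example URL are just illustrative, and running the queries and judging the results is still a manual job:

```python
# Sketch: build the manual Google check queries described above for each
# backlink URL in a link audit. This only prepares the query strings;
# it does not query Google for you.

def spam_check_queries(url):
    """Return the Google queries to run by hand for one backlink URL."""
    bare = url.split("://", 1)[-1]  # drop the scheme for the search operators
    return {
        "indexed": f"site:{bare}",   # no results -> possibly de-indexed/penalized
        "cached": f"cache:{bare}",   # no cached copy -> crawl problems or spam
        "exact": f'"{url}"',         # exact-match search for the raw URL
    }

for link in ["http://example.com/suspect-page"]:
    for label, query in spam_check_queries(link).items():
        print(f"{label}: {query}")
```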
For on-page considerations, you may want to try the Moz On-Page Grader. Google it ain't, but it's better than nothing at all if you're wondering.