Links via scraped / cloned content
-
Just been looking at some backlinks on a site - a good proportion of them come via scraped Wikipedia content or sites with directories similar to DMOZ (just under different names).
To be honest, many of these sites look pretty dodgy to me, but if they're doing illegal stuff there's absolutely no way I'll be able to get the links removed.
Should I just sit and watch the backlinks increase from these questionable sources, report the sites to Google, or do something else? Advice please.
-
Thanks Donnie and Oleg - good advice there! Couldn't believe how many dodgy sites were linking to this particular site. Around 15% of its links (out of not many - around 300 total) come from dodgy outfits all doing more or less the same kind of stuff.
-
Yup, you can't control the bad links coming in, so your best bet is to develop your brand (by building high-quality links with brand anchor text) to reduce the impact of the spammy links.
-
IMO, focus on building good links to your own site. It's completely up to you, but the time you'd invest trying to get the bad links removed is probably better spent branding your own site. If they're doing the wrong things, they'll get what's coming - everyone who f's with the big G eventually does.
Related Questions
-
Separating the syndicated content because of Google News
Dear MozPeople, I am working on rebuilding the structure of a "news" website. For various reasons, we need to keep syndicated content on the site, but at the same time we would like to apply for Google News again (we were accepted in the past but got kicked out because of duplicate content). So I am facing the challenge of separating the original content from the syndicated content, as Google requests. But I am not sure which option is better:
A) Put all syndicated content into "/syndicated/", then Disallow /syndicated/ in robots.txt and set a NOINDEX meta on every page. In this case, I am not sure what will happen if we link to these articles from other parts of the website - we will waste our link juice, right? Also, Google will not crawl these pages, so it will not see the noindex. Is this OK for Google and Google News?
B) NOINDEX meta on every page. Google will crawl these pages but will not show them in the results. We will still lose link juice from links pointing to these pages, right?
So... is there any difference? And should we put a "nofollow" attribute on all the links pointing to the syndicated pages? Is there anything else important? This is the first time I am attempting this kind of "hack", so I am not exactly sure what to do or how to proceed. Thank you!
White Hat / Black Hat SEO | | Lukas_TheCurious1 -
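One detail worth making concrete here: the two options in the question actually conflict, because a robots.txt Disallow stops Google from fetching the page at all, so it can never see a noindex meta placed on it. A minimal sketch of the two configurations (paths are the question's own; everything else is illustrative):

```text
# Option A - robots.txt block (Google never crawls /syndicated/,
# so any noindex meta on those pages is never seen):
User-agent: *
Disallow: /syndicated/

# Option B - no robots.txt block; instead, each syndicated page
# carries a robots meta tag in its <head>, so Google crawls the
# page, sees the directive, and keeps it out of the index:
<meta name="robots" content="noindex, follow">
```

If the goal is to keep the pages out of the index while still letting Google discover and honor the directive, option B is the internally consistent one; combining it with a Disallow defeats it.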
Is there a paid link hierarchy?
It seems like the more I learn about my competition's links, the less I understand about the penalties associated with paid links. Martindale-Hubbell (in my industry) basically sells links to every lawyer out there, yet none of the websites with those links are penalized. I'm sure you all have services like that in your various industries. Granted, Martindale-Hubbell is involved in the legal community and is tied to LexisNexis, but any small amount of research would tell you that paid links are part of their service. Why does this company (and the companies that use them) not get penalized? Did the Penguin update just go after companies that got links from really seedy, foreign companies with gambling/porn/medication link profiles? I keep reading on this forum and elsewhere that paid links are bad, but it looks to me like there are fundamental differences in the penalties for paid links purchased from one company versus another. Is that the case, or am I missing something? Thanks, Ruben
White Hat / Black Hat SEO | | KempRugeLawGroup0 -
Can you disavow a spammy link that is not pointing to your website?
We have submitted several really spammy websites to the Google spam team. We noticed they take a very long time to react to submissions. Do you know if it is possible to disavow a link that is not pointing to your website but rather to a very spammy website? Thanks
White Hat / Black Hat SEO | | Carla_Dawson0 -
When NOT to use the disavow link tool
I'm not here to say this is concrete and you should never do this - and please, if you disagree with me, let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is fear-stricken webmasters who run straight to the disavow tool after they have been hit by Penguin or noticed a drop shortly after.
I had a friend whose site never felt the effects of Penguin 1.0 and who thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night, desperately asking me to review his site and guess what might have happened. He told me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list, and submit it.
I asked him, "How long did you research these sites before you came to the conclusion they were the problem?"
He said, "About an hour."
Then I asked, "Did you receive a message in your Google Webmaster Tools about unnatural linking?"
He said, "No."
I said, "Then why are you disavowing anything?"
He said, "Um... I don't understand what you are saying."
Reading articles, forums, and even the Moz Q&A, I think there are some misconceptions about Google's disavow tool that are not clearly explained. Some of my findings on when to use it are purely based on logic, IMO. Let me explain.
When NOT to use the tool:
- You spent an hour reviewing your backlink profile and are too eager to wait any longer before uploading your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour reviewing your backlink profile.
- You DID NOT receive a message from GWT informing you that you have "unnatural" links (I'll explain this below).
- You did not look at each individual site linking to you and every link that exists. If you spent a very short amount of time reviewing your backlink profile, you might be using the tool WAY TOO SOON.
The last thing you want to do is disavow a link that is actually helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee."
Studying your backlink profile: we all know when we have cheated - I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites, look right at the owner, and ask, "You placed this backlink, didn't you?" I can see the guilt immediately in their eyes 🙂 Remember, not ALL backlinks you generate are bad or wrong just because you own the linking site. Ask yourself before placing each link: "Was this link necessary, and does it apply to the topic at hand?", "Is it relevant?", and most importantly, "Is this going to help other users?"
You DID NOT receive a message about unnatural linking: this is where I think most of the confusion arises (and please correct me if I am wrong). If you did not receive a message in GWT about unnatural linking, then we can safely say Google has not flagged any of your links as spammy. So if you received no message yet your rankings dropped, what could it be? It is most likely still your backlinks - but rather the "value" of previous links, which now hold less or no value at all. When that value drops, so does your rank. So what do you do? Build more quality links... and watch your rankings come back 🙂
White Hat / Black Hat SEO | | cbielich1 -
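For anyone who does conclude, after a proper review, that the tool is warranted: the disavow file itself is just a plain UTF-8 text file uploaded through Google's disavow tool, with one entry per line and "#" lines treated as comments. A minimal sketch (hypothetical domains, not real ones):

```text
# disavow.txt - hypothetical example
# Disavow a single spammy page:
http://spam-directory.example.com/links/page-42.html
# Disavow every link from an entire domain:
domain:scraper-site.example.net
```

The `domain:` form is usually the safer choice for scraper networks like those in the original question, since such sites tend to generate links from many URLs on the same host.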
Multiple domains, different content, same keywords
What would you advise in my case?
- Is it bad for Google if I have the four domains?
- I don't link between them, as I don't want any association or a loss in rankings on the branded page. Is it bad if I link between them, or from the non-branded domains to the branded domain?
- Is it bad if I have all of them in my Webmaster Tools? I currently have only the branded one.
- My Google page is all about the new, non-penalized domain, although Google gave a unique domain +propdental to the one that was manually penalized (which doesn't make sense).
So: what are the things I should not do with my domains in order to follow and respect Google's guidelines? I want to stay white hat and not do something wrong without knowing it.
White Hat / Black Hat SEO | | maestrosonrisas0 -
Hidden links in badges using JavaScript?
I have been looking at a strategy used by a division of TripAdvisor called FlipKey. They specialize in vacation home rentals and have been zooming up the rankings over the past few months. One of the main off-page tactics they use is providing a badge for property managers to display on their sites, which links back. The issue I have is that it seems to me they are hiding a link with keyword-specific anchor text by using JavaScript. The site I'm looking at offers vacation rentals in Tamarindo (Costa Rica): http://www.mariasabatorentals.com/ Scroll down and you'll see a Reviews badge which shows reviews and a link back to the manager's profile on FlipKey. However, when you look at the source code for the badge, what I see is: Find Tamarindo Vacation Rentals on FlipKey. Notice that there is a link for "tamarindo vacation rentals" in the code which only appears when JS is turned off in the browser. I am relatively new to SEO, so to me this looks like a black hat tactic. But because this is TripAdvisor, I have to think that I am wrong. Is this tactic allowed by Google, since the anchor text is highly relevant to the content? And can they justify it on the basis that they are serving users with JS turned off? I would love to hear from folks in the Moz community on this. Certainly I don't want to implement a similar strategy only to find out later that Google views it as cloaking. It sure seems to be driving results for FlipKey! Thanks all. For the record, the Moz community is awesome. (Can't wait to start contributing once I actually know what I'm doing!)
White Hat / Black Hat SEO | | mario330 -
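The embed pattern the question describes is common for review badges. A minimal sketch of what such a widget typically looks like - hypothetical URLs and anchor text, not FlipKey's actual code:

```html
<!-- Badge embed: the script draws the visible review widget
     when JavaScript is enabled. -->
<script src="https://widgets.example.com/review-badge.js" async></script>
<!-- noscript fallback: only rendered for visitors with JS off,
     but always present in the HTML that crawlers fetch - this is
     where the keyword-anchored link in the question lives. -->
<noscript>
  <a href="https://www.example.com/tamarindo-vacation-rentals">
    Find Tamarindo Vacation Rentals on Example Site
  </a>
</noscript>
```

Because the anchor sits in the page source regardless of how the widget renders, crawlers see the link even though most human visitors never do - which is exactly the tension between "serving no-JS users" and "hiding a link" that the question raises.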
Same template, same products, but different content?
For the sake of this post, say I am selling lighters. I have 3 domains: small-lighters.com, medium-lighter.com, and large-lighters.com. On all of the websites I have the same template, same images, etc., and the same products. The only difference is the way the content is worded and described - different bullet points, etc. My domains are all strong keyword domains, not spammy, and bring in type-in traffic. Is it OK to continue in this manner, in your opinion?
White Hat / Black Hat SEO | | dynamic080 -
What does YouTube consider duplicate content, and will it affect my ranking/traffic?
What does YouTube consider duplicated content? If I have a PowerPoint-style video that is already on YouTube and I want to change the beginning and ending calls to action, would that be considered duplicate content? If yes, how would this affect my ranking/YouTube page? Will it make a difference if I have it embedded on my blog?
White Hat / Black Hat SEO | | christinarule0