Dealing with manual penalty...
-
I'm in the middle of a back-and-forth with Google's Quality Search team. We discovered a manual penalty on our website and have been trying to get it removed. The problem is the sheer number of spammy incoming links.
We did not ask for or purchase any of these links; spammy websites simply happen to link to our site. Regardless, I've done my best over the past week or so to remove quite a few of them, responding to the Quality Search team with a spreadsheet of the links in question and the action taken on each one.
No luck so far.
I've heard that if I send an email to a website asking for a link removal, I should share that with Google as well. I may try that.
Some of the links are posted on websites with no contact info, and a WHOIS search brings up a hidden registrant. Removing these links is far from easy.
My question is: what techniques have proven effective for working through the removal of a manual penalty? I know Google isn't going to tell me all of the offending links (they've offered a few examples; we've had those removed and are still penalized), so what's the best way to find the rest myself? Also, when I have a link removed, it may stay in Webmaster Tools as an active link for a while even though it no longer exists. Does the Quality Search team check via Webmaster Tools, or do they use something else?
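One practical step before replying to the Quality Search team again is to re-verify the spreadsheet yourself: fetch each referring page and confirm the link really is gone, rather than trusting Webmaster Tools' lagging data. A minimal sketch in Python; the function names here are illustrative, not part of any Google tool:

```python
# Sketch: check whether a fetched page still contains a link to your domain.
# Useful for confirming "removed" entries in a link-removal spreadsheet
# before reporting them to Google. Names are hypothetical.
from html.parser import HTMLParser


class LinkFinder(HTMLParser):
    """Scans anchor tags for hrefs mentioning a target domain."""

    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href") or ""
            if self.target_domain in href:
                self.found = True


def page_still_links_to(html, target_domain):
    """Return True if the page HTML still links to target_domain."""
    finder = LinkFinder(target_domain)
    finder.feed(html)
    return finder.found
```

You would fetch each referring URL (e.g. with `urllib.request.urlopen`) and record the result in the "action taken" column; a page that 404s, or that no longer contains the link, can be marked as removed.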
It's an open-ended question, really. Any help dealing with a manual penalty and what you have done to get that penalty removed is of great help to me. Thanks!
-
Ryan Kent has some experience with this, and shared it in this Q&A at http://www.seomoz.org/q/does-anyone-have-any-suggestions-on-removing-spammy-links
Related Questions
-
My PR1 website with no manual action is not appearing in the Google top 200 results; it used to be #1
Hi All, My website has 5 years of history. It used to be in the top 3 for its keyword, with PR3, back when I was working with an SEO firm I hired. After 2013 its rankings started dropping until it stopped appearing in Google's search results at all. The site is not banned, though: there is still some long-tail Google traffic, and there is no manual action in Webmaster Tools. Even today it still has PR1, yet it does not appear in Google's top 200 results. My on-site optimization for the main keyword is good; the site just isn't appearing in Google at all. I manually reviewed all of the top 200 results, and many spammy blogs are listed there, but not my legitimate homepage with unique content. I do regret hiring that SEO firm now, but since Google reports no manual action, I don't know whether I should disavow all the old backlinks. I have started doing some quality SEO work myself, and my ranking in Yahoo/Bing has moved from 30-40 up to around 11-15. Do I see some light at the end of the tunnel? Does that mean my site may appear in Google again? Thank you all for the replies.
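On the disavow question raised above: Google's disavow tool accepts a plain-text file listing whole domains and/or individual URLs. A sketch of the file format, with placeholder domains standing in for the actual spammy backlinks:

```text
# links-to-disavow.txt - lines beginning with "#" are comments.
# Disavow everything from an entire domain:
domain:spammy-directory.example
domain:link-farm.example
# Or disavow a single page:
http://blog.example/spam-post.html
```

Disavowing is a strong hint to Google rather than a guarantee, and is generally reserved for links you cannot get removed manually.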
Technical SEO | ChelseaP
-
How to deal with duplicate content on product pages?
Hi, I have a webshop whose products come in different sizes and colours. Each variation has its own URL with almost the same content (title tag, product description, etc.). To prevent duplicate content, I'm wondering about the best way to solve this, keeping in mind that it's impossible to create one page/URL per product with filters on colour and size, and impossible to rewrite the product descriptions to make them unique. I'm considering canonicalizing the rest of the colour/size variations, but the disadvantage is that if the product is not in stock, it disappears from the website. Looking forward to your opinions and solutions. Jeroen
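The canonicalization being considered could be sketched like this; the shop domain and product path below are placeholders:

```html
<!-- Placed in the <head> of every colour/size variation page,
     pointing at the main product URL (placeholder path): -->
<link rel="canonical" href="http://www.example-shop.com/products/main-product" />
```

Google then consolidates the variations' ranking signals onto the canonical URL, while the variation pages themselves remain usable for visitors.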
Technical SEO | Digital-DMG
-
Google Penalty Investigation
Hi All, I've recently had a Google penalty and have spent a long while tying up loose ends with my site (I'm building a new one, so you'll still see some problems on the current one). My search referrals do seem to be slowly recovering, but only for variations of terms similar to 'bike repair'. Now, my site does primarily offer bike repair advice, so that's a good thing, but I'm not yet ranking for any of my specific keywords. For example, I used to rank quite high for the term 'schrader valve'. Is this a signal of anything in particular, or can I not read anything specific into it? Thanks!
Technical SEO | madegood
-
How best to deal with www.home.com and www.home.com/index.html
Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via .htaccess) don't seem to work. I'm working on a site where both www.home.com and www.home.com/index.html resolve to the same page/content. If I simply drop a rel=canonical into the page, will this solve my dupe-content woes? The canonical tag would then appear on both www.home.com and www.home.com/index.html. If the above is OK, which version should I be going with? Thanks in advance folks,
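The two tag variants the question is weighing (the markup itself appears to have been stripped from the post) would look like this; most sites standardise on the root URL, since inbound links tend to point there:

```html
<!-- Option 1: canonicalize to the root (the usual choice) -->
<link rel="canonical" href="http://www.home.com/" />
<!-- Option 2: canonicalize to the index.html version -->
<link rel="canonical" href="http://www.home.com/index.html" />
```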
Technical SEO | Creatomatic
James @ Creatomatic
-
Different URLs for our multi-language pages caused a penalty?
Hi all, We have a website, www.phoneboxlanguage.com, with 4 different language versions (Spanish, French, Italian, German). Each version lives on a totally different URL; e.g. the French URL is www.cours-telephone-anglais.com. Recently we saw a huge drop in the SERPs for all the 'foreign' language pages. This had happened before for the Spanish and French versions, which we put down to keyword-density issues, so we created new URLs for those pages. Now, however, all 4 foreign sites have dropped. Could this instead be a penalty for duplicate sites? The content is obviously different because of the different languages, but the coding and templates for the sites are the same. How can we find out whether this is the case, and what should we do? After some research on the forum, I was thinking of creating subfolders on the original site (phoneboxlanguage.com) and then creating 301 redirects from the old, dropped sites; but if they have been penalized, would their penalties carry over to our original site? We are obviously very keen not to damage things further, and the original site remains OK. Many thanks for your kind help. Quime.
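Separately from any penalty question, cross-annotating the language versions with hreflang tells Google they are alternates of one site rather than duplicated templates. A sketch using the two domains named in the question; the language codes and the entries for the other versions are assumptions:

```html
<!-- In the <head> of every language version, list all the alternates: -->
<link rel="alternate" hreflang="en" href="http://www.phoneboxlanguage.com/" />
<link rel="alternate" hreflang="fr" href="http://www.cours-telephone-anglais.com/" />
<!-- ...plus equivalent entries for the Spanish, Italian and German domains -->
```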
Technical SEO | Quime
-
Does a 301 redirect cause a penalty?
Good Morning, I am considering doing a 301 (permanent) redirect of roughly 100 domains, split among my 3 main e-commerce sites. Would taking an action like this put any of the 100 domains, or any of the 3 recipient domains, at risk of violating Google's guidelines? Thanks...
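On Apache, a domain-level 301 of this kind is typically a couple of lines of .htaccess; olddomain.com and shop.com below are placeholders for one of the 100 domains and its recipient site:

```apache
# Redirect every URL on olddomain.com to the same path on shop.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.shop.com/$1 [R=301,L]
```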
Technical SEO | Prime85
-
Large-Scale Ecommerce: How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages, but fewer than 10,000 individual products and maybe a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-on additions, with duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread across duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters that need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these. Is it that simple? (Do I need to ensure they are deindexed and 301'd?) Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
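For the dynamic parameters, GWT's URL-parameter setting is one lever; a robots.txt rule is another way to stop the crawl of parameterised duplicates (note it blocks crawling but does not by itself deindex URLs already in the index, so it is often paired with canonicals or 301s). A sketch with placeholder parameter names:

```text
# robots.txt - block crawling of session/sort parameter variants
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
Disallow: /*?sort=
```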
Technical SEO | LukeyJamo
-
Duplication Penalty through Specs?
I am trying to figure out how to correct a recently incurred duplication penalty on a partner site; I didn't see any posts specific to my problem yet. The site used to rank on page 1 of Google for all important keywords, but now many pages have been bumped to position 100 or lower due to duplication issues. This is an aviation site discussing airplanes. Each page covers a different model, and while the data values differ from plane to plane, the specification terms are the same on every page; see here:
Primary Function:
Crew:
Engine:
Thrust:
Weight Empty:
Max. Weight:
Length:
Wingspan:
Cruise Speed:
Max. Speed:
Climb:
Ceiling:
Range:
First Flight:
Year Deployed:
Is there an easy way to get Google to stop including these terms (not the data in the second column) in its page analysis, to prevent the duplication issues we are seeing? Thanks in advance!
Technical SEO | WizardHQ