Penalty issues
-
Hi there,
I'm working on site that has been badly hit by penguin. The reasons are clear, exact match blog network links and tons of spammy exact match links such as comment spam, low quality directories, the usual junk.
The spammy links were mainly to 2 pages, they were targetting keyword 1 and keyword 2.
I'd like to remove these two pages from google, as they dont even rank in google now and create one high quality page that targets both the keywords, as they are similar.
The dilemma I have is these spammy pages still get traffic from bing and yahoo and it's profitable traffic. Is there a safe way to remove the pages from google and leave them for bing and yahoo?
Peter
-
What about using this, Irving? Have you tried it before?
-
The problem with Google is that it's difficult to know whether you're dealing with a page-level penalty or an anchor-text filter triggered by the exact-match anchor abuse. You could try creating a new page for those keywords, but there's a chance Google will still stop any page on the site from ranking well for those terms because of the anchor text (this has happened to me before). Let's hope Google follows Bing's lead and comes up with a link removal tool!
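For what it's worth, a quick way to gauge whether an anchor-text filter is even plausible is to tally the anchor-text distribution from a backlink export. A minimal sketch, assuming you already have the anchors as a list of strings (the `anchor_share` helper and the sample data below are hypothetical, not from any SEO tool's API):

```python
from collections import Counter

def anchor_share(anchors):
    """Return each anchor text's share of all backlink anchors."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical sample: a profile this skewed toward one money term is
# the kind of pattern an anchor-text filter is thought to look for.
sample = ["keyword 1"] * 8 + ["brand name"] + ["click here"]
shares = anchor_share(sample)
print(shares["keyword 1"])  # 0.8
```

If one exact-match term dominates the profile like this, the filter explanation gets more likely; a natural profile is usually dominated by brand and URL anchors instead.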
Worth a try though.
-
I don't think there is any way around that; the pages need to 404, or Google will keep them indexed because of all the links pointing at them. Even if you set up robots.txt to allow Bingbot and disallow Googlebot from crawling those pages, a robots.txt block only stops crawling, not indexing — Google can still index a blocked URL from its inbound links.
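One alternative to a robots.txt block is a crawler-scoped noindex: Google documents a `googlebot:`-prefixed form of the `X-Robots-Tag` response header (and the equivalent `<meta name="googlebot" content="noindex">` tag), which scopes the directive to Googlebot only. A minimal sketch of building that header in application code (the function name and this approach are my assumptions, not something from this thread — and whether Bing's crawler ignores the prefixed form is worth verifying before relying on it):

```python
def crawler_scoped_noindex(crawler="googlebot"):
    """Build an X-Robots-Tag header scoped to one crawler's user agent.

    Per Google's documentation, the "<crawler>:" prefix makes the
    directive apply only to that crawler, so other engines' bots can
    keep indexing the page.
    """
    return {"X-Robots-Tag": f"{crawler}: noindex"}

# Attach these headers to the two penalized pages' responses:
print(crawler_scoped_noindex())  # {'X-Robots-Tag': 'googlebot: noindex'}
```

Note that for noindex to be seen at all, the pages must remain crawlable — don't combine this with a robots.txt disallow for Googlebot, or it will never fetch the header.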
-
My personal opinion is that Bing and Yahoo don't value those links at all. They may not be penalizing you for them, but they probably aren't boosting your rankings either.