Identifying why my site has a penalty
-
Hi,
My site has been hit with a Google penalty of some sort, but it doesn't coincide with a Penguin or Panda update. I have attached a graph of my visits that demonstrates this.
I have been working on my SEO since the latter part of last year and have been seeing good results, then all of a sudden my search referrals dropped by 70%.
Can anyone advise on what it could be?
Thanks!
Will
-
Great. Just audit it, fix problems, audit again, write more great content, and give it time. Even if you fix the problem (assuming it was an onsite problem), it may take some time for Google to show the love again.
-
Oh okay! That makes sense. Found a few issues with my PHP rules that automatically write links on a few of my content pages.
I've learned some valuable tips here, such a fantastic help. I'm going to get the new site up in a week or two and we'll see if things change.
I'll keep you updated!
-
OK, so: if a bit of content resides at /bikes/mountain-bikes/ and the menu link I use is /bikes/mountain-bikes/, I'll get a status code 200. There is no added delay and no PageRank lost; 200 == OK. The menu link points directly to the destination content.
Now let's say you've decided to change the location of that content to /bikes/mountain-bikes/index.html.
You set up a 301 redirect from the old URL to the new one, THEN you need to update your links to reflect the new location so you're not just pointing at 301 redirects.
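To put that in concrete terms, here's a rough Python sketch (purely illustrative, all names made up) of how you'd find which of your links still point at redirects, given your 301 map:

```python
# Rough sketch: given a site's 301 redirect map and the links the
# templates/menus actually emit, list the links that should be updated
# to point straight at the final (status 200) destination.
def links_to_update(redirect_map, internal_links):
    updates = {}
    for link in internal_links:
        target = link
        seen = set()
        # Follow the redirect chain to its end (guard against loops).
        while target in redirect_map and target not in seen:
            seen.add(target)
            target = redirect_map[target]
        if target != link:
            updates[link] = target
    return updates

redirects = {"/bikes/mountain-bikes/": "/bikes/mountain-bikes/index.html"}
menu_links = ["/bikes/mountain-bikes/", "/bikes/road-bikes/"]
print(links_to_update(redirects, menu_links))
# {'/bikes/mountain-bikes/': '/bikes/mountain-bikes/index.html'}
```

Note it follows whole chains, so a link pointing at a 301 that points at another 301 gets rewritten to the final destination in one go.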
-
Thanks for the table of links. I'll see to it.
I'll work on the code on the new version of the site, seems pointless to do it now.
I've installed the plugin. How do I change the status code of a page? I don't really understand how it can be anything but 200; if I'm viewing it, it's obviously there! I thought 301s pushed the user to the 200 version of the page and only existed temporarily in the browser. Obviously I'm wrong; perhaps you could explain it for me?
Cheers for the Screaming Frog tool, looks great.
-
Did you change them? The scan I just did doesn't show them... Maybe your host was getting funky or something, lol.
Get this and click the links on your site. You want to link to status code 200, not 301:
https://chrome.google.com/webstore/detail/server-status-code-inspec/bmngiaijlojlejaiijgedgejgcdnjnpk
I wouldn't de-index them; I haven't found a legitimate reason to de-index anything since 2005, but I'm a programmer and normally don't need to patch things. You could probably quickly fix them just by adding some content/images.
I'm going to private-message you another spreadsheet. This should show you the source and destination of your 404s and 301s.
BTW, the spider I'm using is Screaming Frog; it's the best I've found.
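If you're curious what a spider like that does under the hood, the core of it is just pulling the hrefs out of each page, then requesting each URL and recording its status code. A minimal Python sketch of the link-extraction half (illustrative only, nothing to do with Screaming Frog's actual code):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags; a crawler would then request
    each collected URL and record its status code (200, 301, 404, ...)."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<ul><li><a href="/bikes/">Bikes</a></li>'
               '<li><a href="/bikes/mountain-bikes/">MTB</a></li></ul>')
print(collector.links)
# ['/bikes/', '/bikes/mountain-bikes/']
```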
-
Just checked the 418s and they do seem to be either already redirected with 301s or actually in place. What would be the protocol here?
-
Got your message, thank you. What tool did you complete the crawl with? I'm sort of disappointed this stuff didn't come up in my SEOmoz weekly scans.
A few questions:
- How do I know where the 301s are being sent from? So in this chain of events...
Link on a page on my site > routed via a 301 > landing on the desired page
... how do I find the first step in the process? The table you sent me seems to show only the middle step.
- Yes, the 'about us' and 'contact us' pages are weak. I'm building a new version of the site as we speak and will take care of it then. In the meantime, if I noindex them, is that as good as getting rid of them?
I will now sort the 404s and 418s. Without wanting to sound like a broken record: thanks again! Do let me know if there is anything I can do in return once we've got to the bottom of this.
Will
-
Private-messaged you a Google Doc of the crawl. Looks like pages that no longer exist; they need 301s.
-
Wow, thanks for all this. It's late now in the UK so I'll check it out tomorrow.
Cheers
P.S. Where are my 418s coming from!?!
-
My crawl finished. You also have a bunch of 418 "I'm a teapot" status codes. IDK what this is, so I looked it up.
Per wikipedia:
418 I'm a teapot (RFC 2324): This code was defined in 1998 as one of the traditional IETF April Fools' jokes, in RFC 2324, Hyper Text Coffee Pot Control Protocol, and is not expected to be implemented by actual HTTP servers.
-
You'd think so, but 1) we can't fully trust everything Google says and 2) it could have been something that the algorithm progressively finds and penalizes.
It's possible that this is not related to links or content.
Take care of your RCS and make it awesome (real company shtuff):
- About us (under-construction content, not good)
- Contact us (weak and thin; include social)
- FAQ
- Terms and Conditions (404 error on your site!). I once broke all my footer links on a blog that was getting 5k/day and it slammed me down to 600/day nearly instantaneously. I've seen other sites with 404 errors survive, and even Cutts has downplayed the issue of 404 errors, but I believe any 404 can be indicative of a bad user experience. Scan your site for 404s and fix them all.
Also, many of your internal links appear to be pointing at 301 redirects. Update your links to point directly at the status 200 destination (not through a 301).
In just a quick overview, the above are my notes. This isn't a detailed audit, but you should scan your site for 404 errors and fix them, get your RCS stuff in order, and conduct a full site review looking for anything that may be frowned upon by Google.
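To illustrate the 404 side: once a crawl has given you the status of every URL and which pages link where, producing a fix-list is trivial. A hypothetical Python sketch (made-up names, not any real tool's code):

```python
def broken_link_report(statuses, link_graph):
    """statuses: {url: http_status}; link_graph: {source_page: [linked_urls]}.
    Returns {dead_url: [pages linking to it]} for every 404 -- exactly the
    source+destination view you want when fixing them."""
    report = {}
    for source, targets in link_graph.items():
        for target in targets:
            if statuses.get(target) == 404:
                report.setdefault(target, []).append(source)
    return report

statuses = {"/terms/": 404, "/bikes/": 200}
links = {"/": ["/bikes/", "/terms/"], "/bikes/": ["/terms/"]}
print(broken_link_report(statuses, links))
# {'/terms/': ['/', '/bikes/']}
```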
-
Thanks devknob,
In answer to your questions;
-
It is across all organic traffic and all keywords to my entire site.
-
The content on my site is fairly squeaky clean. I've been using the SEOmoz Pro tool to keep it in check. I use Yoast SEO for WordPress to handle my canonicals and employ no dodgy JS hiding techniques. I did not remove content.
-
I haven't been buying links. I do have 20,000+ sitewide links coming from bikingbis.com and 12,000 sitewide links coming from citycyclingedinburgh.info/bbpress/. The ones from bikingbis have been removed, and I've requested removal of the other. Anchor text is varied and is mainly branded keywords.
My question is, though: if it's a bad backlink problem, wouldn't it coincide with a Panda or Penguin update?
Thanks again
Will
-
Check your analytics
- Is it a specific group of keywords?
- Is it organic traffic at all?
- Is it traffic to specific page or pages?
Check your website.
- Are your link canonicals set up CORRECTLY?
- Do you have content that is hidden via CSS/JavaScript with no mechanism for unhiding?
- Have you changed a lot of links recently and not performed 301 redirects?
- Do you have good content, title tags and meta descriptions?
- Did you remove content?
Check your links
- Have you been buying links? Check your backlink profile using Open Site Explorer. Is there any unusual activity here?
- Is your anchor text varied?
Have you gotten a notice in Google Webmaster Tools?
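On the canonicals point: the usual correct setup is a self-referencing canonical on every indexable page, with intentional duplicates pointing at their master copy. A hypothetical Python sketch of the kind of check a tool runs (illustrative only):

```python
def canonical_issues(pages):
    """pages: {served_url: canonical_url_or_None}. Flags missing canonicals
    and canonicals pointing elsewhere. The latter aren't always wrong
    (deliberate duplicates should point at the master copy), but every
    flagged entry deserves a manual look."""
    issues = {}
    for url, canonical in pages.items():
        if not canonical:
            issues[url] = "missing canonical tag"
        elif canonical.rstrip("/") != url.rstrip("/"):
            issues[url] = "canonical points elsewhere: " + canonical
    return issues

print(canonical_issues({
    "https://example.com/bikes/": "https://example.com/bikes/",       # fine
    "https://example.com/bikes/?sort=price": "https://example.com/bikes/",  # flagged, likely intentional
    "https://example.com/contact/": None,                              # flagged, missing
}))
```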