Organic listings disappeared and I don't know why!
-
Brief history:
I am MD of a medium-sized health organisation in the UK. We have one of the leading websites in the world for our industry. We were hit by a Google algorithm update last year (Penguin or Panda, I can't remember, but I don't think that's relevant here) and our daily visits dropped from around 10,000 to around 5,000 in two separate hits over a couple of months. There was then a steady decline to about 3,000-4,000 visits a day until we completely redesigned the site and did some good work on the content. We have always been white hat, and the site has around 3,000 pages with unique content added daily.
So things have really been on the up for the past couple of months. We had been receiving around 6,000 visits a day in recent weeks (a slow climb over the past few months), until Sunday. At around 10am on Sunday morning, pretty much all of our organic listings disappeared, including for our brand name. On Monday morning a few came back, including our brand name and our main, most competitive keyword, which returned to the third page, where it had been before. Then on Tuesday morning a few more of our most competitive keywords showed up, back where they were before. This includes images, which had disappeared from Google Images.
Our PPC and business listings were not really affected at all.
My developer submitted a sitemap through Webmaster Tools on Monday morning, and I'm not sure if this is the reason pages started to show up again. In Webmaster Tools, the indexed pages are about a quarter of all the pages on the site - all pages were indexed before. I just don't know what has happened! It doesn't make any sense because: 1. Google doesn't seem to have rolled out any algorithm updates on that day; 2. we have no messages in Webmaster Tools; 3. a number of our main keywords have reappeared - why would that happen if we had been hit by a Google update?!
Our organic hits, which previously made up about 80% of all our hits, have gone down by 80% and this is drastically affecting business. If this continues it is likely we will have to downsize the business and I'm not sure what to do.
When I saw that the 'indexed pages' count in Webmaster Tools start to increase (around 600 on Monday, around 900 yesterday and around 1,300 this morning), I thought we were on our way back up and that maybe this problem would just resolve itself and our listings would reappear. But the count has since dropped slightly, back down to around 1,100, so the increase has stalled.
Can anybody help?! Do you have any idea what could be causing this? Apparently no changes have been made to robots.txt, and my developer says no changes were made that could have affected our listings.
ANY ADVICE WOULD BE GREATLY APPRECIATED.
-
Interesting situation...and very frustrating for you, I'm sure.
You mentioned this below:
"I checked 'cached snapshot of page' in Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. "
This sounds like some sort of technical error. But some things still don't add up for me. It sounds as though your pages were not resolving for Google. The odd thing, though, is that when Google sees a 404 error, it keeps retrying for days, weeks or even months before concluding that the pages should be removed from the index.
I don't have an answer for you, but the first place I'd look is your robots.txt file, to make sure it is not blocking Googlebot in some way. I'd also check server logs and perhaps check with your host to see whether the site had any significant downtime.
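If you want to test the robots.txt angle quickly, Python's standard library can tell you whether a given rule set blocks Googlebot. A minimal sketch with hypothetical rules (swap in your site's real robots.txt contents):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: a single stray "Disallow: /"
# like this is enough to gradually drop a whole site from the index.
robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is blocked from every URL; other crawlers are unaffected
print(parser.can_fetch("Googlebot", "https://www.example.com/any-page"))  # False
print(parser.can_fetch("Bingbot", "https://www.example.com/any-page"))    # True
```

Running this against the live file (via `parser.set_url(...)` and `parser.read()`) takes seconds and rules the whole question in or out.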
If there was a technical glitch, and the problem is now fixed, then your pages should come back into the index without you doing anything.
I'm pretty certain this isn't a penalty issue though.
-
Thank you. I will look into this, although I don't think the pages are set to nofollow, because there has been a further development. I checked 'cached snapshot of page' in Google Toolbar for the pages that weren't being indexed, and it showed up as a 404 error. These are pages that had always been cached before this problem occurred. I then went to 'submit URL to Google' and submitted a couple of URLs. They instantly showed up in Google's listings in the same spots as before, and the cached snapshots then displayed correctly. I could do that for every page, but: 1. that would be a HUGE job; 2. would that look spammy or suspicious to Google?; 3. is there a way of doing it for multiple pages at a time?! I feel like this problem is very close to being solved, but I just don't quite know how to solve it.
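On point 3: rather than submitting URLs one by one, the usual route is to regenerate the XML sitemap and resubmit it in Webmaster Tools so Google recrawls everything in one pass. A rough sketch of building one from a plain URL list (the URLs here are placeholders):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemaps.org-format XML sitemap from a URL list."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list; in practice this would come from your CMS
xml = build_sitemap(["https://www.example.com/",
                     "https://www.example.com/about"])
print(xml)
```

Resubmitting a sitemap is a normal webmaster activity and won't look spammy; it simply invites a recrawl.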
-
The only time I've seen this type of thing happen - all of a site's pages dropping out of the index while PPC still works - is when something on the site has been set to noindex/nofollow.
If you had a manual penalty from Google, it would show up in Google Webmaster Tools. Plus, the site would still be indexed, just ranked really, really low. If everything was missing from Google's cache, then the most likely explanation is that the site was accidentally set to noindex/nofollow.
This is a very easy thing to mess up, and it's possible that someone might have hit the wrong button by accident, or updated the robots.txt file.
In the past, a project manager of mine messed this up for a client while doing a content update on the site, and it was about a week before anyone noticed. She's no longer here (not solely because of that issue). But this is so critical for me and my company that we've put both an automated and a human check in place each day:
For our company, we have an automated script that runs through all of our sites (and clients' sites) each day to make sure each site is set to index/follow, both on the pages and in the robots.txt file. We also check the title tag and make sure the name servers haven't changed.
I also pay someone on my team to run through a 12-step checklist every day to make sure that things like the site search work, contact forms go through properly, and pages are set to index/follow.
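A minimal sketch of the kind of daily indexability check described above (the function and sample pages are hypothetical; a real script would fetch each URL and the live robots.txt over HTTP first):

```python
def audit_page(html: str, headers: dict, robots_txt: str) -> list:
    """Flag the common accidents that silently deindex a site."""
    problems = []
    # noindex can arrive via a response header as well as a meta tag
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        problems.append("X-Robots-Tag response header contains noindex")
    lowered = html.lower()
    if 'name="robots"' in lowered and "noindex" in lowered:
        problems.append("robots meta tag appears to contain noindex")
    if "<title>" not in lowered:
        problems.append("missing title tag")
    # an exact "Disallow: /" line blocks the whole site
    if any(line.strip().lower() == "disallow: /" for line in robots_txt.splitlines()):
        problems.append("robots.txt disallows the whole site")
    return problems

ok_page = '<html><head><title>Home</title></head><body></body></html>'
bad_page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'

print(audit_page(ok_page, {}, "User-agent: *\nAllow: /"))   # []
print(audit_page(bad_page, {}, "User-agent: *\nDisallow: /"))  # three problems flagged
```

Run daily against every template type on the site, a check like this catches the "someone hit the wrong button" scenario within hours instead of weeks.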
I hope this helps...
Thanks,
-- Jeff
Related Questions
-
Moving to https with a bunch of redirects my programmer can't handle
Hi Mozzers, I referred a client of mine (last time) to a programmer who could transition their site from HTTP to HTTPS. They use a WordPress website and currently use EPS Redirects as a plugin that 301-redirects about 400 pages. Currently, the way EPS Redirects is set up (as shown in the attachment) is simple: on the left side you enter your old URL, and on the right side is the newly 301'd URL. But here's the issue: since my client made the transition to HTTPS, the whole WordPress backend is set up that way as well. This means that if my client finds another old HTTP URL he wants to redirect, the plugin only allows him to redirect HTTPS to HTTPS. As of now, all old HTTP-to-HTTPS redirects STILL work, even though the left side of the plugin switched all URLs to a default HTTPS. But my client is worried that with the next plugin update he will lose all HTTP-to-HTTPS redirects. When we asked our programmer to add all 400 redirects to .htaccess, he said that's too many redirects and could slow down the website. Well, we don't want to lose all 400 301s and jeopardize our SEO. Question: what does everyone suggest as an alternative solution/plugin to redirect old HTTP URLs to HTTPS and future HTTPS-to-HTTPS URLs? Thank you all!
Intermediate & Advanced SEO | Shawn1240
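One way to sidestep the plugin entirely is to keep the old-to-new mapping in a single place and generate the .htaccess lines from it. A sketch under the assumption that the mapping lives in a simple dict (the paths and domain below are made up); a few hundred static `Redirect 301` lines is normally not a meaningful performance hit for Apache:

```python
# Hypothetical old-path -> new-URL mapping; in practice this could be
# exported from the EPS Redirects plugin's list.
redirects = {
    "/old-page": "https://www.example.com/new-page",
    "/old-contact": "https://www.example.com/contact",
}

# One mod_alias "Redirect 301" directive per mapping entry
lines = [f"Redirect 301 {old} {new}" for old, new in sorted(redirects.items())]
htaccess_block = "\n".join(lines)
print(htaccess_block)
```

Pasting the generated block into .htaccess keeps the redirects independent of any plugin update.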
How can a recruitment company get 'credit' from Google when syndicating job posts?
I'm working on an SEO strategy for a recruitment agency. Like many recruitment agencies, they write tons of great unique content each month, and as agencies do, they post the job descriptions to job websites as well as their own. These job websites generally won't allow any linking back to the agency website from the post. What can we do to make Google realise that the originator of the post is the recruitment agency and that they deserve the 'credit' for the content? The recruitment agency has a low domain authority, so we're very much at the start of the process. It would be a damn shame if they produced so much great unique content but couldn't get Google to recognise it. Google's advice says: "Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you'd prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content." - But none of that can happen. Those big job websites just won't do it. A previous post here didn't get a sufficient answer. I'm starting to think there isn't an answer, other than having more authority than the websites we're syndicating to. Which isn't going to happen any time soon! Any thoughts?
Intermediate & Advanced SEO | Mark_Reynolds0
Why isn't Google indexing my images?
Hello, on my fairly new website Worthminer.com I am noticing that Google is not indexing images from my sitemap. 560 images have already been submitted, and Google has indexed only 3 of them. Although more images are indexed overall, Google is not indexing any new images, and I have no idea why. Posts, categories and other URLs are indexing just fine, but images are not. I am using WordPress, and for sitemaps, WordPress SEO by Yoast. Am I missing something here? Why won't Google index my images? Thanks, I appreciate any help, David
Intermediate & Advanced SEO | Worthminer1
Google isn't seeing the content but it is still indexing the webpage
When I fetch my website page using GWT, this is what I receive:
HTTP/1.1 301 Moved Permanently
X-Pantheon-Styx-Hostname: styx1560bba9.chios.panth.io
server: nginx
content-type: text/html
location: https://www.inscopix.com/
x-pantheon-endpoint: 4ac0249e-9a7a-4fd6-81fc-a7170812c4d6
Cache-Control: public, max-age=86400
Content-Length: 0
Accept-Ranges: bytes
Date: Fri, 14 Mar 2014 16:29:38 GMT
X-Varnish: 2640682369 2640432361
Age: 326
Via: 1.1 varnish
Connection: keep-alive
What I used to get is this:
HTTP/1.1 200 OK
Date: Thu, 11 Apr 2013 16:00:24 GMT
Server: Apache/2.2.23 (Amazon)
X-Powered-By: PHP/5.3.18
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Last-Modified: Thu, 11 Apr 2013 16:00:24 +0000
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0
ETag: "1365696024"
Content-Language: en
Link: ; rel="canonical",; rel="shortlink"
X-Generator: Drupal 7 (http://drupal.org)
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8
xmlns:content="http://purl.org/rss/1.0/modules/content/"
xmlns:dc="http://purl.org/dc/terms/"
xmlns:foaf="http://xmlns.com/foaf/0.1/"
xmlns:og="http://ogp.me/ns#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns:sioc="http://rdfs.org/sioc/ns#"
xmlns:sioct="http://rdfs.org/sioc/types#"
xmlns:skos="http://www.w3.org/2004/02/skos/core#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#"> <title>Inscopix | In vivo rodent brain imaging</title>
Intermediate & Advanced SEO | jacobfy
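As an aside, a raw response head like the one pasted above is easy to inspect programmatically; a small sketch (sample lines taken from the question) that splits it into the status line and a header dict:

```python
def parse_response_head(raw: str):
    """Split a raw HTTP response head, as pasted from a fetch tool,
    into its status line and a dict of header name -> value."""
    lines = [line for line in raw.strip().splitlines() if line.strip()]
    status_line = lines[0]
    headers = {}
    for line in lines[1:]:
        if ":" in line:
            name, _, value = line.partition(":")  # split on first colon only
            headers[name.strip()] = value.strip()
    return status_line, headers

status, headers = parse_response_head("""HTTP/1.1 301 Moved Permanently
content-type: text/html
location: https://www.inscopix.com/
Content-Length: 0
""")
print(status)               # HTTP/1.1 301 Moved Permanently
print(headers["location"])  # https://www.inscopix.com/
```

Here the `Content-Length: 0` alongside the 301 is the tell: Googlebot is being handed an empty redirect, not the page content.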
Organic Rankings for the US & Australia
I have a site that is ranking well for competitive keywords in the US, but I would like to have it rank in Australia as well. Although there's no direct correlation, I'm running large AdWords campaigns in both countries. I've read that you should write localized content for each region, but I'm not sure this is as effective as it used to be. I've also read about using location markup and microformats. Any feedback would be greatly appreciated. Thank you in advance.
Intermediate & Advanced SEO | NickMacario0
Privacy Policy & T&Cs SEO-related question
With AdWords, a Privacy Policy and T&Cs are sometimes requested for an ad to be approved. Silly question I know, but do you think Google looks out for pages like this to identify websites which are more genuine for organic? Thanks
Intermediate & Advanced SEO | activitysuper0
How to find all of a website's SERPs?
I was wondering what the easiest way is to find all of a website's existing SERPs?
Intermediate & Advanced SEO | McTaggart0
Triple listing in rankings
We have a triple listing for the keyword vca cursus (a Dutch keyword). But the page we optimized for that keyword ranks lower than our vca examen page. At first we had the number one spot, but now we rank 2 to 5. In Open Site Explorer I compared the competition, and we scored better on almost all factors. Do you know why the vca examen page ranks higher than the vca cursus page? Is it possible to change that, or to reverse the triple listing somehow? And why does our competitor rank higher?
Intermediate & Advanced SEO | PlusPort0