Why is Google Webmaster Tools reporting a massive increase in 404s?
-
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. The transition broke some of the old URLs, but that didn't seem to be too much of a concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much of the duplicate data as possible, used canonical tags where I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there.
For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with any updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 per day (making it impossible for me to keep up). The sitemap.xml contains new URLs only, but it seems that Google is still using the old sitemap from before the launch. The reported sources of the 404s (in WMT) no longer exist; they all come from the old site.
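To illustrate the "new URLs only" point, here is a minimal sketch of that kind of sitemap regeneration, using only the Python standard library. This is not the poster's actual generator, and the URLs are placeholders; the point is that only the current site's pages are emitted, so stale URLs Google still reports cannot be coming from this file.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap.xml document containing only the current URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for url in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS)
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example: only new-site URLs go in; old URLs are deliberately absent.
xml = build_sitemap(["https://example.com/", "https://example.com/products/"])
```

Regenerating and resubmitting this file tells Google what exists now, but (as the answers below note) it does not immediately purge what Google already knows about.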
I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem?
-
Thank you for both responses...
Nakul--
I have been following everything exactly as you have described. In general the goal during the development was to keep changes to an absolute minimum. This has not always been possible.
The majority of external links have been 301 redirected, or, in cases where the new server responds to two different URLs with the same content, a canonical tag has been added.
I have noticed that 99% of the reported URLs are former internal links. The reported 404s are completely out of proportion (194k vs. fewer than 5k pages in the new XML sitemap).
I am really worried. Is there anything else I can do besides monitoring and hoping?
How long does it typically take for things to "work their way out of its system"?
Is it possible that Google is somehow accessing the old IP address (even though the DNS records for the domain have changed)? We left the old server alive and are planning to shut it down after the second site has been moved off of it.
Thanks,
Adam
-
Agreed; it could be an after-effect stemming from inbound URLs to your site from other sites. That's where the majority of the 404s I see in GWT come from (vs. being bad pages within my site).
Google probably isn't using the old sitemap if you gave them a new one. What could be happening is that it still needs to "reorganize" and reconcile your old and new URLs. The indexed pages don't just disappear overnight or get replaced immediately because of a sitemap change. Things have to work their way out of its system.
If there are specific URLs you want to try to remedy immediately, look into the GWT Remove URLs option under the Optimization section.
-
What I'd suggest doing is randomly reviewing some of those 404s that appear and checking whether they should indeed be 404s. Are there any bulk rules / wildcard 301s you can implement to redirect the traffic for 3-6 months?
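Wildcard rules like those would normally live in server configuration (e.g. Apache `RedirectMatch` / `RewriteRule` entries), but the matching logic can be sketched in Python. The URL patterns below are hypothetical stand-ins, not the poster's actual paths:

```python
import re

# Hypothetical bulk rules: each maps a legacy URL pattern to its new location.
# In production these would be equivalent RedirectMatch/RewriteRule lines.
BULK_RULES = [
    (re.compile(r"^/old-shop/product/(\d+)\.html$"), r"/products/\1"),
    (re.compile(r"^/articles/(\d{4})/(.+)$"), r"/blog/\1/\2"),
]

def resolve_redirect(path):
    """Return the 301 target for a legacy path, or None to let it 404."""
    for pattern, target in BULK_RULES:
        if pattern.match(path):
            return pattern.sub(target, path)
    return None
```

A handful of rules like these can often soak up thousands of reported 404s at once, which is why they are worth trying before chasing URLs one by one.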
These URLs are usually found via external links to your website. When you click into the detail of any of the reported 404s, it tells you what the error details are, whether the link is in the sitemap, and where it is linked from. You'll find that in most cases it's linked from somewhere. If it's an internal link, correct it. If it's external, do you think the webmaster might update it if you contact them, or is it easier to just set up a 301, retaining the SEO value?
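A rough sketch of that triage, assuming you've exported the reported 404s together with their "linked from" URLs (the pair of fields the error-detail view shows); the domain and sample rows here are made up:

```python
from urllib.parse import urlparse

# Hypothetical domain; substitute your own site's host.
SITE_HOST = "www.example.com"

def classify_404_sources(reported):
    """Split reported 404s into internally vs. externally linked URLs.

    `reported` holds (missing_url, linked_from_url) pairs - the two pieces
    of information the error-detail view shows for each 404.
    """
    internal, external = [], []
    for missing, linked_from in reported:
        bucket = internal if urlparse(linked_from).netloc == SITE_HOST else external
        bucket.append(missing)
    return internal, external

# Made-up sample rows:
reports = [
    ("http://www.example.com/old-page", "http://www.example.com/stale-index"),
    ("http://www.example.com/old-post", "http://someblog.example.org/links"),
]
internal, external = classify_404_sources(reports)
# Internal sources: fix the link on your own site.
# External sources: 301 the target, or ask the linking webmaster to update.
```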
I hope this helps.
Related Questions
-
Why are some pages indexed but not cached by Google?
The question is simple but I don't understand the answer. I found a webpage that was linking to my personal site. The page was indexed in Google. However, there was no cache option and I received a 404 from Google when I tried using cache:www.thewebpage.com/link/. What exactly does this mean? Also, does it have any negative implication on the SEO value of the link that points to my personal website?
Intermediate & Advanced SEO | mRELEVANCE
-
How to add subdomains to webmaster tools?
Can anyone help with how I add a subdomain to Webmaster Tools? Also, do I need to create a separate sitemap for each subdomain? Any help appreciated!
Intermediate & Advanced SEO | SamCUK
-
Is the Tool Forcing Sites to Link Out?
Hi, I have a tool that I wish to give to sites. It allows the user to get an accurate idea of their credit score without giving away any personal data and without having a credit search done on their file. Due to the way the tool works, and to make the implementation on other people's sites as simple as possible, the tool remains hosted by me, and a one-line piece of JavaScript code just needs to be added to the code of the site wishing to use the tool. This code includes a link to my site to call the information from my server to allow the tool to show and work on the other site. My questions are: Could this cause a problem with Google as far as their link quality goes? Are we forcing people to give us a backlink to use the tool (in the eyes of Google), or will Google not be able to read the JavaScript / ignore the link for SEO purposes? Should I make the link in the code nofollow? If I should make the link nofollow, any tips on how to make the most of the opportunity from a link building or SEO point of view? Thanks for your help
Intermediate & Advanced SEO | MotoringSEO
-
How to get around Google Removal tool not removing redirected and 404 pages? Or if you don't know the anchor text?
Hello! I can't get squat for an answer in the GWT forums. Should have brought this problem here first... The Google Removal Tool doesn't work when the original page you're trying to get recached redirects to another site. Google still reads the site as being okay, so there is no way for me to get the cache reset, since I don't know what text was previously on the page. For example, this: http://0creditbalancetransfer.com/article375451_influencial_search_results_for_.htm redirects to this: http://abacusmortgageloans.com/GuaranteedPersonaLoanCKBK.htm?hop=duc01996 I don't even know what was on the first page. And when it redirects, I have no way of telling Google to recache the page. It's almost as if the site got deindexed and they put in a redirect. Then there is crap like this: http://aniga.x90x.net/index.php?q=Recuperacion+Discos+Fujitsu+www.articulo.org/articulo/182/recuperacion_de_disco_duro_recuperar_datos_discos_duros_ii.html No links to my site are on there, yet Google's indexed links say that the page is linking to me. It isn't, but because I don't know HOW the page changed text-wise, I can't get the page recached. The tool also doesn't work when a page 404s. Google still reads the page as being active, but it isn't. What are my options? I literally have hundreds of such URLs. Thanks!
Intermediate & Advanced SEO | SeanGodier
-
Who is beating you on Google (after Penguin)?
Hi,
After about a month of Penguin and 1 update, I am starting to notice an annoying pattern as to who is beating me in the rankings on Google. I was wondering if anybody else has noticed this. The sites that are beating me - almost without exception - fall into these 2 categories. 1) Super sites that have little or nothing to do with the service I am offering. It is not their homepages that are beating me; in almost all cases they are simply pages hidden in their forums where somebody mentioned in passing something relating to what I do. 2) Nobodies. Sites that have absolutely no links back to them and look like they were made by a 5 year old. Has anybody else noticed this? I am just wondering whether this applies only to my sites or is a pattern across the web. Does this mean that for small sites to rank, it is now all about on-page SEO? If it is all about on-page, well that is great... much easier than link building. But I want to make sure others see the same thing before dedicating a lot of time to overhauling my sites and creating new content. Thanks!
Intermediate & Advanced SEO | rayvensoft
-
Fixing Google Places once Banned
I have a lot of clients who have somehow botched up their Google Places listing, and now are not showing up in local search results. In one particular case, they were using 2 different Gmail accounts and submitted their listing twice by accident. It appears Google has banned them from local search results. How does one undo steps like this and start fresh? Thanks!
Intermediate & Advanced SEO | ocsearch
-
400 errors and URL parameters in Google Webmaster Tools
On our website we do a lot of dynamic resizing of images, using a script that automatically resizes an image dependent on parameters in the URL, like: www.mysite.com/images/1234.jpg?width=100&height=200&cut=false In Webmaster Tools I have noticed there are a lot of 400 errors on these image URLs. Also, when I click the URLs listed as causing the errors, they are URL-encoded and go to pages like this (which give a bad request): www.mysite.com/images/1234.jpg?%3Fwidth%3D100%26height%3D200%26cut%3Dfalse What are your thoughts on what I should do to stop this? I notice in my Webmaster Tools "URL Parameters" section there are parameters for height, width, and cut, which must come from the image URLs. These are currently set to "Let Google Decide", but should I change them manually to "Doesn't affect page content"? Thanks in advance
Intermediate & Advanced SEO | James77
-
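The broken URLs quoted in that question look exactly like a query string that was percent-encoded a second time somewhere, including its `?`, `=`, and `&` delimiters, so the server sees one opaque parameter name instead of three parameters. A quick way to confirm this with Python's standard library:

```python
from urllib.parse import quote, unquote

original = "?width=100&height=200&cut=false"

# Percent-encoding the query string *including* its delimiters reproduces
# the malformed form seen in the crawl errors:
encoded = quote(original, safe="")

# unquote() recovers the intended query string:
decoded = unquote(encoded)
```

If the encoded form matches what the error report shows, the fix is to find and remove the extra encoding step (often a template or link-building helper escaping an already-built URL) rather than to change the URL Parameters settings.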
Links on Google Notebook
I have used OSE to look at the links of a competitor's site and noticed they have dozens of links from Google Notebook pages, e.g. http://www.google.pl/notebook/public/05275990022886032509/BDQExDQoQs8r3ls4j This page has a PA of 48. Is this a legitimate linking strategy?
Intermediate & Advanced SEO | seanmccauley