2.3 million 404s in GWT - learn to live with 'em?
-
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That number has grown steadily since; we're now at 2.3 million 404s in GWT.
Based on what I’ve been able to determine, links tied to the data feed on this site generally break for one of two reasons: the page just doesn’t exist anymore (i.e. it wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just under a different link). On other sites I’ve worked on, 404s weren’t a big deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file.
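(Side note: the one automated approach I haven't completely ruled out is Apache's RewriteMap, which has to live in the server or virtual-host config rather than htaccess and looks old URLs up in an external map file, so the config itself stays tiny no matter how many pairs you have. A rough sketch, assuming server-config access and a tab-separated map file at a hypothetical path:)

```apache
# In the virtual-host config (RewriteMap is not allowed in .htaccess).
# listing-redirects.txt holds one "old-path<TAB>new-path" pair per line.
RewriteEngine On
RewriteMap listingmap "txt:/etc/apache2/listing-redirects.txt"

# Only redirect if the requested listing path has an entry in the map.
RewriteCond "${listingmap:$1|NOT_FOUND}" "!NOT_FOUND"
RewriteRule "^/listing/(.*)$" "${listingmap:$1}" [L,R=301]
```

For very large maps, Apache can also read a hashed dbm: version of the file, which avoids scanning the text file on every request.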
Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site’s indexation or rankings. And the more I think about it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time as product pages come and go. Bottom line: if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we’ll just have to put up with a certain number of broken links on an ongoing basis.
So here’s where my thought process is leading:
- Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
- Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
- Watch the overall trend of 404s in GWT. At least make sure they don’t increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
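To watch that trend proactively rather than waiting for GWT to report it, the old and new sitemaps could be diffed at each refresh to see exactly which URLs are about to start 404ing. A rough sketch, assuming standard sitemap XML (contents inlined here for illustration; in practice you'd read the real sitemap files):

```python
# Diff the URL sets of the old and new XML sitemaps to find pages that
# will 404 after a refresh, and pages that are newly added.
import xml.etree.ElementTree as ET

LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs in a sitemap document."""
    return {el.text.strip() for el in ET.fromstring(xml_text).iter(LOC)}

OLD = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/listing/a</loc></url>
  <url><loc>http://example.com/listing/b</loc></url>
</urlset>"""
NEW = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/listing/b</loc></url>
  <url><loc>http://example.com/listing/c</loc></url>
</urlset>"""

dropped = sitemap_urls(OLD) - sitemap_urls(NEW)  # will 404 after the refresh
added = sitemap_urls(NEW) - sitemap_urls(OLD)    # new pages to be crawled
print(sorted(dropped))
```

The `dropped` set is also a ready-made candidate list for deciding which pages deserve a hand-picked 301.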
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.
Thoughts? If you think I’m off base, please set me straight.
-
I was actually thinking about some type of wildcard rule in htaccess. This might actually do the trick! Thanks for the response!
-
Hi,
Sounds like you’ve taken on a massive job with 12.5 million pages, but I think you can implement a simple fix to get things started.
You’re right to think about that sitemap: make sure it’s being dynamically updated as the data refreshes; otherwise it will be responsible for a lot of your 404s.
I understand you don’t want to add 2.3 million separate redirects to your htaccess, so what about a simple rule: if the request starts with /listing/ (one of your directory pages) and is not an existing file or directory, redirect back to the homepage. Something like this:

# Does the request start with /listing/ (or whatever structure you are using)?
RewriteCond %{REQUEST_URI} ^/listing/ [NC]
# Is it NOT a file and NOT a directory?
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# All conditions true? Redirect.
RewriteRule .* / [L,R=301]

This way you can specify a certain URL structure for the pages which tend to turn into 404s. Any 404s outside of that rule will still serve a 404 code and show your 404 page, so you can fix those manually, but the pages which tend to disappear can all be redirected back to the homepage if they’re not found.
You could still implement your 301s for important pages, or simply recreate a page if it’s worth doing so, but you will have dealt with a large chunk of your non-existing pages.
It’s a big job and those missing pages are only part of it, but this should help you sift through all of the data to get to the important bits – you can mark a lot of URLs as fixed and start giving your attention to the important pages which need some work.
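If you do hand-pick redirects for the important pages, a few lines of script can turn a spreadsheet export into the htaccess lines for you, so nobody has to type them out. A rough sketch, assuming a CSV of old-path,new-path pairs (the paths here are made up):

```python
# Generate Apache Redirect directives from old-path,new-path CSV rows.
import csv
import io

def redirect_lines(csv_text):
    """One 'Redirect 301 old new' line per CSV row."""
    rows = csv.reader(io.StringIO(csv_text))
    return ["Redirect 301 %s %s" % (old, new) for old, new in rows]

sample = "/listing/old-cafe,/listing/new-cafe\n/listing/old-bar,/listing/new-bar\n"
for line in redirect_lines(sample):
    print(line)
```

Paste the output above your catch-all rule so the specific redirects win before the homepage fallback kicks in.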
Hope that helps,
Tom