Thousands of 503 Errors
-
I was just checking Google Webmaster Tools for one of the first times (I know this should have been a regular habit).
I noticed that on Feb 8th we had almost 80K errors of type 503. This is obviously very alarming because as far as I know our site was up and available that whole day. This makes me wonder if there is a firewall issue or something else that I'm not aware of.
Any ideas for the best way to determine what's causing this?
Thanks,
Chris
-
Cyrus,
Thanks for the props, and also for the crawl delay link. I wish I could say I knew about it before this answer, but I didn't; cool stuff for bigger, frequently updated sites.
Always appreciate what you have to say as I learn a lot from you.
Best
-
Hi Chris,
This is a really hard problem to diagnose from the outside like this, so I'll just give you my thoughts.
1. Are the URLs throwing the 503 errors real pages? Can they be accessed normally by human visitors through the site? I only mention this because sometimes you get software generating a bunch of random links that go nowhere, and weird stuff starts to happen when Google crawls those URLs. Normally you'd see these show up as 404s, however.
2. Is the date in Google Webmaster Tools for the 503 errors recent? Sometimes they log those for a long time after the problem is actually solved, especially for URLs they don't visit much.
3. How often does your site go down?
4. Try performing a "Fetch as Googlebot" test on some of the affected URLs.
5. I doubt Googlebot is crashing your site, but you could always try a crawl delay.
6. If nothing else, you'll find the problem at the serving/hosting level. Can't be much help there, unfortunately.
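One rough way to approximate point 4 from your own machine is to request a page with Googlebot's User-Agent string and compare the status code against what a normal browser UA gets. This won't catch servers that verify Googlebot by IP or reverse DNS, but it will expose simple UA-based misbehavior. A minimal sketch; the URL is a placeholder, not a real page:

```python
import urllib.error
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Firefox/120.0"

def status_for(url, user_agent, timeout=10):
    """Return the HTTP status the server sends for a given User-Agent,
    or None if the host could not be reached at all."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code      # server answered with an error status (e.g. 503)
    except urllib.error.URLError:
        return None        # DNS/connection failure: host unreachable

if __name__ == "__main__":
    url = "https://www.example.com/some-page"  # placeholder URL
    print("as Googlebot:", status_for(url, GOOGLEBOT_UA))
    print("as browser:  ", status_for(url, BROWSER_UA))
```

If the two status codes differ, the server (or a firewall/CDN in front of it) is treating bot traffic specially, which narrows the search considerably.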
-
It turns out that the Magento patch did NOT fix the problem. We are still receiving tens of thousands of 503 errors when Googlebot requests a page. The site is not down. I can look in the access_log and see that the request was responded to with a 503 error.
Any ideas? This has to be killing our chances for organic traffic until it gets resolved.
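Since the 503s are visible in the access_log, one way to quantify the problem is to tally the 503 responses served to Googlebot, broken down by URL, and see whether they cluster on particular pages. A minimal sketch assuming a standard combined log format; the sample lines below are made up for illustration:

```python
import re
from collections import Counter

# Matches the pieces of a combined-format access_log line we care about:
# request path, status code, and user agent.
LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) '
    r'\S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def googlebot_503s(log_lines):
    """Count 503 responses served to Googlebot, per request path."""
    counts = Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "503" and "Googlebot" in m.group("ua"):
            counts[m.group("path")] += 1
    return counts

# Made-up sample lines in combined log format:
sample = [
    '66.249.66.1 - - [08/Feb/2014:10:00:00 +0000] "GET /product/widget HTTP/1.1" '
    '503 312 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [08/Feb/2014:10:00:02 +0000] "GET /product/widget HTTP/1.1" '
    '200 5120 "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]
print(googlebot_503s(sample))
```

Run against the real log (e.g. `googlebot_503s(open("access_log"))`), this shows whether the 503s hit every URL Googlebot requests or only a subset, which is exactly the kind of pattern that points at middleware like Magento rather than the host being down.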
-
Hi Robert,
Thanks for the response. It turns out that this is due to a bug in our e-commerce software, Magento, that results in Googlebot not being handled correctly. Apparently there's a patch being tested now.
Thanks,
Chris
-
Do you know where you are hosted? Have you called them to see if the server is down, or intermittently down?
Here is a how-to-resolve link.
Look at the bottom and follow the directions on using the Wayback Machine to see if the issue is temporary or the server is down for maintenance.
That said, if you give us a URL, it is easier to assist you.
Best, let us know.
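For the Wayback Machine check mentioned above, the Internet Archive exposes an availability endpoint (`http://archive.org/wayback/available?url=<your-url>`) that returns the closest archived snapshot as JSON. A small sketch that parses that response; the response body here is a made-up sample, not live data:

```python
import json

# Hypothetical sample of a Wayback Machine availability API response.
sample_response = json.loads("""
{
  "archived_snapshots": {
    "closest": {
      "available": true,
      "status": "200",
      "timestamp": "20140208093000",
      "url": "http://web.archive.org/web/20140208093000/http://www.example.com/"
    }
  }
}
""")

def latest_snapshot(resp):
    """Return (timestamp, archive_url) of the closest available snapshot,
    or None if nothing is archived."""
    closest = resp.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["timestamp"], closest["url"]
    return None

print(latest_snapshot(sample_response))
```

If snapshots exist from the dates in question, the site was reachable then, which supports the theory that the 503s were served selectively to Googlebot rather than caused by an outage.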