How to handle "Not found" crawl errors?
-
I'm using Google Webmaster Tools and can see a number of "Not found" crawl errors. I have set up a custom 404 page for all broken links; you can see it here:
http://www.vistastores.com/404
But I have a question about it:
Do I need to set up 301 redirects for the broken links found in Google Webmaster Tools?
-
I agree with Ben on this one. There are plenty of 404s caused by scraper sites that aren't worth my time, especially on big sites.
Also, redirects aren't the only tool available. There are plenty of other ways to fix GWT 404 errors, particularly if there is a fundamental problem beyond the link in question.
-
Hi Commerce, I recently came across a blog post on this topic on Google's Webmaster Central blog; it covers most of the questions around 404 errors.
Generally speaking:
- If these are pages that you removed, then the 404 HTTP result code is fine.
- If these are pages that changed addresses, then you should 301 redirect to the new addresses. How you do this depends on your setup, for Apache-servers you may be able to use the .htaccess file for this.
- Unless these are pages that used to receive a lot of traffic from search, these 404s won't be the reason for your site's traffic dropping like that. Google understands that the web changes and that URLs disappear - that is not a reason for Google to stop showing your site.
So my recommendation would be to check the URLs that are listed as 404 crawl errors. If any are important, then set up redirects to the appropriate new URLs as soon as you can. If none of them are important, then keep this in mind as something worth cleaning up when you have time, but focus on the rest of your site first. Often drastic drops in traffic are due more to the general quality of the website, so that's what I'd recommend working on first.
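For the Apache case mentioned above, a minimal .htaccess sketch might look like this (the paths and domain are made-up examples, not real URLs from the site):

```apache
# Permanently (301) redirect a single moved page to its new address
Redirect 301 /market-umbrellas/old-product.html http://www.example.com/patio-umbrellas/new-product.html

# Or, with mod_rewrite enabled, map a whole renamed section in one rule
RewriteEngine On
RewriteRule ^market-umbrellas/(.*)$ /patio-umbrellas/$1 [R=301,L]
```

A 301 tells Google the move is permanent, so link equity pointing at the old URL should be passed along to the new one.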
For more details, refer to How to Fix Crawl Errors.
I hope this answers your question.
-
Makes sense - in which case the homepage might not be the best place for you.
Another option for the custom 404 which works well in certain circumstances is to add a dynamic element to it.
For example, we know the referring URL has reference to product XYZ which may now be unavailable, but perhaps we can dynamically pull in other relevant products into the 404 page.
That's something I am looking to do with hotels that become unavailable - pull a dynamic element into the 404 page that recommends other hotels close by.
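A minimal sketch of that idea in Python (the catalogue and the word-overlap scoring below are illustrative assumptions, not how any particular site does it):

```python
# Sketch: given the slug of a dead product URL, suggest live products
# whose slugs share words with it, for display on the custom 404 page.
# The catalogue is a made-up example.

CATALOGUE = {
    "/products/garden-market-umbrella-9ft": "Garden Market Umbrella 9ft",
    "/products/patio-market-umbrella-11ft": "Patio Market Umbrella 11ft",
    "/products/table-lamp-brass": "Brass Table Lamp",
}

def related_products(dead_path, catalogue=CATALOGUE, limit=3):
    """Rank live product URLs by how many slug words they share
    with the requested (now dead) URL."""
    dead_words = set(dead_path.strip("/").split("/")[-1].split("-"))
    scored = []
    for path, name in catalogue.items():
        live_words = set(path.strip("/").split("/")[-1].split("-"))
        overlap = len(dead_words & live_words)
        if overlap:
            scored.append((overlap, path, name))
    scored.sort(reverse=True)  # most-overlapping products first
    return [(path, name) for _, path, name in scored[:limit]]
```

The 404 handler would then render these suggestions instead of a dead end, keeping both users and crawlers on live pages.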
-
Well, I would have to disagree with that principle. Sometimes you have to think a little broader than just SEO and ask yourself if it really makes commercial sense to redirect everything.
That's why I put a financial cost against each unique redirect. At the end of the day it requires someone to action it and that person has a cost associated with their time that may be better allocated working on something that will actually drive business uplift or improve customer experience.
Each to their own, of course, but I see a lot of SEOs who don't think big picture and end up using developer resource on work that has no impact. It just p!sses people off, in my experience.
-
Hi Ben,
I agree with you that some links are not worth redirecting. However, in my experience a dead link never comes alone. Often there is some kind of reason that the link was created, and there might be others you don't know about.
For this reason I usually recommend redirecting all broken links, even if the individual link is not worth the trouble. Obviously there are exceptions to this rule, but most of the time it's worth your trouble.
Sven
-
Good to know! But I've had a very bad experience redirecting strong pages to the home page. I removed many product pages for market umbrellas from my website and redirected them to the home page, because I didn't have a specific landing page or inner-level page for them. As a result, I've seen rankings shift for specific keywords: my home page now ranks well for the "market umbrellas" keyword, because many external pages linked to my product pages with that keyword. It has also hurt the ranking for the keyword I'm actually targeting with my home page.
-
Yeah, which is basically what Kane is saying as well. If you don't have an appropriate internal page, you could send the 301 redirect to your homepage, or, if it was a specific product, you might want to redirect it to the parent or child category.
If it's a particularly strong URL that has been linked to from many good external sources, you might consider adding a replacement content page and redirecting to that.
Ben
-
Hi Ben,
I got your point. If my page is linked from an external page that has good value (good PageRank or a heavy amount of traffic), then I need to redirect it to a specific internal page to preserve the flow of PageRank. Right?
-
Hopefully I am understanding your question correctly here....
The main benefit of a custom 404 page, aside from the obvious improvement to user experience, is that you provide additional links into content that otherwise wouldn't necessarily be available to the search bots.
In essence if you just had a standard 404 error page you'd send the search bots to a dead page where their only decision would be to leave the domain and go elsewhere.
Regarding setting up 301 redirects, I like to associate a cost with each one. Imagine the time it will take you or someone else to set each redirect up (say $5 per redirect). Then consider the following:
Is the URL that is 404ing worth redirecting?
(1) Does it hold some residual SEO value? That is, is it present on external sites that are driving link equity, and if so, can you redirect that equity somewhere more valuable?
(2) Is the URL present on an external site driving referral traffic? If so, do you have a new content page that will still match the user's intent?
If the URLs that are 404ing have no real link equity associated with them, and/or you don't have a genuinely useful page to redirect the user to, then I would just let them hit the 404 page.
If in doubt, put yourself in the user's boots and ask whether the set-up you have done offers a valuable experience. There's no point redirecting a user to something totally irrelevant to their original intent - it'll just p!ss them off most of the time and increase your bounce rate.
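The cost-based triage above could be sketched in a few lines of Python (the field names and the $5 figure are illustrative assumptions; a real Webmaster Tools export looks different):

```python
COST_PER_REDIRECT = 5.00  # rough cost of someone's time per redirect

def triage(report):
    """Decide whether one 404ing URL is worth a 301 redirect.

    report: dict with 'external_links' (external pages driving link
    equity), 'monthly_referrals' (visits still arriving via the dead
    URL), and 'replacement' (a genuinely relevant live URL, or None).
    """
    has_equity = report["external_links"] > 0
    has_traffic = report["monthly_referrals"] > 0
    # Redirect only when there is something to preserve AND a
    # relevant page to send people to; otherwise let it 404.
    if (has_equity or has_traffic) and report["replacement"]:
        return "redirect"
    return "leave as 404"

def redirect_cost(reports):
    """Estimated cost of fixing only the URLs worth redirecting."""
    return COST_PER_REDIRECT * sum(
        1 for r in reports if triage(r) == "redirect"
    )
```

Putting a number on the work makes it easier to argue for leaving low-value 404s alone and spending developer time where it drives uplift.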
-
If there is a link pointing at that 404 page, then I will almost always 301 it to regain that link value. If I control the source of the link, I'll change that instead. If the link is from a spammy or junky website, I don't worry about it.
Here is a worthwhile article on how to go about fixing GWT crawl errors:
http://www.seomoz.org/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
I would suggest adding more content to your 404 page. Try to help people find what they're looking for by suggesting common pages, product segments, etc.
Related Questions
-
Crawl errors - 2,513 not found. Response code 404
Hi,
I've just inherited a website that I'll be looking after. I've looked in the Crawl errors section of Search Console and discovered thousands of URLs that point to non-existent pages on Desktop; there are 1,128 on Smartphone. Some are odd and make no sense, for example: bdfqgnnl-z3543-qh-i39634-imbbfuceonkqrihpbptd/. Not sure why these are occurring, but what's the best way to deal with them to improve our SEO?
| 1 | northeast/ | 404 | 8/29/18 |
| 2 | blog/2016/06/27/top-tips-for-getting-started-with-the-new-computing-curriculum/ | 404 | 8/10/18 |
| 3 | eastmidlands | 404 | 8/21/18 |
| 4 | eastmidlands/partner-schools/pingle-school/ | 404 | 8/27/18 |
| 5 | z3540-hyhyxmw-i18967-fr/ | 404 | 8/19/18 |
| 6 | northeast/jobs/maths-teacher-4/ | 404 | 8/24/18 |
| 7 | qfscmpp-z3539-i967-mw/ | 404 | 8/29/18 |
| 8 | manchester/jobs/history-teacher/ | 404 | 8/5/18 |
| 9 | eastmidlands/jobs/geography-teacher-4/ | 404 | 8/30/18 |
| 10 | resources | 404 | 8/26/18 |
| 11 | blog/2016/03/01/world-book-day-how-can-you-get-your-pupils-involved/ | 404 | 8/31/18 |
| 12 | onxhtltpudgjhs-z3548-i4967-mnwacunkyaduobb/ | 404 | |
Cheers. Thanks in advance,
James
Technical SEO | | JamesHancocks1 -
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. Looking through their Webmaster Tools, they have about 3,000 duplicate title tags. This is due to the way pagination is set up on their site, for example: domain.com/books-in-english?page=1 vs. domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing, because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂
Technical SEO | | J-Banz0 -
404s in GWT - Not sure how they are being found
We have been getting multiple 404 errors in GWT that look like this: http://www.example.com/UpdateCart. The problem is that this is not a URL that is part of our structure, it is only a piece. The actual URL has a query string on the end, so if you take the query string off, the page does not work. I can't figure out how Google is finding these pages. Could it be removing the query string? Thanks.
Technical SEO | | Colbys0 -
How do you handle Wordpress sitemaps within your site?
I have a regular site map on my site and I also have a Wordpress site installed within it that we use for blog/news content. I currently have an auto-sitemap generator installed in Wordpress which automatically updates the sitemap and submits it to the search engines each time the blog is updated. The question I have (which I think I know the answer to but I just want to confirm) is do I have to include all of the articles within the blog in the main site's sitemap despite the Wordpress sitemap having them in there already? If I do include the articles in the main website's sitemap, they would also be in the Wordpress sitemap as well, which is redundant. Redundancy is not good, so I just want to make sure.
Technical SEO | | iresqkeith0 -
What could be the cause of this duplicate content error?
I only have one index.htm and I'm seeing a duplicate content error. What could be causing this? IUJvfZE.png
Technical SEO | | ScottMcPherson1 -
Are 404 Errors a bad thing?
Good Morning... I am trying to clean up my e-commerce site, and I created a lot of new categories for my parts. I've made the old category pages (which have had their content removed) "hidden" to anyone who visits the site and starts browsing. The only way you could get to those "hidden" pages is either by knowing the URLs that I used to use, or if for some reason one of them still shows up in Google. Since I'm trying to clean up the site and get rid of any duplicate content issues, would I be better served by adding those "hidden" pages that don't have much or any content to the robots.txt file, or should I just deactivate them, so that even if you type the old URL you will get a 404 page? In this case, are 404 pages bad? You're typically not going to find those pages in the SERPs, so the only way you'd land on these 404 pages is to know the old URL I was using that has been disabled. Please let me know if you guys think I should be 404'ing them or adding them to robots.txt. Thanks
Technical SEO | | Prime850 -
Lots of overdynamic URL and crawl errors..
Just wanted some advice. The SEOmoz crawl found about 18,000 errors. The error URLs are mainly URLs like the one below, which seem to be the registration URL with a redirect on it, going back to the product after registration: http://www.DOMAIN.com/index.php?_g=co&_a=reg&redir=/index.php?_a=viewProd%26productId=3465 We have the following line in the robots file to stop the login page from being crawled: Disallow: /index.php?act=login If I add the following, will it stop the error? Disallow: /index.php?act=reg Thanks in advance.
Technical SEO | | filarinskis0 -
Crawl Errors In Webmaster Tools
Hi Guys, I've searched the web for an answer on the importance of crawl errors in Webmaster Tools but keep coming up with different answers. I have been working on a client's site for the last two months (and have just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous guy that did the site. The site is currently at page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6. The only issue I can seem to find with the site in Webmaster Tools is crawl errors. Here are the stats: In sitemaps: 123. Not found: 2,079. Restricted by robots.txt: 1. Unreachable: 2. I have read that ecommerce sites can often give off false negatives in terms of crawl errors from Google; however, these "not found" crawl errors are being linked from pages within the site. How have others solved the issue of crawl errors on ecommerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche where I need to be patient? Kind Regards Neil
Technical SEO | | optimiz10