Handling a Huge Amount of Crawl Errors
-
Hi all,
I am facing a crawl-error issue on a huge site (>1 million pages) for which I am doing an on-page audit.
- 404 errors: >80,000
- Soft 404 errors: 300
- 500 errors: 1,600

All of the above are reported in GWT (Google Webmaster Tools).
Many of the error URLs are simply not present on the pages they are reported as "linked from". I investigated a sample of those pages (and their source) looking for any trace of the error links, and found nothing.
What would be the right way to address this from an SEO perspective? Clearly, I am not in a position to investigate the root causes, since I only see the generated HTML and not what's behind it.
So my question is: generally, what is the appropriate way of handling this?
- Telling the client that he has to investigate it himself (I did my best to at least report the errors), or
- Engaging my firm further and getting a developer on my side to investigate?
Thanks in advance!!
-
-
Usually an on-page audit lists all of the problems and the likely reasons they are happening, not in-depth information on how to fix every issue. That is usually the next phase: "do you want me to work on the site, or do you want your dev team to track down the cause of the issues and fix them?"
It also depends on what type of contract you have with the client, of course.
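Before escalating either way, a quick automated spot check of GWT's "linked from" reports can confirm what the original poster found by hand: whether the broken links actually exist in the HTML being served today (GWT reports are often stale, or the links may come from old sitemaps or external sites). A minimal sketch using only the Python standard library; the URLs and the `check_linked_from` helper are illustrative, not taken from the thread:

```python
# Spot-check a GWT-reported 404: fetch the page it is reportedly
# "linked from" and verify whether the broken URL appears as a link
# target in the served HTML. URLs below are hypothetical examples.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class LinkCollector(HTMLParser):
    """Collects every href value found in <a> tags."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def link_present(html, error_url):
    """Return True if error_url appears as a link target in the HTML."""
    parser = LinkCollector()
    parser.feed(html)
    return any(error_url in href for href in parser.hrefs)


def check_linked_from(error_url, linked_from_url):
    """Fetch the 'linked from' page and report whether the link exists."""
    req = Request(linked_from_url, headers={"User-Agent": "audit-spot-check"})
    html = urlopen(req, timeout=10).read().decode("utf-8", errors="replace")
    return link_present(html, error_url)


# Example (hypothetical URLs):
# check_linked_from("http://example.com/old-page",
#                   "http://example.com/category/")
```

If the link is absent from the live HTML of every sampled "linked from" page, that is strong evidence the reports are stale or the links originate elsewhere (old sitemaps, external sites, or client-side-generated markup), which is itself a useful finding to hand to the client or a developer.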