Crawl errors for pages that no longer exist
-
Hey folks,
I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense.
The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Others list old, removed pages in the "Linked From" list.
Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about?
Thanks!
-
Thanks for the question, this can definitely be annoying for webmasters!
Unfortunately, bots can't do everything in parallel. They have to work in steps...
Step 1. Take List #1 of links.
Step 2. Crawl those links and build List #2.
Step 3. Crawl List #2 and build List #3.
Now, it doesn't always follow that same order. Let's say that in Step 3 it finds a bunch of pages with unique content. The next time around, it may go check some of those links from Step 3 without first verifying that they are still linked. Why start the crawl all the way from the beginning again when you already have a big list of URLs?
But this creates a problem. When some of those links it crawled in Step 3 aren't there anymore, Google will tell you they're missing and report how it originally found them (which happened to be from a page in List #1). But what if Google hasn't rechecked that link in List #1 recently? What if you just removed it too?
Well, for a little while, at least, you will end up with errors.
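To make the idea concrete, here's a minimal, hypothetical sketch (this is not Google's actual crawler, and the page names are made up) of how a crawl frontier that remembers where each URL was first discovered can keep reporting a stale "linked from" source even after both pages are gone:

```python
# The "web" at crawl time: each page mapped to the links it contains.
site = {
    "/": ["/old-page-a", "/old-page-b"],
    "/old-page-a": ["/old-page-c"],
}

# Steps 1-2: crawl and record where each URL was first discovered.
discovered = {}  # url -> page it was first linked from
for source, links in site.items():
    for url in links:
        discovered.setdefault(url, source)

# The site owner now removes the old pages AND the links to them.
site = {"/": []}

# Later recrawl: the bot works from its saved list, not a fresh crawl,
# so removed URLs are reported with their original (now stale) source.
errors = [(url, src) for url, src in discovered.items() if url not in site]
for url, linked_from in errors:
    print(f"404: {url} (linked from {linked_from})")
```

Note that `/old-page-c` gets reported as linked from `/old-page-a`, even though `/old-page-a` itself no longer exists — exactly the situation described in the question.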
Now, here's the real rub: how long will it take Google to find and correct that message in the crawl report? Days? Weeks? Months? Who knows. Your best bet is to mark the errors as fixed and force Google to keep rechecking. Eventually, it will figure things out.
TL;DR: it's a data-freshness and reporting issue that isn't your fault and isn't worth your time.
-
No - Google is just showing how slow it is at updating data in Webmaster Tools.
Don't worry - if you wait long enough, they'll go away. You could also mark them as solved (only do this if you are sure that no links point to these pages - to check that your internal linking is OK, Screaming Frog is a great tool).
Dirk