Can we retrieve all 404 pages of my site?
-
Hi,
Can we retrieve all 404 pages of my site?
Is there any syntax I can use in Google search to list just the pages that give a 404?
Or is there a tool/site that can scan all the pages in the Google index and give me this report?
Thanks
-
The 404s in Webmaster Tools relate to crawl errors, so they will only appear if the pages are internally linked. The report is also limited to the top 1,000 pages with errors.
-
Set up a Webmaster Tools account for your site. You should be able to see all the 404 error URLs.
-
I wouldn't try to manually remove that number of URLs. Mass individual removals can cause their own problems.
If the pages are 404ing correctly, then they will be removed. However it is a slow process. For the number you are looking at it will most likely take months. Google has to recrawl all of the URLs before it even knows that they are returning a 404 status. It will then likely wait a while and do it again before removing them. That's a painful truth and there really is not much you can do about it.
It might (and this is very arguable) be worth ensuring that there is a crawl path to the 404 content. So maybe a link from a high-authority page to a "recently removed content" list that contains links to a selection, and keep replacing that list. This will help that content get recrawled more quickly, but it will also mean that you are linking to 404 pages, which might send poor quality signals. Something to weigh up.
What would work more quickly is to mass remove particular directories (if you are lucky enough that some of your content fits that pattern). If you have a lot of URLs in mysite.com/olddirectory and there is definitely nothing you want to keep in that directory, then you can lose big swathes of URLs in one hit - see here: https://support.google.com/webmasters/answer/1663427?hl=en
Unfortunately that is only good for directories, not wildcards. However it's very helpful when it is an option.
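If you already have a list of known 404 URLs (from your server logs, for example), a quick way to spot whether any whole directories qualify for that kind of bulk removal is to group the URLs by their first path segment. A minimal Python sketch; the URL list here is invented for illustration:

```python
from collections import Counter
from urllib.parse import urlparse

def count_by_directory(urls):
    """Count 404 URLs by their first path segment, so directories
    that account for big swathes of errors stand out."""
    counts = Counter()
    for url in urls:
        path = urlparse(url).path
        segments = [s for s in path.split("/") if s]
        top = "/" + segments[0] + "/" if segments else "/"
        counts[top] += 1
    return counts

# Hypothetical list of URLs known to return 404:
dead = [
    "http://mysite.com/olddirectory/page1.html",
    "http://mysite.com/olddirectory/page2.html",
    "http://mysite.com/olddirectory/sub/page3.html",
    "http://mysite.com/blog/gone.html",
]
for directory, n in count_by_directory(dead).most_common():
    print(directory, n)
```

A directory that accounts for most of the dead URLs, and contains nothing you want to keep, is a candidate for a single directory-level removal request.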
So, how to find those URLs? (Your original question!!).
Unfortunately there is no way to get them all back from Google. Even if you did a search for site:www.mysite.com and saved all of the results, it would not return anywhere near the number of results that you are looking for.
I tend to do this by looking for patterns and removing those to find more patterns. I'll try to explain:
- Search for site:www.yoursite.com
- Scroll down the list until you start seeing a pattern (e.g. mysite.com/olddynamicpage-111.php, mysite.com/olddynamicpage-112.php, mysite.com/olddynamicpage-185.php, etc.).
- Note that pattern (return to it later to check that those URLs all return a 404).
- Now search again with that pattern removed, site:www.mysite.com -inurl:olddynamicpage
- Return to step 2
Do this (a lot) and you start to understand the patterns that have been picked up. There are usually a few that account for a large number of the incorrectly indexed URLs. In the recent problem I worked on, they were almost all related to "faceted search gone wrong".
Once you know the patterns you can check that the correct headers are being returned, so that those URLs start dropping out of the index. If any are directory patterns then you can remove them in big hits through GWMT.
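The query-building part of that loop is mechanical enough to script. A small sketch, assuming you record each pattern as you spot it (the patterns below are invented examples):

```python
def build_search_query(domain, excluded_patterns):
    """Build the next Google query for the site:/-inurl: loop:
    show indexed pages minus every pattern already identified."""
    parts = ["site:" + domain]
    parts += ["-inurl:" + p for p in excluded_patterns]
    return " ".join(parts)

patterns = []
patterns.append("olddynamicpage")   # spotted in the first pass
patterns.append("facet=")           # spotted in the second pass
print(build_search_query("www.mysite.com", patterns))
# site:www.mysite.com -inurl:olddynamicpage -inurl:facet=
```

Each pass you paste the generated query into Google, spot the next pattern in the results, append it to the list, and repeat.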
It's painful. It's slow, but it does work.
-
Yes - exactly. I need to know which of the Google-indexed pages are 404.
As Google does not remove the dead 404 pages for months, I was thinking of manually adding them for removal in Webmaster Tools, but I need to find all of the pages that are indexed but 404.
-
OK - that is a bit of a different problem (and a rather familiar one). So the aim is to figure out what the 330 "phantom" pages are and then how to remove them?
Let me know if I have that right. If I have, then I'll give you some tips based on me doing the same with a few million URLs recently. I'll check first though, as it might get long!
-
Thank you.
I will try explaining my query again, and you can correct me if the above is still the solution:
1. My site has 70K pages.
2. Google has indexed 500K pages from the site.
Site:mysitename shows this.
We have noindexed most of them, etc., which has got the count down to 300K.
Now I want to find the pages that show a 404 for our site, checking those 300K pages.
Webmaster Tools shows a few hundred as 404, but I am sure there are many more.
Can we scan the index, rather than the site, to find the ones the Google search engine has indexed that are 404?
-
As you say, on-site crawlers such as Xenu & Screaming Frog will only tell you when you are linking to 404 pages, not where people are linking to your 404 pages.
There are a few ways you can get to this data:
Your server logs: All 404 errors will be recorded on your server. If someone links to a non-existent page and that link is ever followed by a single user or a crawler like Googlebot, it will be recorded in your server log files. You can access those directly (or pull 404s out of them on a regular, automatic basis). Alternatively, most hosting comes with some form of log analysis built in (AWStats being one of the most common). That will show you the 404 errors.
That isn't quite what you asked, as it doesn't mean that they have all been indexed, however that will be an exhaustive list that you can then check against.
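To give a feel for pulling the 404s straight out of raw logs, here is a rough Python sketch assuming Apache/Nginx combined log format; the regex and sample lines are illustrative, so adjust them to your own format:

```python
import re
from collections import Counter

# Matches the request and status fields of a combined-format log line.
LOG_LINE = re.compile(r'"(?:GET|HEAD|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) ')

def extract_404s(lines):
    """Return a Counter of request paths that were answered with a 404."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits

# Two made-up log lines for illustration:
sample = [
    '1.2.3.4 - - [10/Oct/2014:13:55:36 +0000] "GET /old-page.html HTTP/1.1" 404 512 "-" "Googlebot/2.1"',
    '1.2.3.4 - - [10/Oct/2014:13:55:40 +0000] "GET /index.html HTTP/1.1" 200 4096 "-" "Mozilla/5.0"',
]
print(extract_404s(sample))
```

Run over a few weeks of logs, the counts also tell you which dead URLs are being requested most (and by which user agent, if you capture that too), which helps with prioritising.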
Check that backlinks resolve: Download all of your backlinks (OSE, Webmaster Tools, Ahrefs, Majestic), look at the target and see what header is returned. We use a custom-built tool called Linkwatchman to do this on an automatic, regular basis. However, as an occasional check you can download them into Excel and use the excellent SEO Tools for Excel to do this for free. ( http://nielsbosma.se/projects/seotools/ <- best SEO tool around)
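For a rough sketch of the "check what header each backlink target returns" step: the Python below keeps the HTTP call swappable so you can test the logic without hitting the network (Linkwatchman is their internal tool; everything here is illustrative):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def fetch_status(url):
    """Return the HTTP status code for a URL via a HEAD request."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as e:
        return e.code

def find_dead_targets(backlink_targets, get_status=fetch_status):
    """Given target URLs pulled from a backlink export, return those
    answering 404 - i.e. places where inbound links hit dead pages."""
    return [u for u in backlink_targets if get_status(u) == 404]

# With a fake status lookup standing in for real requests:
fake = {"http://mysite.com/gone": 404, "http://mysite.com/": 200}
print(find_dead_targets(fake, get_status=fake.get))
# -> ['http://mysite.com/gone']
```

In practice you would feed in the "target URL" column of the backlink export and add a polite delay between requests.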
Analytics: As long as your error pages trigger the Google Analytics tracking code, you can get the data from here as well. This is most helpful when the page either triggers a custom variable or uses a virtual URL (/404/requestedurl.html, for instance). Isolate those pages and look at where the traffic came from.
-
It will scan and list all results for you: 301 redirects, 200s, 404 errors, 403 errors. However, Screaming Frog can only spider up to 500 URLs in its free version.
If you have more, I suggest going with Xenu Link Sleuth. Download it, get your site crawled, and get all pages, including those returning a 404 server error, with no page limit.
-
Thanks, but this would be scanning the pages in my site. How will I find 404 pages that are indexed in Google?
-
Hey there
Screaming Frog is a great (and free!) tool that lets you do this. You can download it here
Simply insert your URL and it will spider all of the URLs it can find for your site. It will then serve up a ton of information about each page, including whether it is a 200, 404, 301 and so on. You can even export this information into Excel for easy filtering.
Hope this helps.