Does it matter? 404 vs. 302 > Page Not Found
-
Hey Mozers,
What are your thoughts on this situation I'm stuck in? All input is welcome.
I am in the middle of a massive domain migration to a new server, and we are also moving to a clean, SEO-friendly URL structure. While parsing and cleaning up some old URLs, I stumbled upon a strange situation on my website.
I have a bunch of "dead pages" that are 302'd to a "page not found" page, probably an old mistake by one of the past developers. (To clarify: the HTTP status code is not 404.)
Should I fight to get all these dead pages returning a 404 status code, or could I just leave the temporary 302 redirect to "page not found" (even though I know for a fact these pages are never coming back)?
-
Yeah, that's a mess. Either 301 to a live page or let the 404 be delivered on its own.
-
Looks like my system is set up to 301 to a 404. Which do you think Google will honor, the 301 or the 404?
Also, wouldn't a 301 to a 404 keep bots crawling our server, since it's a redirect instead of the server delivering a 404 or 410 directly?
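To see why the redirect chain matters for crawl budget, here is a minimal sketch simulating how a crawler resolves the two setups. The URL map is hypothetical, just mirroring the 301 > 404 situation described above: the chained version costs an extra request per URL before the bot learns the page is dead.

```python
def resolve(url, responses, max_hops=5):
    """Follow redirects in a (status, location) map; return (final_status, hops)."""
    hops = 0
    while hops < max_hops:
        status, location = responses[url]
        if status in (301, 302) and location:
            url = location  # crawler must issue another request
            hops += 1
            continue
        return status, hops
    return None, hops  # bailed out: redirect loop or chain too long

# Hypothetical server behavior, not actual URLs from the thread
responses = {
    "/old-dead-page": (301, "/page-not-found"),  # redirect chain: two requests
    "/page-not-found": (404, None),
    "/gone-directly": (410, None),               # answered in a single request
}

print(resolve("/old-dead-page", responses))  # (404, 1)
print(resolve("/gone-directly", responses))  # (410, 0)
```

Either way the bot eventually sees a 404, but serving 404/410 directly settles it in one request, and a 410 is the strongest "stop re-crawling this" signal.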
-
Depending upon how many you have (you said "a bunch"), there are multiple ways of fixing this.
If you are using a CMS, you can install a universal 404 handler, which will direct a 404 request either to the top-level menu item of that section or to the home page. If you're using plain ASP or HTML, I'm sure a decent developer could code one up for you.
If the dead pages are ranked (indexed), I would make sure they forward to the corresponding new pages on your new server.
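A "universal 404 handler" like the one mentioned above can be sketched with Python's stdlib server: any unknown path gets a real HTTP 404 status plus a friendly HTML body, rather than a redirect to an error page. The paths and page content here are illustrative, not from the thread.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request
import urllib.error

KNOWN_PAGES = {"/": b"<h1>Home</h1>"}  # stand-in for the site's real routes

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PAGES:
            body, code = KNOWN_PAGES[self.path], 200
        else:
            # Real 404 status with a helpful body -- not a 302 to an error page
            body, code = b"<h1>Page not found</h1><a href='/'>Back to home</a>", 404
        self.send_response(code)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/dead-page")
    status = 200
except urllib.error.HTTPError as e:
    status = e.code  # what a bot would see for the dead URL
print(status)  # 404
server.shutdown()
```

The key point is that the error page's content and its HTTP status are independent: you can show visitors a helpful page while still telling bots 404.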
-
It sounds like they're 302'ing to a page that returns a 200 but says "page not found"... In that case, you should make them legitimate 404s or 410s. A custom 404 page would be a nice touch as well, to help visitors back into the navigation.
When you migrate (via 301), you can point these to the most applicable section of your new site, where they should have pointed on the old site, and be pretty much done with your clean-up.
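The migration clean-up described above boils down to a simple decision per legacy URL: either 301 it to its best-matching page on the new site, or return 410 if it's truly gone. A hedged sketch, with a hypothetical redirect map:

```python
# Hypothetical mapping from old URLs to their new-site equivalents
REDIRECT_MAP = {
    "/old/widgets.html": "https://newsite.example/products/widgets/",
    "/old/about-us.asp": "https://newsite.example/about/",
}

def migration_response(old_path):
    """Return (status_code, location_or_None) for a legacy URL."""
    new_url = REDIRECT_MAP.get(old_path)
    if new_url:
        return 301, new_url  # permanent redirect: passes link equity
    return 410, None         # gone: tells bots to stop re-crawling

print(migration_response("/old/widgets.html"))
print(migration_response("/old/retired-page.html"))  # (410, None)
```

Keeping the mapping in one table also makes it easy to audit which indexed pages forward somewhere useful and which are intentionally dead.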