404 errors
-
Hey everyone. Appreciate your insight on this. I just finished redesigning my website today and published it to my server. I decided to go with a really basic HTML site, figuring I may get better results with the search engines. I still have a bunch of optimizing to do, but I have a question.
Since I was using aspx, it is safe to say that many sites will be linked to those old pages. In the interest of not losing this traffic, I told IIS7 to do a 302 redirect to my home page for any 404 errors. Is this the best thing to do, or is there a better way?
Thanks much
Ron
-
I agree with Ryan. You need to set up 301 redirects in IIS for all the old pages.
Never use a 302.
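For reference, a per-page 301 in IIS7 can be expressed in web.config with the URL Rewrite module (this is a sketch: it assumes the URL Rewrite module is installed, and the old/new paths shown are placeholders, not your actual pages):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Permanently redirect one old .aspx URL to its new HTML page -->
        <rule name="301 old product page" stopProcessing="true">
          <match url="^old-product\.aspx$" />
          <action type="Redirect" url="/old-product.html" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

`redirectType="Permanent"` is what makes it a 301; the default for `type="Redirect"` is what you want to avoid here. You'd add one rule per old page (or a pattern-based rule if the old and new URLs map predictably).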
-
You want to go through each page of your old site and, if it had any links or authority at all, create a 301 redirect for it. That tells the engines your page has permanently moved, and tells them where to find the new page. This will preserve the majority of your authority for every page.
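The per-page mapping is just old URL to new URL; the logic those 301s implement can be sketched like this (the paths are hypothetical, not the poster's actual site):

```python
# Hypothetical old-to-new URL map; each entry becomes one 301 rule.
REDIRECT_MAP = {
    "/products.aspx": "/products.html",
    "/about.aspx": "/about.html",
    "/contact.aspx": "/contact.html",
}

def respond(path):
    """Return the (status, location) the server should send for a requested path."""
    if path in REDIRECT_MAP:
        # Permanent move: engines follow it and transfer the old page's authority.
        return 301, REDIRECT_MAP[path]
    # Genuinely gone: serve the custom 404 page rather than redirecting home.
    return 404, None
```

The point of the sketch is the fall-through: only URLs with a known new home get a 301, and everything else is an honest 404.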
If you 302 them all to the homepage you're telling the search engine "this content is related and is a substitute, for now". However, you want -all- your pages to continue being indexed with as much authority as possible, not just the homepage.
After you've got your 301s set up correctly, then you can focus on creating a 404 page that is helpful to your users. Usually this means suggesting content based on the keywords in the URL, or including a search box or a link to the homepage.
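To illustrate the "suggest content from the URL's keywords" idea, here is a minimal sketch; the page paths and titles are made up for the example:

```python
import re

# Hypothetical site pages to rank suggestions against.
PAGES = {
    "/wind-chimes.html": "Wind Chimes",
    "/garden-decor.html": "Garden Decor",
}

def _words(text):
    """Lowercase alphabetic tokens from a path or title."""
    return set(re.findall(r"[a-z]+", text.lower()))

def suggest(requested_path):
    """Rank known pages by keyword overlap with the 404'd URL, best first."""
    wanted = _words(requested_path)
    scored = [
        (len(wanted & _words(path + " " + title)), path)
        for path, title in PAGES.items()
    ]
    return [path for score, path in sorted(scored, reverse=True) if score > 0]
```

A real 404 handler would render these as links; here `suggest("/old/wind-chime-page.aspx")` matches the wind-chimes page on the shared "wind" keyword, and an unrelated URL yields no suggestions.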