My 404 page is showing a 4xx error. How can that be fixed?
-
My actual 404 page is giving a 4xx error.
The page address is http://www.ecowindchimes.com/v/404.asp. It loads fine... it is the page all 404s are directed to. Why is it showing a 404 error? The page works.
How can this be fixed?
Stephen
-
I think what you're seeing here is intentional behaviour, Stephen. It's Volusion's hack for working around the fact their system doesn't handle 404's "correctly".
Bottom line, when you see these, you still need to fix the issue with whatever URL was being sent to the 404 page, but don't worry that the 404 page itself seems to be "not found" according to its status code.
Here's an explanation for "why" this is happening, if you're interested:
Normally, when a user requests a URL that doesn't exist, the server sends back a 404 status code. The server's configuration also tells it to display its error page directly at that same address, so the visitor sees the error content and the search engines see the 404 status together.
For a number of reasons, Volusion can't do this, so instead, they've instituted a catch-all redirect so visitors to non-existent pages get a 301-redirect to a regular website page that has been faked to look like a 404 page. Because that 404-looking page has been found and shown, it would normally have a 200 status, which means page found OK.
A little unorthodox, but OK so far as far as the user is concerned.
BUT! When a user hits a "page not found", the search engines want an actual 404 status code back so they know not to index that non-existent URL. See the problem?
If the search engine gets a 200 response, it will assume that is the real page the visitor was trying to reach and will index the non-existent URL with the 404-ish looking content. Bad. So even though you - the user - can see the error page (200), Volusion has to give it a fake 404 status to give the search engines the correct information.
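The "correct" behaviour described above is easy to sketch: serve the friendly error content at the URL the visitor actually requested, while still sending a real 404 status, with no redirect involved. Here's a minimal Python illustration (the paths and HTML are invented for the example; this is not how Volusion's platform works internally):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

KNOWN_PATHS = {"/", "/products"}  # hypothetical "real" pages on the site
ERROR_PAGE = b"<h1>Sorry, that page doesn't exist.</h1>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in KNOWN_PATHS:
            status, body = 200, b"<h1>Welcome</h1>"  # page found OK
        else:
            # A real 404 status *and* friendly error content, at the same
            # URL: no redirect, so search engines know not to index it.
            status, body = 404, ERROR_PAGE
        self.send_response(status)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port=8000):
    """Run the demo server on a hypothetical port."""
    HTTPServer(("localhost", port), Handler).serve_forever()
```

The key point is that the visitor and the search engine are told the same story: error content, 404 status, original URL.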
For a demonstration, go to this non-existent page: http://www.cochranemusic.ca/oops. You can see in your browser's URL bar that the page address is still http://www.cochranemusic.ca/oops even though the page itself shows the server error page content.
Now go to http://www.ecowindchimes.com/oops and notice that the URL in the address bar actually changes, because you've been forwarded to a page on your site called 404.asp. That's a real page on your website you're seeing that's been made to "look" like a server error page. Even though you've been redirected to a real (200) page, the server has to pretend it's a 404 status to mimic the correct behaviour.
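If you want to verify this yourself, look at the first response the server sends rather than what the browser ends up displaying. Here's a small Python helper (my own sketch, not a Volusion tool) that reports the initial status code and any Location header without following redirects:

```python
import urllib.error
import urllib.request

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None tells urllib not to follow the redirect, so the
    # 3xx response surfaces as an HTTPError we can inspect directly.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def first_response(url):
    """Return (status, location) for the server's initial answer to `url`."""
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(url)
        return resp.getcode(), None               # e.g. a plain 200
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")  # 404, 301, etc.
```

Based on the behaviour described above, you'd expect the first site's /oops URL to come back as a straight 404, and the second site's to come back as a 301 whose Location points at the 404.asp page.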
Whew - that was confusing to try to explain, so let me know if it's still not clear.
Paul
P.S. To server admins: I know I've oversimplified the difference between a server's own 404 error page and an actual website page made to look like a 404. I do know the difference, but for the sake of keeping this explanation as straightforward as possible, I've glossed over it.
-
I agree: you need to update your web.config file with the appropriate instructions.
-
I suspect the link path is incorrect in your web.config file.
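For anyone on a host where web.config is actually editable (Volusion's hosted platform may not allow this), the cleaner fix on IIS 7+ is to have the server execute the error page at the requested URL with a genuine 404 status, rather than redirecting. A sketch, assuming the error page lives at /v/404.asp:

```xml
<configuration>
  <system.webServer>
    <httpErrors errorMode="Custom" existingResponse="Replace">
      <remove statusCode="404" subStatusCode="-1" />
      <!-- ExecuteURL serves the page server-side: the visitor's URL
           doesn't change and the response keeps its real 404 status. -->
      <error statusCode="404" path="/v/404.asp" responseMode="ExecuteURL" />
    </httpErrors>
  </system.webServer>
</configuration>
```

By contrast, a `responseMode="Redirect"` (or the older customErrors redirect behaviour) produces exactly the redirect-to-a-200-page pattern discussed above.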