Hundreds of thousands of 404s on expired listings - issue.
-
Hey guys,
We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days are throwing up 404s - hundreds of thousands, maybe millions (note that Webmaster Tools caps its report at 100,000).
Many of these listings receive links.
Classified listings that are less than 45 days old show other possible products to buy, based on an algorithm.
It is not possible for Google to crawl expired listing pages from within our site. They are indexed because they were crawled before they expired, which means that many of them still show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content - other product suggestions - and add a meta noindex in order to preserve our crawl equity and get the pages we really want indexed prioritised.
-> Another consideration is to 301 each expired listing to its place in the category hierarchy to pass any link juice. But since many of these listings are findable in Google, we feel that is not a great user experience.
-> Or, shall we just leave them as 404s? Google sort of says that's OK.
Very curious on your opinions, and how you would handle this.
Cheers,
Croozie.
P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
-
Wow! Thanks Ryan.
I'm sure it won't surprise you to know that I'm always reading eagerly when I see you respond to a question as well.
-
Thanks Ian, good to know. Again, good confirmation.
-
Hi Sha,
Spot on. Yes, that was my original thinking, then I switched to the school of 200s with meta noindexes. But having you guys confirm this makes me realise that doing 301s to the parent category is most certainly the way to go.
Permanently redirecting will have the added benefit of effectively 'de-indexing' the original classifieds and, of course, throwing a ton of link juice over to the category levels.
What a wonderful, helpful community!
Many thanks,
Croozie.
-
Sha, your responses consistently offer outstanding, actionable advice, provide great ideas, and demonstrate a lot of experience.
-
Hi Croozie,
Awesome work once again from Ryan!
Since your question feels like a request for suggestions on "how" to create a solution, just wanted to add the following.
When you say "classified listings" I hear "once-off, here for a while, gone in 45 days" content.
If that is the case, then no individual expired listing will ever be matched identically with another (unless it happens to be a complete duplicate of the original listing).
This would mean it would certainly be relevant to send any expired listing to a higher-order category page. If your site structure gives you a clear hierarchy, this is very easy to do.
For example:
If your listing URL were something like http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php, then you can use URL rewrites to strip out the file name and 301 the listing to http://www.mysite.com/listings/home/furniture/couches/, which in most cases will offer a perfectly suitable alternative for the user.
There is another alternative you could consider if you have a search function built in - you could send the traffic to a relevant search. In the above example, mysite.com/search.php?s=couch.
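To make the stripping logic concrete, here is a minimal Python sketch (the URLs follow the hypothetical couch example above, and the function name is my own); in production this would normally live in your server's rewrite rules rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

def category_redirect_target(listing_url: str) -> str:
    """Strip the listing's file name so a 301 can point at its parent category.

    e.g. http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php
      -> http://www.mysite.com/listings/home/furniture/couches/
    """
    scheme, netloc, path, _query, _fragment = urlsplit(listing_url)
    # Drop the final path segment (the listing page) and keep the trailing slash.
    parent = path.rsplit("/", 1)[0] + "/"
    return urlunsplit((scheme, netloc, parent, "", ""))
```

The same transformation is what a single RewriteRule would express in Apache or nginx.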
Hope that helps,
Sha
-
We are now doing something similar with our site. We have several thousand products that have been discontinued and didn't think about how much link juice we were throwing away until we got Panda pounded. It's amazing how many things you find to fix when times get tough.
We started with our most popular discontinued products and are 301 redirecting them to either a new equivalent or the main category if no exact match can be found.
We are also going to be reusing the same product pages for annual products instead of creating new pages each year. Why waste all that link juice from past years?
-
If you perform a redirect, I recommend you offer a 301 header response, not a 200. The 301 response will let Google and others know the URL should be updated in their database. Google would then offer the new URL in search results. Additionally any link value can be properly forwarded to the new page.
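As a rough illustration of the difference (a sketch under my own assumptions, not anyone's actual implementation), a handler for expired listings following this advice would return a 301 with a Location header rather than a 200 page:

```python
def respond_to_expired_listing(path, redirect_map):
    """Return an (HTTP status, headers) pair for a request to an expired listing.

    A 301 plus a Location header tells Google and others to update the URL in
    their index and forward any link value; a 200 'suggestions' page does not.
    redirect_map is a hypothetical lookup from old paths to their targets.
    """
    target = redirect_map.get(path)
    if target is not None:
        return 301, {"Location": target}
    # No sensible target: a plain 404 is an acceptable fallback.
    return 404, {}
```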
-
Thanks Ryan,
Massive response! Awesome!
It's interesting that you talk a lot about the 301's.
Are you suggesting this would be far preferable to simply producing a 200 status code page listing product choices based on an algorithm - which is what we currently offer our customers for listings expired less than 45 days ago?
I suppose, to clarify, I'm worried that if we were to produce 200 status code pages, our crawl equity would be reduced - we would be wasting a lot of Google's bandwidth on 200 status pages when it could be better spent crawling and indexing more recent pages.
Whereas with 301's to relevant products as you suggest, we solve that issue.
BTW, our 404 pages offer the usual navigation and search options.
Cheers,
Croozie.
-
Hi Croozie.
The challenge with your site is the volume of pages. Most large sites with 100k+ pages have huge SEO opportunities. Ideally you need a team which can manually review every page of your site to ensure it is optimized correctly. Such a team would be a large expense which many site owners choose to avoid. The problem is your site quality and SEO are negatively impacted.
Whenever a page is removed from your site or otherwise becomes unavailable, a plan should be in place PRIOR to removing the page. The plan should address a simple question: how will we handle traffic to the page, whether it comes from a search engine, a bookmark, or a link? The suggested answer is the same whether your site has 10 pages or a million pages:
- If the product is being replaced with a very similar product, or you carry a very similar product, then you can choose to 301 the page to the new product. If the product is truly similar, the 301 redirect is a win for everyone.
Example A: You offer a Casio watch model X1000. You stop carrying this watch and replace it with Casio watch model X1001. It is the same watch design but the new model has a slight variation such as a larger dial. Most users who were interested in the old page would be interested in the new page.
Example B: You offered the 2011 version of the Miami Dolphins T-shirt. It is now 2012 and you have the 2012 version of the shirt which is a different design. You can use a 301 to direct users to the latest design. Some users may be unhappy and want the old design, but it is still probably the right call for most users.
Example C: You discontinue the Casio X1000 and do not have a very close replacement. You could 301 the page to the Casio category page, or you could let it 404.
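The decision logic in Examples A-C can be summarised in a short sketch (all names are hypothetical; this is one way to encode the advice, not a prescribed implementation):

```python
def plan_for_removed_page(close_replacement=None, category_page=None):
    """Pick a response for a removed product page, per Examples A-C above.

    - Examples A/B: a very similar or newer-model product exists -> 301 to it.
    - Example C: no close match -> 301 to the category page, or let it 404.
    Returns an (HTTP status, redirect target) pair; target is None for a 404.
    """
    if close_replacement is not None:
        return 301, close_replacement
    if category_page is not None:
        return 301, category_page
    # Letting the page 404 is a natural, legitimate outcome.
    return 404, None
```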
The best thing to do in each case is to put on your user hat and ask yourself what would be the most helpful thing you can do to assist a person seeking the old content. There is absolutely nothing wrong with allowing a page to 404. It is a natural part of the internet.
One last point. Be sure your 404 page is optimized, especially considering how many 404s you present. The page should have the normal site navigation along with a search function. Help users find the content they seek.