Site architecture change - 30,000+ 404s in GWT
-
So recently we decided to change the URL structure of our online e-commerce catalogue, to make it easier to maintain in the future.
But since the change we have (partially expected) 30,000+ 404s in GWT. When we made the change I set up 301 redirects based on our Apache server logs, but the number has just escalated.
Should I be concerned with "plugging" these 404s, either by removing them via the URL removal tool or by carrying on with 301 redirects? It's quite labour-intensive, and there are no incoming links to most of these URLs, so is there any point?
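For reference, this is roughly how I've been pulling the old URLs out of the access logs so far (a simplified sketch; the log path and format below are placeholders for our actual setup):

```python
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder: adjust to your server's log location

# Rough pattern for the Apache "common"/"combined" log formats:
# captures the requested path and the response status code
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1

# Most-requested missing URLs first, so the redirects that matter most get done first
for path, count in hits.most_common(50):
    print(f"{count:6d}  {path}")
```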
Thanks,
Ben
-
Hi Ben,
The answer to your question boils down to usability and link equity:
- Usability: Did the old URLs get lots of Direct and Referring traffic? E.g., do people have them bookmarked, type them directly into the address bar, or follow links from other sites? If so, there's an argument to be made for 301 redirecting the old URLs to their equivalent, new URLs. That makes for a much more seamless user experience, and increases the odds that visitors from these traffic sources will become customers, continue to be customers, etc.
- Link equity: When you look at a Top Pages report (in Google Webmaster Tools, Open Site Explorer, or Ahrefs), how many of those most-linked and/or best-ranking pages are old product URLs? If product URLs are showing up in these reports, they definitely require a 301 redirect to an equivalent new URL so that link equity isn't lost (see the sketch after this list for one way to generate those redirects in bulk).
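If a lot of product URLs do need redirecting, you don't have to write every rule by hand. Here's a minimal sketch (the file names are placeholders, and the Apache "Redirect 301" output is just one way to do it; adjust to your own server setup) that turns an old-to-new mapping file into redirect directives:

```python
import csv

MAPPING_FILE = "url_mapping.csv"   # placeholder: two columns per row, old_path,new_path
OUTPUT_FILE = "redirects.conf"     # placeholder: include this file from your vhost config

with open(MAPPING_FILE, newline="") as src, open(OUTPUT_FILE, "w") as out:
    for row in csv.reader(src):
        if len(row) < 2:
            continue  # skip blank or malformed rows
        old_path, new_path = row[0].strip(), row[1].strip()
        # mod_alias: "Redirect 301 /old-path /new-path" issues a permanent redirect
        out.write(f"Redirect 301 {old_path} {new_path}\n")
```

Spot-check a handful of the generated redirects by hand before rolling the whole file out.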
However, if, as is common with a large number of ecommerce sites, your old product URLs got virtually zero Direct or Referring traffic and had virtually zero deep links, then letting those URLs go 404 is just fine. I seem to remember a link churn report from the early days of LinkScape reporting that something on the order of 80% of the URLs they had discovered would be 404 within a year. URL churn is a part of the web.
If you decide not to 301 those old URLs, then you simply want to send engines a really consistent signal that the pages are gone and not coming back. JohnMu from Google recently suggested that there's a small difference in how Google treats 404 versus 410 response codes: 404s are often re-crawled (which leads to those 404 error reports in GWT), whereas a 410 is treated as a more "permanent" indicator that the URL is gone for good, so 410s are removed from the index a little faster. Read more: http://www.seroundtable.com/google-content-removal-16851.html
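If you want to see the difference in practice, here's a toy sketch of serving 410 for retired URLs using only Python's standard library (the paths and port are made up). On a real Apache setup you'd normally do this in the server config instead, e.g. with "Redirect gone /old-path" from mod_alias or a RewriteRule using the [G] flag:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of retired catalogue paths; in practice you'd load these
# from the same old-URL list used for the redirect mapping.
GONE_PATHS = {
    "/catalogue/old-product-123",
    "/catalogue/old-product-456",
}

class CatalogueHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in GONE_PATHS:
            # 410 Gone: a stronger "not coming back" signal than 404
            self.send_response(410)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"This product has been permanently removed.")
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"OK")

if __name__ == "__main__":
    HTTPServer(("", 8000), CatalogueHandler).serve_forever()
```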
Hope that helps!
-
Hi,
Are you sure these old URLs are not being linked from somewhere (probably internally)? Maybe the sitemap.xml was forgotten and is still pointing to all the old URLs? For 404s to show up in GWT there generally needs to be a link to them from somewhere, so in the first instance go to the 404 report in GWT and look at where each one is linked from (you can do this with Moz reports as well). If an internal page such as a sitemap, or some forgotten menu/footer feature, is still linking to the old pages, then you certainly want to clear this up. Once you have fixed the internal linking issues you should have a significantly reduced list of 404s, and you can then deal with the remainder on a more case-by-case basis (assuming they are being triggered by external links).
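A quick way to check the sitemap angle is to crawl the sitemap yourself and flag anything that now returns 404 or 410. A rough sketch, using only the standard library (the sitemap URL is a placeholder, and it only handles a single, non-indexed sitemap file):

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your sitemap location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def check_sitemap(sitemap_url):
    """Print any URL listed in the sitemap that now returns 404/410."""
    with urllib.request.urlopen(sitemap_url) as resp:
        tree = ET.parse(resp)
    for loc in tree.iterfind(".//sm:loc", NS):
        url = loc.text.strip()
        try:
            req = urllib.request.Request(url, method="HEAD")
            status = urllib.request.urlopen(req).status
        except urllib.error.HTTPError as err:
            status = err.code
        if status in (404, 410):
            print(f"{status}  {url}  <-- still in the sitemap but gone")

if __name__ == "__main__":
    check_sitemap(SITEMAP_URL)
```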
Hope that helps!
Related Questions
-
Ecommerce sites we own have similar products, is this OK?
Hello, In one of our niches we have a big site with all products and a couple more sites that are smaller niches of the same niche. The product descriptions are different, with different product names. Is this OK? We've got one big site and 2 smaller subsites in different niches that cross over with the big site. Let me know if Google is OK with this. We will have a separate blog for each with completely different content. There aren't really duplicate content issues, and although only the big site has a blog right now, the small ones will eventually have their own unique blogs. Is this OK in Google's eyes now and in the future? What can we do to ensure we are OK? Thank you.
White Hat / Black Hat SEO | | BobGW1 -
I've purchased a PR 6 domain. What will be the best use of it?
I've purchased a PR 6 domain. What will be the best use of it? Should I make a new site, or redirect it to my low-PR sites? Or have I wasted my $100?
White Hat / Black Hat SEO | | IndiaFPS0 -
GWT 404 best practices
I'm getting back lots of 404 errors for old websites that are linking back to my current website. If the linking site's content/anchor text has no relevancy to my current content, is it still best practice to redirect to the current home page, contact the webmaster to remove the link, or is there a better option? I'm not exactly sure whether redirecting looks spammy since it's irrelevant content. Thanks for your help!
White Hat / Black Hat SEO | | IceIcebaby0 -
Same template site same products but different content?
For the sake of this post, I am selling lighters. I have 3 domains: small-lighters.com, medium-lighter.com, large-lighters.com. On all of the websites I have the same template, same images, etc., and the same products. The only difference is the way the content is worded and described, with different bullet points. My domains are all strong keyword domains, not spammy, and they bring in type-in traffic. Is it OK to continue in this manner, in your opinion?
White Hat / Black Hat SEO | | dynamic080 -
Opinions Wanted: Links Can Get Your Site Penalized?
I'm sure by now a lot of you have had a chance to read the Let's Kill the "Bad Inbound Links Can Get Your Site Penalized" Myth article over at SearchEngineJournal. When I initially read this article, I was happy. It was confirming something that I believed, and supporting a stance that SEOmoz has taken time and time again. The idea that bad links can only hurt via loss of link juice when they get devalued, but not from any sort of penalization, is indeed found in many articles across SEOmoz. Then I perused the comments section, and I was shocked and unsettled to see some industry names that I recognized taking the opposite side of the issue. There seem to be a few different opinions: The SEOmoz opinion that bad links can't hurt except for when they get devalued. The idea that you wouldn't be penalized algorithmically, but a manual penalty is within the realm of possibility. The idea that both manual and algorithmic penalties were a factor. Now, I know that SEOmoz preaches a link building strategy that targets high quality backlinks, so if you completely subscribe to the Moz method, you've got nothing to worry about. I don't want to hear those answers here - they're right, but they're missing the point. It would still be prudent to have a correct stance on this issue, and I'm wondering if we have that. What do you guys think? Does anybody have an opinion one way or the other? Does anyone have evidence of it being one way or another? Can we set up some kind of test, rank a keyword for an arbitrary term, and go to town blasting low quality links at it as a proof of concept? I'm curious to hear your responses.
White Hat / Black Hat SEO | | AnthonyMangia0 -
From page 3 to page 75 on Google. Is my site really so bad?
So, a couple of weeks ago I started my first CPA website, just as an experiment and to see how well I could do out of it. My rankings were getting better every day, and I’ve been producing constant unique content for the site to improve my rankings even more. 2 days ago my rankings went straight to the last page of Google for the keyword “acne scar treatment” but Google has not banned me or given my domain a minus penalty. I’m still ranking number 1 for my domain, and they have not dropped the PR as my keyword is still in the main index. I’m not even sure what has happened? Am I not allowed to have a CPA website in the search results? The best information I could find on this is: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=76465 But I’ve been adding new pages with unique content. My site is www.acne-scar-treatment.co Any advice would be appreciated.
White Hat / Black Hat SEO | | tommythecat1 -
Hi, I found that one of my competitors has zero backlinks in Google, zero in Yahoo, but about 50,000 in Bing. How is that possible?
Hi, I found that one of my competitors has zero backlinks in Google, zero in Yahoo, but about 50,000 in Bing. How is that possible? I assumed that all search engines would find the backlinks. Besides that, he ranks fairly well, and better than I do, with only a single site and only one article of content, while I have a lot of content and sites. I do not understand why he is ranking better in Google, while Google supposedly does not see any of the 50,000 backlinks Bing is finding. Thx, Dan
White Hat / Black Hat SEO | | docschmitti0 -
Somebody hacked many sites and put links to my sites in hidden div
I had 300 good natural links to my site from different sites, and the site ranked great for my keywords. Somebody (I suspect my competitor) hacked other sites 2 days ago (I checked the Google cache), and now Yahoo Site Explorer shows 600 backlinks. I've checked the new links - they are all in the same hidden div block - top:-100px; position:absolute;. I'm afraid that Google may penalize my site for these links. I'm contacting the webmasters of these sites and their hosts so they remove the links. Is it possible to give Google notice that these links are not mine so it can just skip them and not penalize me? Is it safe to make a "Spam report" regarding links to my own site?
White Hat / Black Hat SEO | | zarades0