Site architecture change: 30,000+ 404s in GWT
-
We recently decided to change the URL structure of our online e-commerce catalogue to make it easier to maintain in the future.
Since the change, though, we have a (partially expected) 30,000+ 404s in GWT. When we made the change I set up 301 redirects based on our Apache server logs, but the count has just kept escalating.
Should I be concerned about "plugging" these 404s, either by removing them via the URL removal tool or by carrying on with 301 redirects? It's quite labour-intensive, and there are no incoming links to most of these URLs, so is there any point?
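For what it's worth, pulling the worst offenders out of the server logs doesn't have to be done by hand. A rough sketch of prioritising the most-requested dead URLs, assuming Apache's combined log format (the regex and function name are just illustrative):

```python
import re
from collections import Counter

# Matches the request path and the status code in an Apache combined-log line.
LOG_PATTERN = re.compile(r'"(?:GET|POST|HEAD) (\S+) \S+" (\d{3})')

def top_404s(log_lines, limit=10):
    """Count 404 responses per URL so the most-requested dead
    URLs can be 301-redirected first."""
    counts = Counter()
    for line in log_lines:
        match = LOG_PATTERN.search(line)
        if match and match.group(2) == "404":
            counts[match.group(1)] += 1
    return counts.most_common(limit)
```

Redirecting the top few dozen URLs from a list like this usually covers the bulk of the real traffic, rather than chasing all 30,000.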
Thanks,
Ben
-
Hi Ben,
The answer to your question boils down to usability and link equity:
- Usability: Did the old URLs get lots of Direct and Referring traffic? E.g., do people have them bookmarked, type them directly into the address bar, or follow links from other sites? If so, there's an argument to be made for 301 redirecting the old URLs to their equivalent, new URLs. That makes for a much more seamless user experience, and increases the odds that visitors from these traffic sources will become customers, continue to be customers, etc.
- Link equity: When you look at a Top Pages report (in Google Webmaster Tools, Open Site Explorer, or Ahrefs), how many of those most-linked and/or best-ranking pages are old product URLs? If product URLs are showing up in these reports, they definitely need a 301 redirect to an equivalent new URL so that link equity isn't lost.
However, if, as is common with a large number of ecommerce sites, your old product URLs got virtually zero Direct or Referring traffic and had virtually zero deep links, then letting the URLs go 404 is just fine. I seem to remember a link churn report from the early days of Linkscape that found something on the order of 80% of the URLs they had discovered would return a 404 within a year. URL churn is a part of the web.
If you decide not to 301 those old URLs, then you simply want to serve a consistent signal to engines that they're gone and not coming back. John Mueller (JohnMu) from Google recently suggested that there's a small difference in how Google treats 404 versus 410 response codes: 404s are often re-crawled (which leads to those 404 error reports in GWT), whereas a 410 is treated as a more "permanent" indicator that the URL is gone for good, so 410s are removed from the index a little faster. Read more: http://www.seroundtable.com/google-content-removal-16851.html
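That decision tree (301 where there's an equivalent, 410 for deliberately retired URLs, 404 for everything else) is simple to express in code. A minimal sketch, with made-up example URLs; the real mappings would come from your catalogue migration:

```python
# Hypothetical mappings: old URLs with a new equivalent get a 301;
# URLs retired for good get an explicit 410 rather than a default 404.
REDIRECTS = {"/catalogue/old-widget": "/products/widget"}
GONE = {"/catalogue/discontinued-widget"}

def status_for(path):
    """Return the (status code, redirect target) pair an old
    catalogue URL should serve."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in GONE:
        return 410, None
    return 404, None
```

In practice you'd wire the same lookup into your server config or application routing, but the point is the consistency: every old URL falls into exactly one of the three buckets.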
Hope that helps!
-
Hi,
Are you sure these old URLs are not being linked from somewhere (probably internally)? Maybe the sitemap.xml was forgotten and still points to all the old URLs? For 404s to show up in GWT there generally needs to be a link to them from somewhere, so as a first step, go to the 404s in GWT and look at where they are linked from (you can do this with Moz reports as well). If an internal page, such as a sitemap or some forgotten menu/footer feature, is still linking to old pages, then you certainly want to clear that up! Once you have fixed any internal linking issues, you should have a significantly reduced list of 404s and can then deal with the remainder on a more case-by-case basis (assuming they are being triggered by external links).
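Checking the sitemap for stale entries is easy to script. A sketch using the standard sitemaps.org XML namespace; the example URLs are placeholders, and `live_urls` would be whatever set of current URLs you can export from your catalogue:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def stale_sitemap_urls(sitemap_xml, live_urls):
    """Return the sitemap <loc> entries that no longer exist on the site,
    i.e. old URLs the sitemap is still advertising to crawlers."""
    root = ET.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(SITEMAP_NS + "loc")]
    return [url for url in locs if url not in live_urls]
```

Any URLs this turns up should be removed from the sitemap (or swapped for their new equivalents) before worrying about the externally-linked remainder.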
Hope that helps!