Fixing 404s
-
One of our sites is littered with 404s and I have the lucky task of clearing them up. I'm using a mixture of 410s and 301s. Party central, I know.
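For context, the split looks roughly like this - a minimal Flask-style sketch with made-up URLs rather than our actual ones, just to show the approach:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical examples: old URLs with a clear replacement get a 301
REDIRECTS = {
    "/old-guide": "/guides/new-guide",
    "/2019-report": "/reports/latest",
}

# Hypothetical examples: old URLs with no replacement get a 410 (gone)
GONE = {"/discontinued-page", "/retired-tool"}


@app.route("/<path:path>")
def legacy_urls(path):
    url = "/" + path
    if url in REDIRECTS:
        # Permanent redirect so search engines pass signals to the new URL
        return redirect(REDIRECTS[url], code=301)
    if url in GONE:
        # 410 tells crawlers the page is intentionally gone, not just missing
        return "This page has been removed.", 410
    # Anything genuinely unknown falls through to a normal 404
    return "Not found", 404
```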
I wondered how people measure the success of clearing these 404s up on their site?
Whether that's CTR or rankings... do you look at site benefits beyond your Search Console errors coming down?
Basically I'm trying to understand how much this is worth doing and how to measure its progress in real terms.
Thanks!
-
I'm getting into the habit of fixing them where I can - checking whether the page is linked from anywhere, then fixing or redirecting it if I can.
Then I'm marking them as fixed in Search Console - for any pages I'm particularly worried about, I request a recrawl via 'Fetch as Google' in the hope that Google will recognise them as gone.
Reading Google's documentation, it says that 404s won't necessarily harm your site and 'in most cases' should be left to 404. So basically I'm trying to use the tools available to us to see what errors are there and fix the ones that are worth fixing.
I know what you mean about having loads of frustrating 404s - as an SEO trying to find things to fix, it's an absolute nightmare. But try marking them as fixed and recrawling them if necessary?
Perhaps also - since your site is ecommerce - run a canonical campaign whereby you canonicalise to the most up-to-date product, and maybe create a custom 404 page along the lines of "this is a 404, but here are some similar products you can browse: <a>red shirt</a>, <a>blue shirt</a>, etc." Hope that helps?
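The custom 404 page can be as simple as something like this - a Flask sketch with placeholder products, just to show the shape; you'd swap in whatever your catalogue actually returns:

```python
from flask import Flask

app = Flask(__name__)

# Placeholder suggestions - in practice pull these from your product catalogue
SUGGESTED_PRODUCTS = [
    ("Red shirt", "/products/red-shirt"),
    ("Blue shirt", "/products/blue-shirt"),
]


@app.errorhandler(404)
def not_found(error):
    # Keep the 404 status code, but give the visitor somewhere useful to go
    links = "".join(
        f'<li><a href="{href}">{name}</a></li>' for name, href in SUGGESTED_PRODUCTS
    )
    html = (
        "<h1>Sorry, that page doesn't exist</h1>"
        f"<p>Maybe one of these helps:</p><ul>{links}</ul>"
    )
    return html, 404
```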
-
Thanks GR,
It's tricky since our pages are information-based - we're not providing a form or selling a product. And there are thousands of these historic 404s to work through!
But sounds like it is worth working our way through, thank you!
-
Hi Fubra!
First of all, you should consider the intent of those 404 pages. What were their goals? Conversion? Completion of a form?
Then, as you said some are being 301 redirected, analyze whether those redirects are being picked up well by Google, whether rankings for those URLs are improving, and/or whether they have started appearing in the searches that used to lead to a 404.
Secondly, in my opinion, the two most important metrics to always consider are:
- Rankings: if you do not improve rankings or, in the case of a high-traffic and very competitive search, do not at least maintain the ranking, then you might be doing something wrong.
- Conversions: the main goal is always to convert more and/or more efficiently. Of course this means that you MUST understand what a conversion is for every site and for every part of the site. Then, set up the correct triggers in your analytics tool and measure the improvement (see the sketch after this list for one way to compare before and after).
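For the "measure the improvement" part, one rough approach is to compare conversion rate from exports taken before and after the cleanup. This sketch assumes CSV exports with 'sessions' and 'conversions' columns - those names and file names are placeholders for whatever your analytics tool actually gives you:

```python
import csv

def conversion_rate(path):
    """Sum sessions and conversions from an analytics CSV export and return the rate."""
    sessions = conversions = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            sessions += int(row["sessions"])
            conversions += int(row["conversions"])
    return conversions / sessions if sessions else 0.0

before = conversion_rate("organic_before_cleanup.csv")
after = conversion_rate("organic_after_cleanup.csv")
print(f"Conversion rate before: {before:.2%}, after: {after:.2%}")
```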
And last but not least, Google doesn't like 404s, unless that is what the user is meant to get, so it's an always-on task in any SEO project.
Hope it helps.
Best Luck.
GR