301s - A Year Late
-
A website I was recently asked to help with was redesigned last year, but no 301s were set up. Looking at the old URLs, 95% of the ones from early 2013 are 404s. Their traffic dropped from 50,000 visits per month to 10,000, and I believe this is one of the reasons.
Now the question is: a year later, will it do any good to set up 301 redirects from those old URLs? My current thought is that the old URLs have probably lost any link juice they had, but it shouldn't hurt anything to set up the 301s anyway.
Any thoughts on whether this is worth my time and effort?
-
Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links that are out there to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking purposes. Getting the redirects in place will allow that ranking influence to be credited to the client's new pages again.
When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around on real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you can always put stats changes into context (e.g. a year from now, when the client is trying to figure out why the site improved around this time).
[Hint - make sure you've got solid 404 page tracking in Analytics and keep checking it as you go along. It's an essential complement to watching what shows up in Webmaster Tools, for example.]
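For reference, here's what the redirects themselves might look like in Apache's .htaccess using mod_alias - the paths and domain below are hypothetical placeholders, not the client's actual URLs:

```apache
# One rule per retired URL: old path on the left, full new URL on the right.
# mod_alias's Redirect sends a 301 (permanent) status plus a Location header.
Redirect 301 /old-about-us.html https://www.example.com/about/
Redirect 301 /products/widget.php https://www.example.com/shop/widgets/
```

After deploying, spot-check a few with `curl -I` and confirm the response status is 301 with the expected Location header.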
Some more suggestions for the process:
- Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
- Use Screaming Frog, Xenu Link Sleuth or equivalent tool to run a check of all internal pages to ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
- Watch for any high-value incoming links to old pages that you think you might be able to get corrected at source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice," you're even better off getting the original link corrected to point to the right page instead of having it go through the redirect. This is only worth the effort for strong links.
- Watch for opportunities to use regex to combine several redirects into one rule. Fewer rules means better site speed.
- If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration.
- To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to find all the old URLs. (Tip: set your Google search results to show 100 results per page to make the scraping faster.)
- Set up an ongoing and regular process for checking for and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
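To illustrate the regex suggestion above: when a whole directory moved in a predictable way, one RedirectMatch rule in .htaccess can cover every URL under it. The patterns and paths here are invented for the example:

```apache
# Capture everything after the old directory prefix and reuse it in the
# destination, so a single rule handles the entire section.
RedirectMatch 301 ^/blog/2013/(.*)$ https://www.example.com/articles/$1

# Old .php product pages folded into a new /shop/ section.
RedirectMatch 301 ^/store/(.*)\.php$ https://www.example.com/shop/$1/
```

Test these carefully after implementation - an over-greedy pattern can silently redirect pages you didn't intend to touch.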
Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page, designed to capture as many visitors as possible and move them to real content without losing them. Again, this is important for any site, but well worth extra attention for a site that knows it has a 404 problem. (This is far better than "soft 404ing" to a home page, for example, for a number of technical and usability reasons.)
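On an Apache server, the custom 404 page is wired up with the ErrorDocument directive. One detail worth knowing: point it at a local path, not a full URL, so the response keeps its real 404 status. The filename is a placeholder:

```apache
# A local path preserves the 404 status code. A full http:// URL here
# would make Apache issue a 302 redirect instead - a "soft 404" that
# hides the error from search engines.
ErrorDocument 404 /404.html
```

You can verify with `curl -I` on a made-up URL that the helpful page still returns a 404 status rather than a 200 or 302.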
So, bottom line on "whether this is worth my time and effort": you'd better believe it is. It's probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying, both for you and the site owner.
Hope those are useful ideas!
Paul
-
Hiya! They may have lost link juice, but then again there may be a blog giving you praise with a link that's still active. It's never too late to set up a 301; just remember it's best to 301 to the most relevant category or closest page. You can also set up a custom 404 page so that even if you miss a URL, the user can still navigate to a page like the home page.
Moz has some great tips if you want a read or to refresh your mind.
-
Yes, it's better late than never. You might not recover any rankings, but I consider 301s to be good policy beyond the SEO aspect. I hate clicking a link and getting a 404 or being bounced to the front page. Perhaps I have a bookmark; perhaps it's an old link. Whatever the case, do your visitors a courtesy and redirect them to the correct page.
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of accessing via HTTP/2 working
- 301 redirects set up for non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go only through HTTP/1.1, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed, etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC1
-
Do 301s still work after hosting is discontinued?
I am in the process of phasing out a website that has been acquired by another company. Its web pages are being 301 redirected to their counterparts on the website of the company that has acquired them. How long should I maintain the hosting of the phased out website? Technically, do 301s still work after the hosting has been discontinued? Thanks, Caro
Technical SEO | Caro-O0
-
301 redirect to WWW on a 2 year old website with good SERPs and organic traffic?
Hi everyone, Recently someone pointed out that my website can be accessed in both ways, i.e. by typing www.example.com or example.com. He further added that Google might identify this as duplicate content and penalize my website. So now I'm thinking about 301 redirection from non-WWW to WWW using the htaccess method. But my website is 2 years old now and I'm getting some decent traffic from Google. Will this redirection have an adverse effect on my rankings? Is there any other way to resolve this issue? I don't want to lose my current rankings or organic traffic. Any help would be very much appreciated. P.S. Currently Google indexes my website pages with WWW.
Technical SEO | nicksharma040
-
XCart Directory 301s Not Working
I'm working with someone to make fixes to an xcart site, but I'm at a loss for some fixes. Some directory URLs had been changed around on their ecommerce site to make them more descriptive and more human-friendly. The problem is that, according to the team's coder, simple redirects won't work for the directories, and mod_rewrite and RedirectMatch didn't work for some unknown reason. I don't really know anything about xcart. I've made some basic changes and redirects before through their admin panel, but I don't have any clue as to how to 301 directories properly. Any insights? Thanks!
Technical SEO | MikeRoberts0
-
Question on 301s
Hi Everyone, I have a question on 301 redirects; I hope someone can give me some help on this. There were some 301 redirects made on some of the URLs at the beginning of the year; however, we are now restructuring the whole website, which means the URLs which had been given a 301 redirect are now getting another 301. The question is, should I delete the first 301 redirect from the htaccess file? Kind Regards
Technical SEO | Paul780
-
Htaccess 301s to 3 different sites
Hi, I'm an htaccess newbie, and I have to redirect and split traffic to three new domains from site A. The original home page has most of the inbound links, so I've set up a 301 that goes to site B, the new corporate domain:
Options +FollowSymLinks
RewriteEngine on
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]
Brand websites C and D need 301s for their folders in site A, but I have no idea how to write that in relationship to the first redirect, which really is about the home page, contact and only a few other pages. The URLs are duplicates except for the new domain names. They're all on Linux. Site A is about 150 pages; should I write it by page, or can I do some kind of catch-all (the first 301) plus the two folders? I'd really appreciate any insight you have, and especially if you can show me how to write it. Thanks 🙂
Technical SEO | ellenru
Are there any SEO implications if a page does two 301s and then a 304?
Curious to see if this is a positive or negative thing for SEO... or perhaps even neutral.
Technical SEO | RodrigoStockebrand0
-
Are 301s advisable for low-traffic URLs?
We are using some branded terms in URLs that we have been recently told we need to stop using. The pages in question get little traffic, so we're not concerned about losing traffic from broken URLs. Should we still do 301 redirects for those pages after they are renamed? In other words, are there other serious considerations, besides any loss in traffic from direct clicks on those broken URLs, that need to be considered? This comes up because we don't have anyone in-house who can do the redirects, so we need to pay our outside web development company. Is it worth it?
Technical SEO | PGRob0