301s - A Year Late
-
A website I was recently asked to help with was redesigned last year, but no 301s were set up. Looking at the old URLs, 95% of the ones from early 2013 are 404s. Their traffic dropped from 50,000 visits per month to 10,000, and I believe this is one of the reasons.
Now the question is: a year later, will it do any good to set up 301 redirects from those old URLs? My current thought is that the old URLs have probably lost any link juice they had, but it shouldn't hurt anything to set up the 301s anyway.
Any thoughts on whether this is worth my time and effort?
-
Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links out there that still point to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking purposes. Getting the redirects in place will allow that ranking influence to be credited to the client's new pages again.
When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a bit of a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around on real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you can always put stats changes into context around this process (e.g. a year from now, when the client is trying to figure out why there was a site improvement around this time).
[Hint - make sure you've got solid 404 page tracking in Analytics and keep checking it as you go along. It's an essential addition to just watching what shows up in Webmaster Tools, for example.]
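If you've also got server access, a dedicated 404 log gives you a second view of which dead URLs are still being hit, independent of Analytics. A minimal sketch using Apache 2.4's conditional logging - the log path and format are just placeholders, and this goes in the server/vhost config, not .htaccess:

```apache
# Write every 404 to its own log, recording the requested URL and the
# referring page, so you can see which old URLs still get traffic and
# which external sites are linking to them.
LogFormat "%t %U%q referer:%{Referer}i" notfound
CustomLog "logs/404.log" notfound "expr=%{REQUEST_STATUS} == '404'"
```

Sorting that log by frequency gives you a ready-made priority list for the redirect work.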
Some more suggestions for the process:
- Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
- Use Screaming Frog, Xenu Link Sleuth or equivalent tool to run a check of all internal pages to ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
- Watch for any high-value incoming links to old pages that you think you might be able to get corrected at the source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice," you're even better off getting the original link corrected to point to the right page instead of having it go through the redirect. This is only worth the effort for strong links.
- Watch for opportunities to use regex to combine several redirects into one rule - fewer rules are better for site speed (see the sketch just after this list).
- If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration.
- To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to find all the old URLs. (Tip - set your Google Search results to show 100 results per page to make the scraping faster.)
- Set up an ongoing and regular process for checking for and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
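On the regex point above, here's a minimal sketch of what consolidated rules can look like in Apache - the paths are hypothetical, so substitute the site's actual URL structure:

```apache
# One pattern rule maps an entire old directory onto its new home,
# preserving the slug, instead of hundreds of one-to-one redirects.
RedirectMatch 301 ^/2013/articles/(.*)$ /blog/$1

# Old dynamic URLs need mod_rewrite, since Redirect/RedirectMatch
# never look at the query string. The trailing "?" drops the old query.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=(\d+)$
RewriteRule ^/?product\.php$ /products/%1? [R=301,L]
```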
Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page that is designed to capture as many visitors as possible and help move them to real content without losing them. Again, this is important for any site, but well worth extra attention for any site that knows it has a 404 problem. (This is far better than "soft 404ing" to a home page, for example, for a number of technical and usability reasons.)
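In Apache, wiring up that custom error page is a one-liner (the page path is an assumption):

```apache
# Use a local path, not a full URL: pointing ErrorDocument at an absolute
# URL makes Apache answer with a 302 redirect instead of a true 404 --
# exactly the "soft 404" behavior to avoid.
ErrorDocument 404 /errors/404.html
```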
So, bottom line on "whether this is worth my time and effort?" You'd better believe it is - probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying - both for you and the site owner.
Hope those are useful ideas!
Paul
-
Hiya - they may have lost link juice, but then again there may be a blog praising you with a link that's still active. It's never too late to add a 301; remember, though, it's best to 301 to the most relevant category or closest page. You should also set up a helpful custom 404 page, so that even if you miss a URL, the user can still navigate to somewhere useful, like the home page.
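For example, if an old product page has no direct replacement, a rule like this (hypothetical URLs) sends visitors to the nearest surviving category instead of a dead end:

```apache
# No one-to-one replacement exists, so redirect to the closest relevant category.
Redirect 301 /2013/red-widgets-sale.html /widgets/red/
```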
Moz has some great tips if you want a read or to refresh your mind.
-
Yes. It's better late than never. You might not get any rankings back, but I consider 301s to be good policy beyond the SEO aspect. I hate clicking a link and getting a 404 or being bounced to the front page. Perhaps I have a bookmark; perhaps it's an old link. Whatever the case, do your visitors a courtesy and redirect them to the correct page.
Related Questions
-
301 Redirects a Year Later
I inherited the digital maintenance of a website that was relaunched a year ago. Looking at Google Analytics, organic search traffic a year later is still down 33%. I fear they did not install 301 redirects, but I can't really get a specific answer from them. Is it possible to install them a year later to help with Google indexing and get back some of the organic traffic?
Technical SEO | stansamples
-
Old site selected as canonical on GSC 3 years after migration?
Recently my company started consulting for a SaaS company. They're clearly the best-known, most trusted company in their area of work: they have the strongest brand, the best product, and therefore more users than any of their competitors by a big margin. Still, 99% of their traffic comes from branded searches, despite having 3x more domains, better performance scores, and more content. Even using tools such as SimilarWeb to compare user satisfaction metrics, they seem to have lower bounce rates and more visits per session. Still, they rank for almost nothing non-branded on Google (they rank extremely well for almost everything on Bing and DuckDuckGo). They don't have any obvious issues with crawling or indexation - we've gone to great depths to tick off any issues that could be affecting this. My conclusion is that it's either a penalty or a bug, but GSC is not flagging any manual actions. These are the things we've identified:
- All the content was moved from domain1.com to domain2.com at the end of 2017. 301s were put in place, and the migration was confirmed in GSC. Everything was done with great care, and we couldn't identify any issues with it.
- Some subdomains of the site, especially support, rank extremely well for all sorts of keywords, even very competitive ones, but the www subdomain ranks for almost nothing on Google.
- The www subdomain has 1,000s of domains pointing to it, while support has only a few 100s.
- Google is performing delayed rendering attempts on old pages, JS, and CSS - particularly versions of assets that were live before the migration in 2017, including the old homepage. Again, the redirects have been in place for 3 years.
- Search Console frequently shows old HTML (at least a year old) in cache despite a recent crawl date and a current 301.
- Search Console frequently processes old HTML (at least a year old) when reporting on schema.
- Search Console sometimes selects pages from the old domain as the canonical of a URL of an existing page on the current domain, despite the long-standing 301s and the canonicals being well configured for 3 years now.
Has anyone experienced anything similar in the past? We've been doing an analysis of old SEO practices, link profile, disavow... nothing points to black hat practices, and at this point we're wondering if it's just Google doing a terrible job with this particular domain.
Technical SEO | oline123
-
XCart Directory 301s Not Working
I'm working with someone to make fixes to an X-Cart site, but I'm at a loss for some fixes. Some directory URLs had been changed around on their ecommerce site to make them more descriptive and more human-friendly. The problem is that, according to the team's coder, simple redirects won't work for the directories, and mod_rewrite and RedirectMatch didn't work for some unknown reason. I don't really know anything about X-Cart. I've made some basic changes and redirects before through their admin panel, but I don't have any clue as to how to make the directories 301 properly. Any insights? Thanks!
Technical SEO | MikeRoberts
-
Guidance for setting up new 301s after having just done so
Hi, I've recently set up a load of 301 redirects for a client's new site design/structure relaunch. One of the things we have done is take the keywords out of the sub-category landing page URLs, since they now feature in the top-level category page URLs and we don't want to risk over-optimisation by having keyword repeats across the full URLs. So the URLs have changed, and the original pages are 301'd to the new current pages.
However, if rankings start to drop and I decide to change the URLs again to include keywords in the final part of the URL for the sub-category landing pages too, what's the best way to manage the new redirects?
Do I redirect the current URLs (which have only been live for a week and have the original/old URLs 301'd to them) to the new URLs? (I'm worried this would create a chain of 301s, which I've heard is not ideal.)
Or do I just redirect the original URLs to the new ones and forget about the current pages/URLs, since they've only been live for a week? (I presume it's best not to, since the GWT sitemaps area says most of the new URLs are indexed now, so I presume it sees those as the original pages' replacements now.)
Or should they all be 301'd (original URLs and current URLs to the new)? Or is it best to just run with the current setup, avoid making too many changes again, and not set up even more 301s after having just done so? Many thanks 🙂 Dan
Technical SEO | Dan-Lawrence
-
Need specifics about mod_proxy for blog domain and 301s
I am getting the IT staff to move our blog from "blog." to "/blog" using mod_proxy for Apache, but I had a couple of questions about this I was hoping someone here might be able to help with. Is it correct that just setting up mod_proxy will make the blog available at both URLs - the "blog." subdomain and the "/blog" folder? If so, what is the best way to 301 redirect all traffic from "blog." to "/blog"? I assume this could be handled with a blanket 301-style rewrite, but I wanted to get some other opinions before getting with my IT guys to do it. I am technical enough to talk about this, but not to do it myself, so experienced opinions are appreciated. Thanks!
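In case it helps frame the question, I imagine the setup would look something like the sketch below - the domain, internal backend address, and port are hypothetical, and it assumes mod_proxy, mod_proxy_http, and mod_rewrite are enabled:

```apache
# In the www.example.com vhost: serve the blog under /blog via mod_proxy,
# fetching it from the blog application's internal address.
ProxyPass        "/blog/" "http://127.0.0.1:8080/"
ProxyPassReverse "/blog/" "http://127.0.0.1:8080/"

# In the blog.example.com vhost: blanket 301 every request to the new path.
RewriteEngine On
RewriteRule "^/?(.*)$" "http://www.example.com/blog/$1" [R=301,L]
```

Part of what I want to confirm is whether proxying to an internal address (rather than back through the public "blog." hostname) is what keeps the 301 on the subdomain from looping.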
Technical SEO | SL_SEM
-
Htaccess 301s to 3 different sites
Hi, I'm an htaccess newbie, and I have to redirect and split traffic to three new domains from site A. The original home page has most of the inbound links, so I've set up a 301 that goes to site B, the new corporate domain:

Options +FollowSymLinks
RewriteEngine on
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]

Brand websites C and D need 301s for their folders in site A, but I have no idea how to write that in relationship to the first redirect, which really is about the home page, contact, and only a few other pages. The URLs are duplicates except for the new domain names. They're all on Linux. Site A is about 150 pages - should I write it by page, or can I do some kind of catch-all (the first 301) plus the two folders? I'd really appreciate any insight you have, and especially if you can show me how to write it. Thanks 🙂
Technical SEO | ellenru
-
Very well established blog, new posts now being indexed very late
I have an established blog. We update it on a daily basis. In the past, when I would publish a new post, it would get indexed within a minute or so. But for the past month or so, it's been taking hours - sometimes 10-12 hours - for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

User-agent: TechnoratiBot/8.1
Disallow:

# ia_archiver
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /

# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*

# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*

Sitemap: http://www.domainname.com/sitemap.xml.gz

The site has tons of backlinks. I'm just wondering if something is wrong with the robots file or if it could be something else.
Technical SEO | rookie123
-
New Sub-domains or New Directories for 10+ Year Domain?
We've got a one-page, 10+ year old domain that has a 65/100 Domain Authority and gets about 10k page views a day (I'm happy to share the URL but didn't know if that's permitted). The content changes daily (it's a daily bible verse), so most of this question is focused on domain authority, not the content. We're getting ready to provide translations of that daily content in 4 languages. Would it be better to create sub-domains for those translations (same content, different language) or sub-folders? Example:
http://cn.example.com
http://es.example.com
http://ru.example.com
or
http://example.com/cn
http://example.com/es
http://example.com/ru
We're able to do either, but want to pick the one that would give the translated versions the most authority both now and moving forward. (We definitely don't want to penalize the root domain.) Thanks in advance for your input.
Technical SEO | ipllc