Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask whether there are other possible issues or problems (besides server load) once we implement 301 redirections for 10,000+ URLs using .htaccess. Are there any alternatives?
-
Thank you for your answer! I will share it with our IT team.
-
Why don't you just set up a VPS running NGINX as a reverse proxy in front of your IIS web server?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as a front end to handle your redirects, and at $5 a month you can't beat it. I'm sure if your IT department googles "nginx reverse proxy iis" they will get the idea.
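To sketch what that might look like: a hypothetical NGINX config (domain, map-file path, and the IIS backend address are all placeholders, not details from this thread). NGINX's `map` lookups are hash-based, so ten thousand entries cost next to nothing per request.

```nginx
# NGINX on the VPS, serving the 301s and proxying everything else to IIS.
# The map block lives in the http context. The included file holds one
# "old-path target-url;" pair per line, e.g.:
#   /old-page/  https://www.example.com/new-page/;
map $uri $redirect_target {
    default "";
    include /etc/nginx/redirects.map;
}

server {
    listen 80;
    server_name www.example.com;

    # If the requested path is in the map, issue the redirect...
    if ($redirect_target != "") {
        return 301 $redirect_target;
    }

    # ...otherwise pass the request through to the IIS box.
    location / {
        proxy_pass http://10.0.0.2;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```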
-
Hi guys, I have a similar problem, but on IIS 7. Our IT department says our 301 redirections file is at its maximum size in the web.config. They could increase the limit, but say it will impact page load speed negatively. What's the impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
They're also looking at a solution that consults the redirections only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I am a little scared of this SEO-wise. Would it be a problem?
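For what it's worth, a hedged sketch of how this is often handled with the IIS URL Rewrite module — the map name, paths, and the `configSource` split are illustrative assumptions to verify against your own setup, not a definitive fix:

```xml
<!-- web.config fragment (illustrative). The rewriteMaps section is moved
     into its own file via configSource, which keeps web.config itself
     small. The map is loaded into memory, so per-request lookup cost
     should not grow noticeably between 5,000 and 10,000 entries.

     rewritemaps.config would contain:
       <rewriteMaps>
         <rewriteMap name="Redirects">
           <add key="/old-page/" value="/new-page/" />
         </rewriteMap>
       </rewriteMaps>
-->
<rewrite>
  <rewriteMaps configSource="rewritemaps.config" />
  <rules>
    <rule name="301 map lookup" stopProcessing="true">
      <match url=".*" />
      <conditions>
        <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
      </conditions>
      <action type="Redirect" url="{C:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```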
Thanks !
-
Putting aside server load / config issues, and speaking from a pure SEO point of view:
No, you shouldn't have any major issues with that many 301s. However, depending on the size of your site and the frequency of Googlebot's visits, you might find that some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links to them (as their parent categories were all redirected as well) and so seem to get stuck and never get recrawled by Google. In a couple of instances I have had success using XML sitemap files that include only these 'blocked' pages (the old URLs still in the index) to prompt Google to recrawl them.
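As a sketch, such a recrawl-prompting sitemap is just an ordinary sitemap whose entries are the old, already-redirected URLs (the example.com URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap listing only the old URLs that are stuck in the
     index, so that Googlebot revisits them and picks up the 301s. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/old-category/old-page/</loc></url>
  <url><loc>https://www.example.com/old-category/other-old-page/</loc></url>
</urlset>
```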
Also, there is a Google Webmaster Tools feature to 'fetch as Googlebot', which then prompts you to 'submit to index'; you can use this to prompt recrawls on a per-page basis (but you have a limited number of credits here, so save it for the more important pages).
Best of luck!
-
The main benefit of this would be in reducing server load / response time, and potentially in maintainability of the server config.
The most important factor on this side of things is how many separate rules you have in your .htaccess file for those 10,000 redirects.
-
Hi Kevin,
What's the difference between this method and the standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide to implementing 301 redirection using the httpd main server config file?
-
Well, if you're on a VPS or dedicated machine, I would take a look at http://httpd.apache.org/docs/current/rewrite/rewritemap.html
RewriteMap has essentially zero effect on load time, unlike having the same rules in .htaccess; it will chew through those redirect rules easily. Remember that browsers cache 301s, so while you're testing make them all 302s until you're happy, then watch your rewrite log when you launch. If you need help, let us know.
This does take some know-how and learning, but you should be able to get this done in a few days (testing, reading documentation).
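A minimal sketch of the RewriteMap approach, assuming Apache with mod_rewrite enabled and a map file at a made-up path (see the rewritemap documentation linked above for the full details):

```apacheconf
# Goes in the main server config (e.g. inside the site's <VirtualHost>);
# RewriteMap is not allowed in .htaccess.
RewriteEngine On

# Plain-text map, one "old-path new-path" pair per line, e.g.:
#   /old-page/   /new-page/
RewriteMap redirects "txt:/etc/apache2/redirects.map"

# Look the requested path up in the map; redirect only on a hit.
# Use R=302 while testing, then switch to R=301 for launch.
RewriteCond ${redirects:%{REQUEST_URI}|NOT_FOUND} !NOT_FOUND
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]
```

For very large maps, the same docs describe converting the text file to a dbm hash with `httxt2dbm` and using `RewriteMap redirects "dbm:..."`, which keeps lookups fast regardless of how many thousands of entries you have.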
-
Do you have access to the httpd main server config file? If so, please read Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file."
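In practice the rules themselves can look exactly the same; what changes is where they live. A hypothetical example (the example.com paths are placeholders): .htaccess is re-read and re-evaluated on every request, whereas the main config is parsed once when Apache starts.

```apacheconf
# The same Redirect lines that would sit in .htaccess, moved into the
# site's <VirtualHost> in the main server config.
<VirtualHost *:80>
    ServerName www.example.com
    Redirect 301 /old-page-1/ /new-page-1/
    Redirect 301 /old-page-2/ /new-page-2/
    # ...and so on, or a single RewriteMap lookup for very large lists
</VirtualHost>
```

The trade-off the quoted documentation describes still applies: changes to the main config require a server reload, which .htaccess edits do not.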