Nginx vs. Apache, All Things Considered
-
Hey Peeps,
I've been struggling lately with a new static site, and I'm looking for opinions from anyone who's had to optimize a site using Nginx. I understand that Nginx is recommended for static sites, but I want to avoid ending up in a situation where I can't do things like write redirect rules the way I want to.
Considering that it will be hosting a static site, are there any features or functions that Nginx lacks compared to Apache, such as the ability to write rewrite rules?
-
Great to hear. Let me know if you have any questions when you start that project.
Casey
-
Yup, I'm in the same boat as you; I'd much rather do server-side redirects.
As an update on this "project": we just tried the pageless redirects in our staging environment on S3, but were unsuccessful. Certain redirects that we set up with pageless redirects (such as adding a trailing slash to URLs without one) got clobbered by S3's default behavior of 302-redirecting to the trailing-slash version. Weak sauce, Amazon!
At this point we're going with Apache, since it's the server our developers know best, and we've had too many problems to experiment with our live environment. That said, our next project after we relaunch with proper redirects will be to begin testing Nginx on our staging environment.
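For anyone curious, the S3-native way to do these is its website-hosting RoutingRules, which is roughly what we were fighting with. A sketch of what one looks like (the prefixes here are hypothetical, not our actual paths):

```xml
<RoutingRules>
  <RoutingRule>
    <Condition>
      <!-- Match requests whose key starts with this prefix -->
      <KeyPrefixEquals>old-section/</KeyPrefixEquals>
    </Condition>
    <Redirect>
      <!-- Swap the prefix and force a 301 instead of S3's default 302 -->
      <ReplaceKeyPrefixWith>new-section/</ReplaceKeyPrefixWith>
      <HttpRedirectCode>301</HttpRedirectCode>
    </Redirect>
  </RoutingRule>
</RoutingRules>
```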
Thanks for your input!
-
Hey Danny,
I've always done 301 redirects from the server and avoided any other method. That was more for my sanity, to make sure I was getting all the equity I could if there was a difference. I'm not saying there is a difference, but if there were, I wanted to be safe. Since it sounds like you may be constrained by your technology, the solution you're going with is fine, but if you had both options available, I'd go with the server-side redirect every time.
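For anyone following along, a server-side 301 in Apache might look something like this (hypothetical paths, and this assumes mod_alias and mod_rewrite are enabled):

```apache
RewriteEngine On

# Permanently redirect a single moved page (mod_alias)
Redirect 301 /old-page/ https://www.example.com/new-page/

# Add a trailing slash to URLs that lack one, skipping real files
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```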
-
Thanks Casey!
We've actually found a different workaround that we're looking at right now: the "pageless redirects" plugin for Jekyll. Basically it uses the meta refresh + rel=canonical redirection method that Matt Cutts got called out on a while ago. This would allow us to stay on S3 and maintain our blazing-fast site speed.
From my research so far, this seems to pass equity in much the same way as a server-side 301. Have you had any experiences, or heard anything, to the contrary?
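For context, the page that method generates at the old URL boils down to something like this (the destination URL is a made-up example):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Instant client-side redirect to the new location -->
  <meta http-equiv="refresh" content="0; url=https://example.com/new-url/">
  <!-- Tells search engines which URL should receive the equity -->
  <link rel="canonical" href="https://example.com/new-url/">
</head>
<body></body>
</html>
```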
-
Hi Danny,
The Moz.com website and blog are running on PHP/Nginx. As Matthew said, Nginx is much faster and less intensive on the servers for both CPU and memory. Nginx has some great documentation, and it's really easy to get things to redirect. It's as simple as adding lines like the following to your configuration, and you're good to go:
rewrite ^/q$ /community/q permanent;
rewrite ^/q/(.*)$ /community/q/$1 permanent;
Making the switch from Apache to Nginx was one of the best things we ever did, and I would highly suggest you do the same for both static sites and any dynamic sites you may have. I'll most likely never use Apache again.
Casey
-
From the little I know of Nginx, it's meant to be faster, less intensive on server memory, and able to handle more concurrent connections, but Apache is more widely supported across hosting environments and is more flexible out of the box.
The one thing I've had to get my head around when working on clients' sites that run on Nginx is the different URL rewrite syntax, e.g. http://nginx.org/en/docs/http/converting_rewrite_rules.html
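To illustrate, the kind of translation that doc walks through looks roughly like this (paths are hypothetical):

```nginx
# Apache .htaccess version:
#   RewriteRule ^blog/(.*)$ /news/$1 [R=301,L]
# Rough Nginx equivalent, placed inside the server block:
location /blog/ {
    rewrite ^/blog/(.*)$ /news/$1 permanent;  # "permanent" issues a 301
}
```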
-
Thanks Jeff!
I think we're going to go with Apache for now, since it's what all of us are well-versed in. We'll probably be switching to Nginx at some point in the future, and focusing on other aspects that you mentioned, such as caching and compression, in the meantime.
Cheers.
-
Danny - We use Nginx on our WordPress site, and it's pretty quick and easy. We were able to carry over our .htaccess rewrite rules (converted to Nginx's syntax), and for the most part there's very little downside. You do want to make sure your site isn't going to break before you launch it on Nginx, so I'd test it on a test URL before you push it live.
We're also running Varnish as a caching layer, which takes the page from a slowwww load time to a really fast 1.5-second load time.
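If you want to try the same stack, a minimal Varnish sketch (VCL 4 syntax; the backend address and TTL below are assumptions, not our production values) looks like:

```vcl
vcl 4.0;

# Varnish sits in front of the web server, assumed to listen on 8080
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_backend_response {
    # Cache everything the backend allows for five minutes (example TTL)
    set beresp.ttl = 5m;
}
```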
Hope this helps...