Pitfalls when implementing the “Vary: User-Agent” server response header
-
We serve different desktop- and mobile-optimized HTML on the same URL, based on a visitor’s device type.
While Google continues to recommend the HTTP Vary: User-Agent header for mobile-specific versions of a page (http://www.youtube.com/watch?v=va6qtaiZRHg), we’re also aware of issues raised around CDN caching:
http://searchengineland.com/mobile-site-configuration-the-varies-header-for-enterprise-seo-163004
http://searchenginewatch.com/article/2249533/How-Googles-Mobile-Best-Practices-Can-Slow-Your-Site-Down
http://orcaman.blogspot.com/2013/08/cdn-caching-problems-vary-user-agent.html
As this is primarily for Google's benefit, it's been proposed that we only return the Vary: User-Agent header when a Google user agent is detected (Googlebot/MobileBot/AdBot).
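For concreteness, here's a minimal sketch of that idea, assuming a Python/Flask app; the bot token list and helper names are illustrative assumptions, not a definitive implementation:

```python
from flask import Flask, request

app = Flask(__name__)

# Substrings that identify Google's crawlers in the User-Agent header.
# Illustrative only; check Google's documentation for the current tokens.
GOOGLE_BOT_TOKENS = ("Googlebot", "Googlebot-Mobile", "AdsBot-Google")

def is_google_crawler(user_agent: str) -> bool:
    """Return True when the request appears to come from a Google crawler."""
    return any(token in user_agent for token in GOOGLE_BOT_TOKENS)

@app.route("/")
def home():
    # Placeholder: in reality this would render the desktop or mobile
    # template depending on the detected device type.
    return "<html>...</html>"

@app.after_request
def add_vary_for_google(response):
    user_agent = request.headers.get("User-Agent", "")
    # Advertise that the response varies by User-Agent only to Google's
    # crawlers; other clients (and the CDN) never see a Vary header.
    if is_google_crawler(user_agent):
        response.headers["Vary"] = "User-Agent"
    return response
```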
So here's the thing: since a response header isn't “content” per se, I think this could be an okay solution, though I wanted to throw it out there to the esteemed Moz community and get some additional feedback.
You guys see any issues/problems with implementing this solution?
Cheers!
linklater
-
So, there are lots of 'ifs' here, but the primary problem I see with your plan is that the CDN will return cached content to Googlebot without the request ever hitting your server, so you won't have the option to serve different headers to Googlebot.
Remember that every page consists of the main HTML content (which may be static or dynamically generated for every request) plus a whole bunch of other resources (JavaScript and CSS files, images, font files, etc.). These other resources are typically static and lend themselves far better to being cached.
Are your pages static or dynamic? If they are dynamic, then you are possibly not benefiting from them being cached anyway, so you could use the Vary header on just those pages and not on any static resources. This would ensure your static resources are cached by your CDN, giving you a lot of the benefit of the CDN, while only the dynamic HTML content is served directly from your server.
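As a rough sketch of that split (again assuming a Flask-style Python app, with the content-type check standing in for however you actually distinguish dynamic HTML pages from static assets):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def vary_on_dynamic_html_only(response):
    # HTML documents are the pages generated per device type, so they
    # declare Vary: User-Agent; static assets (CSS, JS, images, fonts)
    # omit it, leaving the CDN free to cache them normally.
    if response.mimetype == "text/html":
        response.headers["Vary"] = "User-Agent"
    return response
```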
If most of your pages are static, you could still use this approach, but without the full benefit of the CDN, which sucks.
Some of the CDNs are already working on better solutions to this (see http://www.computerworld.com/s/article/9225343/Akamai_eyes_acceleration_boost_for_mobile_content and http://orcaman.blogspot.co.uk/2013/08/cdn-caching-problems-vary-user-agent.html).
I hope some of this helps!