Technical Argument to Prefer non-www to www?
-
I've been recommending non-www over www as the preferable setup when a client is starting a site from scratch and there are no pre-existing links to consider.
I'm wondering if this recommendation still holds?
I've been looking around the interwebs and I'm seeing far fewer articles arguing for the non-www version. From the two camps, I'm seeing highlighted:
Pro www: (ex: www.domain.com)
- Works better with CDN networks, where a domain needs to be specified (though that argument is 3 years old)
- Ability to restrict cookies to one hostname (www) or subdomain (info., blog., promo.) if using multiple subdomains
- IT people generally prefer it
Pro non-www (ex: domain.com)
- If you ever want to support or add https://, you don't have to support 2 sets of URLs/domains
- Mindset: fewer and fewer people think in terms of typing www before a site URL; the future is heading toward dropping it anyway. Though that is a bit of a cosmetic argument.
Is there a trend going back to www? Is there a technical argument to recommend non-www over www?
Thanks!
-
Thanks Cesar, I appreciate your detailed response.
Pick one, set up our redirects properly, and we're good to go!
Thanks much!
-
I don't believe there really is a technical argument for this anymore, given the advancements we now have with Apache, browsers, and so on. I've been developing for about 15 years, and at this point it really doesn't matter. Just choose one and go with it.
Works better with CDN networks, where a domain needs to be specified (though that argument is 3 years old)
Not sure what you mean by "specifying a domain"? Either way a domain has to be specified, whether it's www.example.com or example.com. The current convention for referencing a CDN (and pretty much everything else) is the protocol-relative format: "//www.example.com" or "//example.com". The "//" tells the browser to reuse whatever protocol (http or https) the current page was loaded with, and your server's redirects handle the rest.
For instance, say you set up your .htaccess file to 301-redirect everyone to https and www. The client only needs to worry about "//".
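As a minimal sketch of that kind of .htaccess setup (assuming Apache with mod_rewrite enabled, and example.com standing in for your real domain):

```apache
RewriteEngine On

# Send any request that is either plain http or on a non-www hostname
# to the single canonical form https://www.example.com (permanent 301)
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Flip the canonical target to https://example.com if you prefer the non-www form; the same two conditions work either way.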
Ability to restrict cookies to one hostname (www) or subdomain (info. blog. promo.) if using multiple subdomains
Cookies should be set to cover both, just in case: setting the cookie domain to ".example.com" makes it valid for example.com and every subdomain. You can't control how someone will type in your domain, but you can control the redirect to www.
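A quick sketch of that using Python's standard http.cookies module (the cookie name, value, and example.com domain are illustrative):

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "abc123"
# The leading dot makes the cookie valid for example.com and
# all of its subdomains (www, blog, info, promo, ...)
cookie["session"]["domain"] = ".example.com"
cookie["session"]["path"] = "/"
print(cookie.output())
```

The emitted Set-Cookie header carries Domain=.example.com, so the browser sends the cookie back regardless of which hostname the visitor used.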
**IT people generally prefer it**
Not true
If you ever want to support or add https://, you don't have to support 2 sets of URLs/domains
Again, just by using "//" you don't have to worry about this anymore.
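You can see how a protocol-relative reference resolves with Python's standard urllib.parse (the URLs are placeholders):

```python
from urllib.parse import urljoin

# The same "//" reference picks up whichever scheme the page was loaded over
print(urljoin("https://example.com/page", "//cdn.example.com/lib.js"))
# → https://cdn.example.com/lib.js
print(urljoin("http://example.com/page", "//cdn.example.com/lib.js"))
# → http://cdn.example.com/lib.js
```

The reference itself never changes, so switching the site to https doesn't force you to rewrite any URLs.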
Mindset: fewer and fewer people think in terms of typing www before a site URL; the future is heading toward dropping it anyway. Though that is a bit of a cosmetic argument.
As long as you set up your redirect, www or non-www does not matter, even if you had your domain for years before you implemented the change.
Here is the current trend: with the number of mobile devices and how "on the go" we are, the less we have to type to get our answer, the better. So yes, the most preferred is example.com. In fact, people now will just type in the brand name/domain and let Google direct them.
All in all, everyone should have a redirect to either www or non-www. All that matters is how you want users to see your domain: www or non-www. Send them to whichever you prefer. As long as you set up your 301 properly, Google can tell the difference and you're golden.