Is it better to put all your CSS in 1 file or is it no problem to use 10 files or more like on most frameworks?
-
Thanks a lot for this useful info, it helped me understand this better.
-
Hi,
From a code management point of view - as Peter says, it's very common practice to split your CSS into several files, as they are then much easier to manage and maintain. You can use a tool like Yahoo's YUI Compressor to minify - as Bradley says - and aggregate (merge) these files.
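As a rough illustration of what such a tool does (a naive sketch only - not YUI Compressor's actual algorithm, which handles many more edge cases), merging and minifying a set of stylesheets might look like this:

```python
import re

def merge_and_minify(css_sources):
    """Concatenate several CSS strings and apply a naive minification:
    strip comments and collapse whitespace. Illustration only; a real
    minifier is far more careful (strings, calc(), etc.)."""
    combined = "\n".join(css_sources)
    # Remove /* ... */ comments
    combined = re.sub(r"/\*.*?\*/", "", combined, flags=re.DOTALL)
    # Collapse runs of whitespace to a single space
    combined = re.sub(r"\s+", " ", combined)
    # Drop spaces around punctuation that CSS does not need
    combined = re.sub(r"\s*([{};:,])\s*", r"\1", combined)
    return combined.strip()

print(merge_and_minify(["body {\n  margin: 0;\n}",
                        "/* nav */ .nav { color: red; }"]))
# → body{margin:0;}.nav{color:red;}
```

One request and fewer bytes on the wire, while you keep editing the readable, split-up source files.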
From a web performance point of view, fewer files does not always mean better performance. Web browsers used to download no more than 2 files concurrently per domain, but it's now standard for them to support 6 or more. See a browser breakdown for Max Connections and Connections per Hostname here: http://www.browserscope.org/?category=network&v=top. I wouldn't recommend deliberately splitting across 6 files, but you might find that one massive CSS file downloads quicker when split up.
There is another disadvantage to having a single CSS file: you're not making the most of web browser caching. Every time you change any CSS, all users have to download the entire file again. Again, this may not be a problem for you, but it's something to bear in mind.
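A common way to soften that caching problem is content-based fingerprinting: name each stylesheet after a hash of its contents, so browsers can cache it indefinitely and only fetch a new copy when the contents actually change. A minimal sketch (file names here are illustrative):

```python
import hashlib

def fingerprint_name(filename, content):
    """Derive a cache-busting filename from the file's content.
    The stylesheet can then be served with a far-future cache header;
    any edit produces a new name, forcing a fresh download."""
    digest = hashlib.md5(content.encode("utf-8")).hexdigest()[:8]
    stem, _, ext = filename.rpartition(".")
    return f"{stem}.{digest}.{ext}"

print(fingerprint_name("site.css", "body{margin:0;}"))
# e.g. site.<8-char-hash>.css
```

With this scheme, splitting rarely-changed CSS (a framework reset, say) from frequently-edited CSS means returning visitors re-download only the part that changed.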
My advice would be to point Google PageSpeed at your website's key pages and act on as much of the feedback as possible: https://developers.google.com/speed/pagespeed/. It is a fantastic resource and presents its findings very clearly.
George
@methodicalweb -
That's what I was thinking too. Currently, most of my frameworks have 10 CSS files, which means 10 server requests. Page speed is, in my eyes, a very important factor, hence this question...
-
You could split them up based on where they are needed, but that would become complicated. The real advantage of splitting CSS on a large site is to organise the CSS better by function, e.g. system.css.
Peter
-
For a production environment, I would suggest having one minified CSS file. This reduces both file size (through minification) and server requests (1 file as opposed to 10), which helps reduce page load time.
Of course, on your staging environment, or in an archive of the website, it is best to keep your stylesheets broken down into an easier-to-manage system. That might mean multiple CSS files, or it might not - it's up to you to manage.
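That split between environments can be sketched as a tiny template helper - a hand-rolled illustration with made-up file names, since in practice your framework's asset pipeline would do this for you:

```python
DEBUG = False  # staging/development would set this to True

# Illustrative stylesheet names
STYLESHEETS = ["reset.css", "layout.css", "theme.css"]

def stylesheet_tags(debug=DEBUG):
    """Emit <link> tags: the individual source files while developing
    (easy to debug), one merged bundle in production (one request)."""
    files = STYLESHEETS if debug else ["bundle.min.css"]
    return "\n".join(f'<link rel="stylesheet" href="/css/{f}">'
                     for f in files)

print(stylesheet_tags(debug=False))
# → <link rel="stylesheet" href="/css/bundle.min.css">
```

The same templates then work in both environments, and only the production build step has to produce the merged file.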
-
Thanks for your answer!
It makes sense, because on large sites you need different styling for different types of pages? So if you put it all in 1 file, all of this CSS would be loaded on ALL pages, while it's only needed on some of them?
Or what's the advantage here?
-
It really depends on how big your site is and how complex your CSS is. On a small site, or one with minimal CSS, one file is perfectly adequate. On a larger site with lots of pages and CSS, it makes sense to break the CSS down by function.
Peter