
Site Performance


This YouMoz entry was submitted by one of our community members. The author’s views are entirely their own (excluding an unlikely case of hypnosis) and may not reflect the views of Moz.


In order to make sure my main site is working within the new parameters of what Google considers good ‘site performance’ (which is now part of the Google algorithm), I did some testing. I appreciate that this is only one ranking factor out of 200, and a very new one at that, but it could turn out to be a very important one for me and my work, so I wanted to understand things a little better and get the best ‘performance’ I could without sacrificing site design.

_______________

You can see Matt Cutts discuss site performance in this video (about two-thirds of the way in), from a guest speaker slot he did in Belgium. In it he does say that, all things being equal, Google will rank a faster-loading site above a slower-loading site in the SERPs ...

http://www.youtube.com/watch?v=hky-tXyAcqA

What is 'site performance'? From Google: “Page load time is the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser. It is collected directly from users who have installed the Google Toolbar and have enabled the optional PageRank feature”. (I am aware of the dangers of trusting this data set implicitly, and of what is hidden from it.)
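
For a rough feel of your own numbers without relying on Toolbar data, the sketch below (my own illustration, not Google's method) simply times how long a page's raw HTML takes to download. It is only a proxy, since Google's figure measures the full load and render in a real browser, and the URL is a placeholder.

```python
# Rough proxy only: times the raw HTML download, not the full render that
# Google's Toolbar metric measures. The URL below is a placeholder.
import time
import urllib.request

def html_fetch_time(url: str) -> float:
    """Return the seconds taken to download a page's HTML."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()  # read the full body so the timing means something
    return time.perf_counter() - start

samples = [html_fetch_time("http://www.example.com/") for _ in range(5)]
print(f"average HTML fetch time: {sum(samples) / len(samples):.2f}s")
```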

_______________

I used a test site over 4 months to check whether average site speeds affected a site's relationship with Google. The site was around 100 pages of full content at the start of April, and I regularly added unique content pages over the 4 months, up to 200 pages. Rather than just test visitor reactions to different loading times, bounce rates, etc. (which I also did), here I am specifically talking about the site's relationship with Google. Now here is the caveat: this is only one site, using a very shaky variable that Google puts under the 'Labs' area in Webmaster Tools, so read the following with the caution and spirit with which it is written. Call it a set of interesting observations, no more.

The test site is full of unique and interesting content which will eventually (in some form, without duplication) ship over to my main site, so I did not waste Google's time in crawling or indexing it, or my own in creating it. The data I amassed has allowed me to improve my main site design immeasurably and to keep that site's average page loading time below 2.0 seconds (it is actually at 1.8 seconds) for my targeted audience. So the data I pulled out of the test was of incredible value to me, and a case in point for me to learn to do more testing.

To do the test I cranked the test site's average page loading time up to a very slow 15 seconds over the 4 months and then brought it back down to about 1.7 seconds. To do this I used JavaScript, plugins, and all sorts of slow-loading, high-resolution images that I might want to use on my main site; to slow the page load times down, or speed them back up, I either added or removed these components. I was particularly keen to see what size and quality of images could be used, and how many I could use on a page, as well as how certain plugins that might be of use behaved.
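
As a rough sketch of the image side of this, the snippet below re-saves a JPEG at a lower quality setting to cut its KB size. The Pillow library and the file names are illustrative assumptions on my part, not the exact tools used in the test.

```python
# Hypothetical sketch: shrink an image's file size by re-saving it at a lower
# JPEG quality. Pillow (pip install Pillow) and the file names are assumptions
# for illustration, not tools named in the test itself.
import os
from PIL import Image

def recompress(src: str, dst: str, quality: int = 60) -> None:
    """Re-save src as a JPEG at the given quality and report the size change."""
    img = Image.open(src).convert("RGB")  # JPEG has no alpha channel
    img.save(dst, "JPEG", quality=quality, optimize=True)
    print(f"{src}: {os.path.getsize(src) / 1024:.0f} KB -> "
          f"{dst}: {os.path.getsize(dst) / 1024:.0f} KB")

recompress("hero-high-res.jpg", "hero-medium.jpg", quality=60)
```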

I also kept the data points for each day over the 4 months that Google updated me with my new average page loading times: how fast new pages were being indexed, what effect (if any) it had on the SERPs and user experience, and so on. I did a ‘double-dip’ check (a quick fall-and-rise test either side of the main peak), and I repeated the whole test over a shorter time period. The double-dip check and the repeat test really tie in with what the Google crawl stats were telling me, and are perhaps the most interesting of the data points from my point of view.

[Image: Site Speed Graph]

The first thing I noticed (and it is obvious within the other data sets I collected, and something you probably already knew) is that over the 4-month period, within the double-dip experiment, and in a repeat of the whole test in July, Google crawled far more web pages, in more depth and with more thoroughness, when the site had a fast average page loading speed. Googlebot's visits over these 4 months came almost in tandem with the graph shown on my Google Webmaster Tools Site Performance page: it came far more often, and crawled in more depth, when the average page load times were low.

At the point when the site was loading pages at 15 seconds on average, at the start of May, Googlebot came less than 10% as often as when the page load times were below 2 seconds on average in April and June. In May there was the Caffeine update and so on, but over 4 months, and with the double-dip check and the repeat test, the results were always the same: Googlebot did indeed react to site performance (see a bit more on this later).

Theoretically, from my collected data, I started to learn at what speeds the Google algorithm started to punish my site and where it started to reward it for this ranking factor. By punish and reward I mean there was more crawling, faster indexing, and more pages being indexed, or the reverse. This is just one ranking factor out of 200, and I realize that having domain authority and great content also influences Googlebot's visit rate and the indexing of pages ... but the only variable which changed in this test over 4 months was the design content, i.e. an image with a very high KB size was removed and replaced with the same image at a medium or low KB size, and this was done for the majority of pages (which took hours at a time). New pages were added with regularity, with the same original content written by me, and the target audience, website rationale, etc. never changed.

I used Pingdom Tools to help over the 4 months, as it kept a history of each test I ran with it. Pingdom's results always showed me the same page load times that Google did, plus or minus 10%, along with the size of each component on the page, which helped in removing and adding components.
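
As a very rough stand-in for the kind of per-component breakdown Pingdom gives you, here is a sketch that lists the images, scripts and stylesheets a page references along with their reported sizes. It leans on the Content-Length header, which some servers omit, so treat the output as indicative only; the URL is a placeholder.

```python
# Rough stand-in for a Pingdom-style component breakdown: list the images,
# scripts and stylesheets a page references and their reported sizes.
# Sizes come from the Content-Length header, which some servers omit,
# so treat the numbers as indicative. The URL is a placeholder.
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class AssetCollector(HTMLParser):
    """Collect the URLs of images, external scripts and stylesheets."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("img", "script") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])

def component_sizes(page_url: str) -> None:
    html = urllib.request.urlopen(page_url, timeout=30).read().decode("utf-8", "replace")
    collector = AssetCollector()
    collector.feed(html)
    for asset in collector.assets:
        full_url = urljoin(page_url, asset)
        try:
            head = urllib.request.Request(full_url, method="HEAD")
            with urllib.request.urlopen(head, timeout=30) as resp:
                size_kb = int(resp.headers.get("Content-Length", 0)) / 1024
            print(f"{size_kb:8.1f} KB  {full_url}")
        except OSError as err:
            print(f"       ?  KB  {full_url}  ({err})")

component_sizes("http://www.example.com/")
```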

Note: I have used the word ‘might’ in the following observations, as I do not want them to come across as anything other than something of interest to think about. It is hard to get a proper grip on the problems and issues of global (average) website speeds and performance because of all the different levels of infrastructure and development across the world; however, one early report I read (which I agree with) said, “Site Performance does have one notable advantage: global data. Because Site Performance tracks the 'experiences' of a wide range of users around the world who are using the Google Toolbar, the data may paint a more useful portrait of 'average' performance, likely making it useful to every Website owner in some fashion”.

My site would not be reporting a 1.8-second loading time if the majority of my targeted visitors were not on broadband infrastructure. But no matter what the level of a country's Internet service provision (broadband, dial-up, etc.), the theory is that the average website will adjust the overall size of each web page according to the targeted area. This is obviously going to be the case if you live in the country you are targeting and can see for yourself the page load times of your own site using that country's Internet infrastructure, which is probably true of the majority of webmasters, who target locally or nationally. And because Google is the one globally adjusting and averaging the page loading times in seconds for each user visit, rather than just reporting the page size in kilobytes, the data set should still be relevant, because it is all about actual visitor experiences reported back to Google.

  • The average page loading time across the Internet (as of the middle months of 2010) is 2.9 seconds. If your site loads pages, on average, quicker or slower than 2.9 seconds (for the people visiting it), then you are somewhere above or below that average. Theoretically this average is both independent of, and specific to, geo-locations (given where Google gets its global data set from, i.e. “collected directly from users who have installed the Google Toolbar”). So with 2.9 seconds seeming to be the global 'average', you can see where you fall in terms of what your visitors are experiencing: are you performing better or worse than the expected user experience in waiting for pages to load? (See the sketch after this list.)
  • If you have an average page load speed of below 1.9 seconds (for the people you are targeting, or who are visiting your site), you are in the top percentage and the algorithm might reward you. In my data sets that meant the site was crawled far more, new pages were indexed far more quickly and kept within the index, and there was a small jump in the SERPs too, noticeably more than at even just a few points above 2 seconds. Google seemed to really like my test site below this 2-second barrier, which is faster than 70% of all Internet sites. Whether this is 70% of all sites or 70% of sites targeting the same audience (and therefore a peculiarity of just this site and target audience), I would not like to say. Other data sets from other sites would need to be collected, so I am not advocating anything, merely pointing out what I saw from this one data set and the repeated testing within it. However, it is a performance barrier I am now keen to keep on my main site, and I have started to see very similar results over the last month on that site too. So, in my humble opinion, Google likes the 2-second barrier to be broken for the average target audience visiting the site. Note: user experience was also seen to improve; bounce rates, length of stay, number of pages viewed, etc. all reacted very positively to breaking the 2-second barrier as well. This benefit from just a small jump in performance is something Matt Cutts also says Google found (in the video linked at the start). Think of it like fly fishing: cast the bait too far out and the fish doesn't bite, cast it closer in and the fish follows it, cast it in the right spot and the fish bites. The trick is to get Googlebot to bite.
  • If you have an average page load speed of above 6 seconds, then you are in the bottom percentage, slower than 70% of sites, and the algorithm might start to punish you (in my data, by crawling the site less, indexing new pages less quickly, and dropping pages from the index).
  • The Google algorithm might consider your site neutral if your average page loading times are between 2.5 seconds and 4.5 seconds (but only if my data set holds true for the rest of the Internet). By neutral I mean that there was nothing exceptional; crawl rates and crawl depth seemed to be very similar across this performance range.
  • As a newly announced ranking factor, it seems Google is more lenient on the upper loading times (4.5 to 6 seconds); there was not a huge difference in crawl rates when the times were just below or just above 4.5 seconds, but this may change as more emphasis is placed on it as a ranking factor.
  • If pages on your site take, on average, more than 10 seconds to load (for the visitors coming to your site), then you might be significantly punished by the Google algorithm. I saw Googlebot far less often (and pages dropped out of the index on a significant level), a trend that other graphs and SERP results within my data set also showed.
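
To make those bands easier to play with, here is a tiny helper that simply restates the observations above as code. The thresholds are what I saw on this one test site, not Google-published rules, and the 2.0 to 2.5 second range, which is not described explicitly above, is lumped in with the neutral band.

```python
# These bands restate the observations from this one test site; they are
# not Google-published rules. The 2.0-2.5s range is treated as neutral here.
GLOBAL_AVERAGE_SECONDS = 2.9  # the mid-2010 figure quoted above

def speed_band(avg_seconds: float) -> str:
    if avg_seconds < 2.0:
        return "fast: below the ~2 second barrier, where crawling and indexing picked up"
    if avg_seconds <= 4.5:
        return "neutral: crawl rate and depth looked broadly similar in this range"
    if avg_seconds <= 6.0:
        return "upper range: treated leniently, at least for now"
    if avg_seconds <= 10.0:
        return "slow: slower than roughly 70% of sites; crawling tailed off"
    return "very slow: crawl visits dropped sharply and pages fell out of the index"

for t in (1.8, 2.9, 5.0, 15.0):
    print(f"{t:4.1f}s ({t - GLOBAL_AVERAGE_SECONDS:+.1f}s vs the global average): {speed_band(t)}")
```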

As noted earlier, it does seem as if Googlebot reacts to site performance and what it perceives to be 'user experience'. Looking at things over the real-time 4-month experience, I saw that there was a lag of around 6 days before the Google algorithm decided to punish or reward; i.e. it seems you might have about a week to rectify any problems before the algorithm considers them a permanent state of affairs rather than a server glitch. The good news is that if you improve page loading times within a week, you will start to see the rewards. This did not happen just once, but many times over the 4 months, always following the trends in the graph, with a lag of 6 days. It seems Google doesn't take long to react to the good and the bad stuff, which is only a good thing in my book.
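
If you wanted to hunt for that lag in your own numbers, one way (sketched below with entirely made-up figures) is to shift the daily crawl counts against the daily average load times and look for the offset with the strongest correlation. It needs Python 3.10+ for statistics.correlation, and two weeks of invented data proves nothing; it only shows the mechanics.

```python
# Estimate the lag between a change in average load time and a change in
# crawl activity by testing different day offsets. The two series below are
# invented purely to show the mechanics, not real measurements.
from statistics import correlation  # Python 3.10+

avg_load_seconds = [1.8, 1.9, 6.0, 9.0, 14.0, 15.0, 15.0, 14.0, 9.0, 5.0, 2.0, 1.8, 1.7, 1.8]
daily_pages_crawled = [120, 118, 119, 121, 117, 116, 118, 115, 80, 45, 30, 28, 27, 60]

best_lag, best_corr = 0, 0.0
for lag in range(0, 10):
    loads = avg_load_seconds[: len(avg_load_seconds) - lag]
    crawls = daily_pages_crawled[lag:]
    c = correlation(loads, crawls)
    if abs(c) > abs(best_corr):
        best_lag, best_corr = lag, c

print(f"strongest relationship at a lag of {best_lag} days (r = {best_corr:.2f})")
```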

I hope some of this helps, or at least does not confuse ...

P.S.

I am also aware that where the server is based has a small impact on load times in the visitor's browser; i.e. a server based in the UK loads pages faster for UK visitors than for visitors from the USA (due to the small lag in transporting the data over distance). I know this for a fact because I ran a previous test (in the few months before this one) and, on the basis of those results, moved my server from the USA to the UK, where I am based and where my main target market is, and the times altered accordingly (with enough significance to make it definitely worthwhile). If you are targeting globally then there is no need to worry, but if you have a country-based target market, get your server placed in that country.
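
For a quick feel of that distance effect, the sketch below times how long a TCP connection to each candidate host takes. The hostnames are placeholders, and a single machine only tells you about its own location, so ideally you would run it from within your target market.

```python
# Compare how "far away" two candidate hosts feel by timing the TCP connect.
# Hostnames are placeholders; run this from machines in your target market.
import socket
import time

def connect_time(host: str, port: int = 80, attempts: int = 5) -> float:
    """Average seconds to open a TCP connection to host:port."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=10):
            total += time.perf_counter() - start
    return total / attempts

for host in ("uk-server.example.com", "us-server.example.com"):
    print(f"{host}: {connect_time(host) * 1000:.0f} ms average connect time")
```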
