Just a heads up (and the post notes this): this way of timing starts well into the page load sequence. By the time the timer runs, the server has already received the request, built the HTML, and started sending it, and the browser is already parsing it.
Overall though, it's a cool hack even with the caveats above, as it could give you actionable data.
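For reference, here is a minimal sketch of the kind of in-page timing the post seems to describe: take a timestamp as early as possible in the head, measure the elapsed time at window.onload, and push it to GA as an event. The event names and bucketing here are my own illustrative assumptions, not the author's exact code.

    // Minimal sketch of in-page load timing reported to Google Analytics.
    // Event category/action/bucketing are illustrative assumptions.
    declare const _gaq: any[]; // the classic async Google Analytics queue

    // Taken as early as possible in <head>; the server has already been busy
    // for a while by this point, which is the caveat above.
    const pageStart = new Date().getTime();

    window.addEventListener("load", () => {
      const elapsedMs = new Date().getTime() - pageStart;
      // Bucket into 500 ms steps so the values are easy to chart as GA event labels.
      const bucket = Math.min(Math.floor(elapsedMs / 500) * 500, 10000);
      _gaq.push(["_trackEvent", "Performance", "PageLoad", String(bucket), elapsedMs]);
    });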
Another good source of loading-time metrics is Google Webmaster Tools. It has two relevant metrics: the download time in the Crawl Stats section and the Site Performance metric in the Labs section. Combined, they give you really good insight into how your website is doing over time. A good game to play is spotting the (very clear) inverse relationship between the number of pages crawled and the download time.
Yes, there are definitely some issues with measuring page load time this way, but it's useful data to get started with. The Web Timing API will make this data much better, though it's still early days (it's in IE9 and in some early developer builds of Chrome and Firefox). Once the API is more widely deployed, it will be the right way to measure performance, since it accounts for navigation time as well as rendering time.
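As a rough sketch, reading that API could look like the following, assuming the browser exposes it under window.performance.timing (early builds may hide it behind a vendor prefix, which this sketch doesn't handle):

    // Rough sketch, assuming the draft Web Timing API is available at
    // window.performance.timing.
    window.addEventListener("load", () => {
      // Defer one tick so loadEventEnd has been filled in.
      setTimeout(() => {
        const perf = (window as any).performance;
        if (!perf || !perf.timing) {
          return; // API not available in this browser
        }
        const t = perf.timing;
        // Full time from starting the navigation (typing a URL or clicking a
        // link) to the end of the load event, not just in-page processing.
        const fullLoadMs = t.loadEventEnd - t.navigationStart;
        console.log("Full page load:", fullLoadMs, "ms");
      }, 0);
    });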
Google Webmaster Tools does indeed give you a good perspective on performance. I believe this data comes from the Google Toolbar (can someone confirm this?). My problem with the data reported by Webmaster Tools is that it's an average. It doesn't tell me the _worst_ page load time or the _best_ page load time my users are experiencing.
As a shameless plug, you can also check out our own tools from Yottaa (http://www.yottaa.com), which give you a bunch of detailed information about your website's performance. I think tools like these complement the approach detailed in the post.
Hello, just a question about yottaa.com: in Firebug's YSlow (V2) I get an overall performance score of 91 (and in Page Speed I get 95). How come on Yottaa I get a YSlow score of 78?
Looks like this was a result of old data. The score of 78 was measured on October 2nd. I hit the "Click here to re-check now" button and the score is now listed as 88. We're looking at the sub-scores now to see exactly what's different.
Although I prefer Chrome Dev Tools for this task, measuring page load time across people with different connection speeds, CPUs (which affect rendering), and browsers seems a little weird.
The graph about bounce rate caught my attention. Although the author draws a straight line, it looks more like a parabola with a minimum around 3.5 seconds, which is kind of unexpected (and I can't think of any reasonable explanation for it).
- Page 'load' time is a bit ambiguous; I'd prefer page 'rendering' time, perhaps. (The author does note the limitation of this approach.)
- Time to window.onload may be a _very_ long time in some environments, especially when 3rd party scripts are present, especially (if memory serves) in IE.
- From eyeballing the "Bounce rate by page load time" chart, the linear trend placed on top seems dubious -- if anything, bounce rate seems to drop until the 3700ms mark. (The author notes in the text, however, that users seem tolerant up to the 5000ms mark.)
Quibbles aside, having more page rendering time data correlated with bounce rate is an excellent idea, as I've seen some posts in web design land get pretty carried away with n-th degree optimization because they read that speed matters for Google and Amazon.
In my experience, I optimized a site's page rendering time dramatically, expecting a bump in average pages per visit, and saw absolutely no change whatsoever. We need more data :)
Yes, I agree. GA is being used more and more as an ad-hoc analytics data store these days, and hopefully with some creative uses of the GA API we'll see more interesting uses that negate the need to use Excel.
You're correct about the issues with "page load" time. The approach in the post is really measuring the amount of time the browser spends processing the page. However, for many modern web apps, most of the time is spent in these internal aspects of the page: loading CSS, running JavaScript, fetching images, and so on.
With the Web Timing API (there's another post about that here: http://blog.yottaa.com/2010/10/using-web-timing-api-to-measu...) we can get more detailed timers that count not just the browser time but the full amount of time between typing the URL into the browser (or clicking a link) and finishing the load of the page.
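As an illustration of those more detailed timers, here is one way the individual phases could be pulled out of the Navigation Timing fields. The field names follow the draft spec; the grouping into phases is my own, not something from the article.

    // Illustrative breakdown of the Navigation Timing fields into phases.
    function timingBreakdown(t: PerformanceTiming) {
      return {
        redirect:   t.redirectEnd - t.redirectStart,
        dns:        t.domainLookupEnd - t.domainLookupStart,
        connect:    t.connectEnd - t.connectStart,
        request:    t.responseStart - t.requestStart,   // time to first byte
        response:   t.responseEnd - t.responseStart,    // downloading the HTML
        domParsing: t.domInteractive - t.responseEnd,   // parsing / building the DOM
        onload:     t.loadEventEnd - t.loadEventStart,  // running onload handlers
        total:      t.loadEventEnd - t.navigationStart, // URL entry/click to fully loaded
      };
    }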
The linear trend line was created by Excel automatically, so I can't vouch for its accuracy other than my implicit trust in that feature.
I like the result from the article. Showing page load time vs. percent of users is a very useful way to analyze site performance: what percentage of site users are experiencing slow performance?
Further, combining this with the Web Timing API would be even more interesting.
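As a sketch of that "percent of users" view, assuming you have exported per-visit load times (the sample data and thresholds below are made up purely for illustration):

    // Given per-visit load times, compute what share of users finished
    // loading within each threshold. Data and thresholds are illustrative.
    function loadTimeDistribution(loadTimesMs: number[], thresholdsMs: number[]) {
      const total = loadTimesMs.length;
      return thresholdsMs.map((threshold) => ({
        thresholdMs: threshold,
        percentOfUsers: (100 * loadTimesMs.filter((t) => t <= threshold).length) / total,
      }));
    }

    // e.g. what share of visits finished loading within 1 s, 3 s, 5 s, 10 s?
    console.log(loadTimeDistribution([800, 1200, 2500, 4100, 9000], [1000, 3000, 5000, 10000]));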
Also, be careful about when different browsers fire the onload event. For example, Safari 3: http://www.howtocreate.co.uk/safaribenchmarks.html
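One way to hedge against that variation (a sketch, not something from the article) is to record both DOMContentLoaded and load against the same start mark and report them separately, so differences in when a browser fires onload don't silently skew a single number:

    // Record two milestones from the same start mark.
    const start = new Date().getTime();

    document.addEventListener("DOMContentLoaded", () => {
      console.log("DOM ready after", new Date().getTime() - start, "ms");
    });

    window.addEventListener("load", () => {
      console.log("onload after", new Date().getTime() - start, "ms");
    });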