Automated webpage loading time measurements

In my day job I often have to evaluate the performance impact of web proxies or different TCP congestion control algorithms on webpage loading times. In ye olde days a simple recursive wget was sufficient to measure the loading time, since pages consisted only of static items. However, today – or, better said, ever since this “Web 2.0” thing was introduced – there’s a lot of dynamic stuff going on between your web browser and the servers.

JavaScript and AJAX might be used to load dynamic content; some webpages don’t even load without JavaScript being activated. So even though wget might have fetched all static objects, that is not what a user actually experiences when a site is loaded in a real web browser: wget simply lacks the ability to evaluate JavaScript.
Also, newer protocols like SPDY and HTTP/2 behave completely differently from the older HTTP/1.1 in terms of parallel loading and simultaneous TCP connections – differences not easily measured with simple command line tools. Encryption complicates the process further, so that well-known analysis tools like tcpdump or Wireshark won’t help you with microstatistics.

To measure the real loading times of a website, one should first ask: what is a meaningful metric for loading time from the user-experience point of view? That’s where HAR comes into play. HAR is short for “HTTP Archive” and is a standard JSON format for page loading metrics. If you’ve ever seen Firebug’s network timeline or the network timeline in Google Chrome’s developer tools – it’s all stored in, or can be exported to, HAR.

You can use Firebug and other tools standalone, but what I need is an automated, reproducible and easily parseable process to collect enough samples to create large-scale statistics. I’m talking about dozens or possibly thousands of individual samples – nothing that can be done manually.

If you look into a HAR file, you get output like this:

{
    "log": {
        "version": "1.2",
        "creator": {
            "name": "Chrome HAR Capturer",
            "version": "0.5.0"
        },
        "pages": [
            {
                "startedDateTime": "1970-01-01T07:49:24.061Z",
                "id": "0",
                "title": "http://www.heise.de",
                "pageTimings": {
                    "onContentLoad": 2663.3759999983013,
                    "onLoad": 4179.255999997258
                }
            }
        ],
        "entries": [
            {

What we’re interested in are the pageTimings values, notably onContentLoad and onLoad. onLoad is the more interesting event: it’s basically the point in time when everything is loaded:

“The load event fires at the end of the document loading process. At this point, all of the objects in the document are in the DOM, and all the images, scripts, links and sub-frames have finished loading.”
Source: Mozilla Developer Network, https://developer.mozilla.org/de/docs/Web/API/GlobalEventHandlers/onload

Additionally, the HAR-spec says:

Page is loaded (onLoad event fired). Number of milliseconds since page load started (page.startedDateTime).
Source: HAR 1.2 Spec, http://www.softwareishard.com/blog/har-12-spec/
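To make the extraction concrete, here is a minimal sketch in Python – the helper name is mine, and the sample document is a trimmed-down version of the HAR output shown above:

```python
import json

def onload_ms(har):
    """Return the onLoad timing (milliseconds) of the first page in a HAR document."""
    # A HAR document has log.pages[]; each page carries its pageTimings.
    return har["log"]["pages"][0]["pageTimings"]["onLoad"]

# Trimmed-down version of the HAR document shown above:
sample = json.loads("""
{"log": {"version": "1.2",
         "pages": [{"id": "0",
                    "title": "http://www.heise.de",
                    "pageTimings": {"onContentLoad": 2663.376,
                                    "onLoad": 4179.256}}]}}
""")

print(onload_ms(sample))  # -> 4179.256
```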

So. How is this useful? If I need to check the performance of a proxy, the TCP buffer settings on a server, or just compare different HTTP server configurations, I need to do a lot of measurements, not just a single one. It’s absolutely not feasible to open a web browser and note down all the timings manually, or even to save the HAR files manually from Firebug.

Here I use the remote-debugging feature of Google Chrome together with chrome-har-capturer. It’s dead simple to use. First, start Google Chrome with its debugging features enabled:

$ google-chrome --remote-debugging-port=9222 \
--enable-benchmarking \
--enable-net-benchmarking
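Before capturing anything you can verify that the debug port is actually up: Chrome answers plain HTTP requests on it, and GET /json returns the list of open tabs as JSON. A small sketch in Python, assuming the default port from above (the function names are mine):

```python
import json
from urllib.request import urlopen

def tab_urls(tabs_json):
    """Extract the URLs from the JSON tab list Chrome's /json endpoint returns."""
    return [tab["url"] for tab in json.loads(tabs_json)]

def open_tabs(host="localhost", port=9222):
    """Ask a remote-debugging-enabled Chrome which tabs it has open."""
    with urlopen("http://%s:%d/json" % (host, port)) as resp:
        return tab_urls(resp.read().decode("utf-8"))

# Against a running, debug-enabled Chrome:
# print(open_tabs())
```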

Now just fire up chrome-har-capturer:

$ chrome-har-capturer http://www.heise.de | head -20
DONE http://www.heise.de
{
    "log": {
        "version": "1.2",
        "creator": {
            "name": "Chrome HAR Capturer",
            "version": "0.5.0"
        },
        "pages": [
            {
                "startedDateTime": "1970-01-01T07:49:24.061Z",
                "id": "0",
                "title": "http://www.heise.de",
                "pageTimings": {
                    "onContentLoad": 2663.3759999983013,
                    "onLoad": 4179.255999997258

You can now easily extract the onLoad timings, run the whole thing a few hundred times, and do a proper comparison of the different settings you are benchmarking against each other.
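The sampling loop itself is easy to script. Here is a sketch in Python; it assumes chrome-har-capturer is on your PATH, Chrome is already listening on port 9222, and the tool’s -o option writes the HAR document to a file (check your version’s --help):

```python
import json
import statistics
import subprocess
import tempfile

def onload_samples(url, runs):
    """Capture `url` `runs` times and collect the onLoad timings in milliseconds."""
    samples = []
    for _ in range(runs):
        with tempfile.NamedTemporaryFile(suffix=".har") as tmp:
            # Let chrome-har-capturer write the HAR document to a temp file.
            subprocess.run(["chrome-har-capturer", "-o", tmp.name, url],
                           check=True)
            with open(tmp.name) as f:
                har = json.load(f)
        samples.append(har["log"]["pages"][0]["pageTimings"]["onLoad"])
    return samples

def summarize(samples):
    """The numbers worth comparing between two configurations."""
    return {"min": min(samples),
            "median": statistics.median(samples),
            "mean": statistics.mean(samples),
            "max": max(samples)}

# Example (requires a running, debug-enabled Chrome):
# print(summarize(onload_samples("http://www.heise.de", 100)))
```

Comparing the medians (rather than single runs) between, say, two proxy configurations smooths out network jitter between individual samples.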

The disadvantage of this approach is that you need a test machine with an actual screen – it won’t run headless. However, there are headless tools like DalekJS which can do the whole thing without the need for a screen, albeit with some limitations.

Good luck!

If you like this posting, share and/or comment.

A note: This was my first posting in this new blog. Stay tuned for more.
