Grade 'F' all round for British newspaper homepages on YSlow

by Martin Belam, 30 July 2007

Last week the Yahoo! Developer Network released a Firefox add-on called YSlow. It works alongside the fantastic Firebug add-on to let you assess a web page against Yahoo!'s 13 criteria for delivering a fast, scalable web service.

I thought it might be interesting to benchmark British newspaper homepages against the YSlow performance metrics, to see how well they performed.

Some of the technical solutions that Yahoo! recommend require significant investment in infrastructure.

I suspect most newspaper editors in the UK would be more likely to think that a CDN was some type of new disc format they can give away free at the weekend to artificially boost their circulation, rather than understanding the benefits of a Content Delivery Network.

As it turned out, the results I got when I tested the pages using YSlow weren't all that great - all 8 newspaper homepages I looked at scored an 'F' grade.

[Screenshot: the Daily Mail homepage's YSlow ranking]

Newspaper sites in the UK fall foul of the Yahoo! suggestions in a number of ways.

One prominent problem was the number of different domain names that had to be resolved when rendering a page. It seems that, thanks to using a diverse range of advertising, the majority of British newspaper homepages serve their content and advertising from a plethora of fully-qualified domains. Each extra domain means another DNS look-up before the browser can fetch anything from it, which impairs performance.

At the time of testing, for example, The Mirror was serving content on their homepage from 11 different fully-qualified domains:

  • www.mirror.co.uk
  • images.icnetwork.co.uk
  • www.icnetwork.co.uk
  • ad.doubleclick.net
  • publish.vx.roo.com
  • altfarm.mediaplex.com
  • img-cdn.mediaplex.com
  • www.utarget.co.uk
  • www.google-analytics.com
  • m1.2mdn.net
  • img.mediaplex.com
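As a rough illustration of what YSlow is checking here (this is a sketch, not the tool's actual implementation), you can count the unique hosts a page references, since each one costs the browser a DNS look-up:

```python
# Sketch: count the unique fully-qualified domains a page references,
# as a proxy for the DNS look-ups a browser must perform before it
# can fetch every asset. The regex and sample markup are illustrative.
import re
from urllib.parse import urlparse

def unique_domains(html):
    # Pull every src/href attribute pointing at an absolute http(s) URL.
    urls = re.findall(r'(?:src|href)\s*=\s*["\'](https?://[^"\']+)', html, re.I)
    return {urlparse(u).hostname for u in urls}

sample = '''
<img src="http://images.icnetwork.co.uk/logo.gif">
<script src="http://www.google-analytics.com/urchin.js"></script>
<a href="http://www.mirror.co.uk/news/">News</a>
<img src="http://images.icnetwork.co.uk/banner.gif">
'''
print(len(unique_domains(sample)))  # → 3 distinct hosts despite 4 references
```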

There are always performance and maintenance trade-offs in serving a site, and one of those is how to maintain JavaScript and CSS files.

Splitting a site's JavaScript library, or its full stylesheet, into multiple small, externally called files can make version control easier, and can seem more efficient from the production side of things than maintaining one long file.

However, breaking CSS and JavaScript into multiple externally called files increases the number of HTTP requests that a page requires to render and function properly.

In this respect the Daily Mail is quite guilty, calling 15 external JavaScript files and 9 external stylesheets, but the newly redesigned Guardian homepage takes the biscuit here, requiring 19 JavaScript and 7 CSS files to be served.

The Express and The Times also seem to call the same external JavaScript file from several different locations in the page, causing additional unnecessary load and bandwidth expense on their hosting infrastructure.

Yahoo! suggest some optimisation techniques, like including all stylesheet links within the <HEAD> of the HTML, and putting external links to scripts at the bottom of the page. Quite a few newspaper homepages featured code contrary to these suggestions.

The lowest rated British newspaper homepage that I checked with YSlow was The Sun's. The principal fault here was the sheer size of the download footprint of The Sun's homepage.

Somehow they've managed to produce a homepage which isn't particularly multimedia rich, yet which clocks in at a whopping 725.8k to download. This includes 107k of HTML, 130k of CSS and 157k of JavaScript. The Mirror wasn't far behind, with a 696.8k footprint on the day that I checked.

Pos.  Newspaper        YSlow grade  YSlow score  Page footprint
1     The Telegraph    F            49           306.5k
2     The Independent  F            47           329.6k
3     Daily Express    F            40           537.7k
4     Daily Mail       F            39           563.8k
5     The Times        F            38           335.6k
6     The Guardian     F            36           557.8k
7     The Mirror       F            29           696.8k
8     The Sun          F            29           725.8k

Newspapers online should be aware of two key performance metrics when serving their websites: one impacts the cost of serving the site, the other its potential reach.

One is the amount of unnecessary bandwidth they are wasting, and paying for. If caching is enabled properly on a site, the page footprint can be drastically reduced for a second or subsequent download by the same user.

The Times front page, for example, clocks in with a footprint of 335k on initial download. With caching enabled, this can be greatly reduced, as the regular Times screen furniture, CSS and scripts don't need to be downloaded again. Add that up for repeat visitors over the course of the year, and you are talking about big cost savings.
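To get a feel for the scale of the saving, here is a back-of-envelope calculation. The 335k footprint is from the article; the cached-visit size and the number of repeat visits are illustrative assumptions, not measured figures:

```python
# Back-of-envelope estimate of the bandwidth a cacheable page saves on
# repeat visits. Only the 335k footprint comes from the article; the
# cached-visit size and visit count below are illustrative assumptions.
first_visit_kb = 335.0     # full footprint measured for The Times
cached_visit_kb = 56.0     # assumed: only fresh HTML is re-fetched
repeat_visits = 1_000_000  # assumed repeat visits per year

saved_gb = (first_visit_kb - cached_visit_kb) * repeat_visits / (1024 * 1024)
print(f"~{saved_gb:.0f} GB of bandwidth saved per year")  # → ~266 GB
```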

[Screenshot: YSlow cache statistics for The Times homepage]

Another issue the papers should be aware of is the impact on download speeds of these heavy pages.

On a big fat corporate or broadband connection, these pages load almost instantly.

However, more than a third of the UK is still only on dial-up access speeds. The Sun's 725.8k homepage footprint can take up to an astonishing two minutes to download every single element on a 56.6 Kbps connection.
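The dial-up figure is easy to sanity-check. The arithmetic below ignores latency, DNS look-ups and modem compression, which is why real-world downloads push towards the two-minute mark:

```python
# Sanity check on the dial-up claim: how long 725.8 KB takes to move
# over a 56.6 Kbps line, treating "k" uniformly and ignoring latency,
# DNS look-ups and modem compression (all of which add time).
page_kb = 725.8    # The Sun's homepage footprint, in kilobytes
line_kbps = 56.6   # dial-up modem line rate, in kilobits per second

seconds = page_kb * 8 / line_kbps  # kilobytes -> kilobits, then divide
print(f"{seconds / 60:.1f} minutes")  # → 1.7 minutes, before overheads
```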

When I used to work on the BBC.co.uk homepage, the aspiration was for the entire homepage footprint, including graphics, HTML, CSS and JavaScript, not to exceed 100k - a seventh of the size of The Sun's whopper.

So, according to YSlow anyway, it is pretty much a verdict of "could try harder" all round for British newspaper websites.

1 Comment

More interesting, perhaps, is how many of Yahoo's disparate portfolio follow their instructions. Although I think they might in general be improved by ignoring their 13 rules, and following Ian's 1st rule of optimisation for Yahoo: Stop Being Rubbish.
