“Visualising big data” - Christopher Osborne at Hacks/Hackers London

 by Martin Belam, 3 August 2011

Yesterday I blogged my notes from my colleague Laura Oliver’s talk about news community management at Hacks/Hackers London. The other talk at last week’s event was from Christopher Osborne of ITO World, about visualising big data.

Christopher made it quite clear that visualising data should serve a purpose - “we are at peak infographic,” he said.

“The coffee table infographic world is really red hot right now. Nobody cares, apart from if you want something for your coffee table.”

Instead, Christopher wanted to concentrate on data graphics with a purpose, and to that end showed how crowd-sourced mapping of the disaster zone after the recent Haiti earthquake had enabled search-and-rescue missions and saved lives. He wrote about it for the Guardian - “Mapping a crisis”.

The Haiti mission had its roots in the OpenStreetMap project, and Christopher showed a time-lapse movie of the mapping of London. It starts with “one strange guy” relentlessly mapping his local area, and then blossoms into a fully comprehensive map of the capital.

Another stunning visualisation Christopher shared with us was the way that European airspace had been affected by the volcanic ash cloud last year. Not only was the image of thousands of flights over Europe ceasing and then restarting interesting in itself, it was coupled with a meter estimating how many tonnes of CO2 emissions had been saved during the enforced shut-down of commercial air travel.

Christopher was somewhat disparaging of data fetishists. He argued that an attitude of “Let’s make some websites with lots of data” gets you nowhere, since “ordinary people don’t give a shit” about data - they want “stories”.

I did think there was a paradox at the heart of Christopher’s argument. There is no doubt, as Scott Byrne-Fraser and Alastair Dant have said before about dataviz at Hacks/Hackers events, that the story-telling needs to be more important than the aesthetic appeal of the graphics. Yet, inevitably, the demos that drew the most gasps and admiration from the Hacks/Hackers crowd were the most visually stunning ones.

After the talk, in the Q&A session, there was an interesting exchange with Guardian data journalist James Ball. One of the examples Christopher had used in his talk was a set of plain top ten tables he had produced for the Daily Mail, listing the most dangerous streets in the UK according to crime statistics. James tried to press him on the ethics of supplying that data to the press when the underlying methodology of collection turned out to be flawed, as ruthlessly exposed by The Telegraph’s Conrad Quilty-Harper. Christopher argued that the responsibility for checking the data lay with the commissioner and the journalist, not with the chart-maker. I think it would be fair to say that there was quite a difference of opinion in the room on that - and that many journalists would assume that the data checking would be done by the data experts.
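As an aside, for readers wondering what producing such a table involves mechanically, a minimal sketch might look like the following. This is purely illustrative - the file name and column names are my own assumptions, not the actual police dataset or ITO World’s process - but it shows the basic idea: count recorded incidents per street and print the ten largest counts.

    # Minimal, hypothetical sketch: rank streets by number of recorded incidents.
    # Assumes a CSV with a "street" column; this is NOT the real police data format
    # or ITO World's actual pipeline.
    import csv
    from collections import Counter

    counts = Counter()
    with open("crime_incidents.csv", newline="") as f:
        for row in csv.DictReader(f):
            counts[row["street"]] += 1

    # Print a plain "top ten" table, most incidents first.
    for rank, (street, total) in enumerate(counts.most_common(10), start=1):
        print(f"{rank:2d}. {street}: {total} incidents")

Of course, the point James was making sits upstream of any step like this: if the incidents in the source data are attributed to the wrong streets, the ranking will be wrong however carefully the counting is done.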

I thought Christopher was a fascinating example of the kind of roles digital journalism is enabling. He opened his talk by saying that he had wanted to be a writer, but there was no money in it, and then he got into computers. It does seem to be the case at the moment that “interest in journalism and media” + “ability with computers” = “opportunity”.

And Christopher’s best line?

“It is quite scary being emailed by Tim Berners-Lee. It is like an email from God basically.”

Hacks/Hackers London: Notes from the talks brings together notes from 16 talks, including those from Martin Rosenbaum, Stephen Grey, Alastair Dant, Scott Byrne-Fraser and Wendy Grossman. It looks at topics of interest to journalists and programmers alike, including freedom of information, processing big data sets to tell stories, social activism hack camps, the future of interactive technologies, and using social media to cover your tracks - or uncover those of somebody else.
Hacks/Hackers London: Notes from the talks for Kindle is £1.14.

2 Comments

Hi there Martin

Some good points here, and you capture some of what I failed to articulate on the night. Presenting to a media/journalism crowd rather than a tech crowd is new to me.

The tables we produced from the police data are a good case study - it was "official" data, released to great fanfare, and with an embargo that meant the turnaround from release of data to producing a story was short. When taking an official government dataset, you do rely on the data owner performing quality control, error checking, and generally taking responsibility for the data before release. Sadly, this proved not to be the case. For that part, the responsibility clearly lies with the data owner. Imagine if the Bank of England or the ONS released such flawed data - there would be very serious consequences.

How data analysts, visualisation artists, journalists and so on work effectively with data-driven stories is a different matter. I believe this is something that every news organisation is struggling with right now: how to integrate data journalism into their existing processes. Hence my frustration with the current infographic trend - it appears to come from a culture that doesn't understand the potential of what can be done with data, and just wants to publish a striking image on a page. On the flip side, I've seen a lot of data journalism that amounts to nothing more than throwing some data on an interactive map - where is the journalism in that?

There are more lessons here with the police data. Being visual people, we were very keen to produce lots of maps, and being analysts we also wanted to dig deep into it to uncover stories. Being an outside organisation, and the hired "data experts", we also had to deliver what was wanted - top ten tables. We learnt that the way most people want to receive and absorb information is in a much simpler format than we would normally produce; when you spend all your days with huge datasets you quickly lose sight of that. The lesson for news organisations is that you have to understand data and integrate it with your journalist/news desk teams to make it work for you. When we receive media requests, which are quite common, they nearly always take the form "we want an image of x", not "we are really interested in understanding this data, what can you do with it?"

With the police data, the story very quickly became about the poor quality of the data and the huge public demand for more (accurate) information about crime. We worked closely with the Mail as the story evolved, and I believe they were extremely happy with a story that was their most-read on Mail Online for two days in a row.

Wow. Those videos and the write-up were absolutely fascinating. Also, Christopher's comment above was extremely interesting. Thank you ;).
