What would the LA Times' datajournalism about teachers look like in the UK?
Last week I was in Berlin for a meet-up around the themes of open data and datajournalism, and one of the speakers was Eric Ulken. He had worked on the LA Times 'datadesk', a loose affiliation of 'computer assisted reporters', investigative reporters and members of the interactive technology and graphics teams. They worked on projects like the 'Homicide blog' and the maps that accompanied it, which came out of a single reporter's dogged perseverance in blogging every single homicide that occurred in L.A. County.
Ulken said that the reaction to that particular blog and the map attached to it had differed in Europe from the reaction in the States.
Firstly, and rather grimly, he pointed out that some of the reaction in the US was to consider the level of homicides 'business as usual'. As the project was noticed around the world, he found that Europeans had concerns about the privacy implications that hadn't factored into discussions in the US. Reporting the location of crimes and the race of murder victims was commonplace in the States, but some people in Europe thought this was unnecessarily intrusive.
If the 'Homicide blog' made Europeans feel uneasy about privacy issues, Eric then topped that by showing the L.A. Times project to assess individual named teachers within the public education system.
This sparked the longest interjection into anybody's talk in Berlin, as the ethics were debated. The tool uses data from standardised test scores to work out whether individual classes have achieved above or below what was expected of them, and uses that to grade the teachers.
Some of the issues raised by the data meetup group were around whether the public has a right to know the effectiveness of individual named employees, even if they are public employees. Another issue was that the test scores produced numbers, but the scale displayed to the public was qualitative, e.g. 'most effective' or 'least effective'. The teachers were distributed evenly between the lowest and highest ratings, but there was also no indicator of whether being rated 'least effective' by this measure actually meant you were a bad teacher. If the margin of error was large and the spread around the mean small, then the results could actually be meaningless.
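That last objection can be illustrated with a quick simulation. The numbers below are entirely hypothetical (they are not the L.A. Times' methodology or data): assume teachers' true effectiveness is tightly clustered around the average, but a single year's test-based estimate carries a large measurement error. Then a 'least effective' label driven by the measured scores will misclassify many perfectly average teachers.

```python
import random

random.seed(42)

TRUE_SPREAD = 1.0        # assumed: small real spread in teacher ability
MEASUREMENT_ERROR = 5.0  # assumed: large error in a one-year estimate

# 100 hypothetical teachers: true ability, plus a noisy measured score
true_scores = [random.gauss(0, TRUE_SPREAD) for _ in range(100)]
measured = [t + random.gauss(0, MEASUREMENT_ERROR) for t in true_scores]

# Rank by measured score and label the bottom fifth 'least effective',
# as a published grading tool effectively would
ranked = sorted(range(100), key=lambda i: measured[i])
least_effective = ranked[:20]

# How many of the 'least effective' are genuinely below average?
truly_below = sum(1 for i in least_effective if true_scores[i] < 0)
print(f"'Least effective' label correct for {truly_below} of 20 teachers")
```

With noise this much larger than the real spread, a good chunk of the 'least effective' group turns out to have above-average true ability, which is exactly the concern raised in Berlin.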
I got to wondering how this might play out in the UK.
The L.A. Times lists a top 100 teachers by name, but doesn't make a similar list of the bottom 100 available - although browsing through the data would soon uncover poorly rated teachers. I suspect that, given a similar set of data and the willingness to publish, our tabloids might well go at it from the opposite angle - naming and shaming the teachers who are a waste of taxpayers' money. I think as a nation we'd be more likely to adopt the tone of the witch-hunt, as we've seen with social workers who have failed children, than try to praise high achievers.
I don't think a similar service would happen here at the moment though. I'm not a media lawyer, but I'd think that most of the British press would consider it a legal risk to label a named schoolteacher as 'unsatisfactory' based upon their own interpretation of test results.
But it does raise a big question about datajournalism around schools in the UK.
Whilst we'd probably baulk at rating individual teachers on the test scores achieved by their pupils, we fall over ourselves to rank entire schools, colleges and universities on the same basis. And what is that except an aggregation of the individual effectiveness of the teachers? If we accept that academic institutions can be ranked based on test scores, why shouldn't the public be able to drill down further into that data?