My 'biased' view of the Biased BBC blog - part two
Yesterday I wrote about my visits to the Biased BBC blog, and why I think it is a useful and enjoyable place to spend a few minutes on the web each day. I also pointed out a couple of the success stories the site has had in getting the BBC to make changes to online content.
I find, though, that there is a real difference between what is written on the blog "proper" and what is posted on the regular open comments thread. Biased BBC usually operates an "Open comments thread" at or near the top of the homepage, which is refreshed every few days. These threads can often run into hundreds of intertwining comments which can be quite difficult to pick through.
The site has the following disclaimer for the open thread comments:
Please use this thread for off-topic, but preferably BBC related, comments. Please keep comments on other threads to the topic at hand. N.B. this is not an invitation for general off-topic comments - our aim is to maintain order and clarity on the topic-specific threads. This post will remain at or near the top of the blog. Please scroll down to find new topic-specific posts.
I can't help but wonder whether the site would be better off replacing that mechanism with more of a forum format, which made it easier for people to pick their topics to comment on and to follow some of the interesting conversations and debates that develop.
Despite the good intentions of the open thread, often, it seems to me, the content of some of the comments undermines the credibility of the site as a whole. This is a real shame, as I've mentioned that I believe the editorial content can raise interesting and useful points about BBC output, which I'll be coming to later in this series of posts.
Recurrent themes in the comments that I don't find to be particularly credible, though, include:
1) The BBC won't cover this.
It is quite common for users to pick up on something like a Yahoo! News "story", which is two paragraphs of copy from Reuters or AP, and huff and puff that the BBC isn't covering the story because it doesn't fit the leftist BBC worldview, or some such accusation.
This conveniently ignores the fact that BBC News online, unlike services like Yahoo!, is not simply a service for reproducing the newswires, and doesn't generally cover anything except the biggest breaking news stories until it has a fully fledged article on the topic, or has at least further verified the story being carried on the wires.
After all, as Chris Paterson observed in his 2006 paper "News Agency Dominance in International News on the Internet", the BBC is one of only four major international newsgathering organisations left on the web:
This leaves us with a picture of an online news world (in the English language) where only four organisations do extensive international reporting (Reuters, AP, AFP, BBC) a few others do some international reporting (CNN, MSN, New York Times, Guardian and a few other large newspaper and broadcasters), and most do no original international reporting.
It isn't just agency stories that provoke this response. Yesterday, for example, someone was complaining that the BBC wasn't covering the Blue Peter quiz story - even though it had featured in the breaking news ticker online, was story #3 on the UK BBC News homepage and story #1 in the Entertainment section. Which doesn't count the coverage on the Blue Peter site's own front page. Or the fact that it was the main BBC press release of the day.
2) Lack of fact-checking
Another regular occurrence is the assertion that the BBC has made a factual error (examples of accusations in the last few months have included crediting a carol on Songs of Praise as "The First Nowell", or the BBC stating that Gerald Ford was America's only unelected President), only for the commenter to have not fact-checked it properly themselves, and to be shown to be the one who was incorrect.
I must credit the Biased BBC site owners with allowing both the original accusation and the correction to remain published. In fact, moderation on the site is very light-handed, and only posts that are excessively sweary or offensive get zapped. Calling Yasmin Alibhai-Brown a mad leftist loud-mouth would be acceptable, but calling her a "bitch" isn't, which seems a good example of proactive moderation that doesn't inhibit free expression. Even when people disagree on the site, out-and-out flame wars and trolling are very rare.
3) Searching Google for something "proves" that the BBC is biased
The only thing that searching Google proves is that if you execute a particular search query on Google on a specific date, you get a set of results from whichever datacentre your query reaches, once Google has adjusted the results for whereabouts you are searching from, what your personal settings are, and according to whatever they have in their index at the time.
Which doesn't imply a great deal of accuracy in determining what has historically been published on the internet, when it was published, or by whom.
This does not stop people using the raw numbers Google puts up at the top of the page (which, like most search engine result counts, are quickly calculated guesstimates) to conclude that the BBC has published x% more articles about this, or y fewer articles with the words "Castro" and "dictator" in them than with the words "Pinochet" and "dictator".
You can't, though, rely on Google's figures or rankings being an accurate measure of the content of the BBC News site.
Take this example - at the time of writing, a search on Google for the words "nazi tiger" and restricting the results to the BBC News site brings up this story from February 23rd 2007 at #4 - Tiger kills girl at Chinese zoo
The word "nazi" does not appear in the text of the article at all - so why does Google rank it so highly for that search term?
Well, looking at the cached copy reveals that the story was last indexed on March 9th, when the number one spot in the "Most Read" stories section was occupied by "Living hell of Norway's 'Nazi' children".
Anyone using this kind of Google 'evidence' is failing to take into account what was in the right-hand menu at the time any given story was indexed (not published). That includes not just the headlines of what was the most emailed or read story on the BBC that day, but also the top stories from the same index section, the titles of related internet links, and the headlines of any previous story linked to that story ID by the journalist. All of which provide ample scope to combine words that do not actually appear together in the main body text of the article, which renders Google's figures virtually meaningless.
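The effect is easy to reproduce for yourself. If an indexer crawls the whole HTML page rather than just the article body, sidebar headlines get attributed to the article. Here is a minimal sketch in Python, using a made-up stand-in for the real BBC page structure (the `article` and `most-read` div names are hypothetical, not the BBC's actual markup):

```python
from html.parser import HTMLParser

# Hypothetical page: an article body plus a "Most Read" sidebar headline.
PAGE = """
<html><body>
<div class="article">Tiger kills girl at Chinese zoo</div>
<div class="most-read">Living hell of Norway's 'Nazi' children</div>
</body></html>
"""

class TextByDiv(HTMLParser):
    """Collect the text of each <div>, keyed by its class attribute."""
    def __init__(self):
        super().__init__()
        self.current = None
        self.sections = {}

    def handle_starttag(self, tag, attrs):
        if tag == "div":
            self.current = dict(attrs).get("class")

    def handle_endtag(self, tag):
        if tag == "div":
            self.current = None

    def handle_data(self, data):
        if self.current:
            self.sections.setdefault(self.current, []).append(data.strip())

parser = TextByDiv()
parser.feed(PAGE)
article = " ".join(parser.sections["article"]).lower()
whole_page = PAGE.lower()

# A "nazi tiger" query matches the full page, but not the article body alone.
print("nazi" in article and "tiger" in article)        # False
print("nazi" in whole_page and "tiger" in whole_page)  # True
```

The article body alone never contains "nazi", yet any index built over the full page text will happily return this story for a "nazi tiger" query, which is exactly what the Google result above shows.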
In the next post in this series I'll be looking at another three ways that the commentators on the Biased BBC site regularly try to demonstrate BBC bias and, to my mind, regularly miss the mark, before turning my attention to some areas where the Biased BBC site does raise interesting questions about the BBC.