Friday 25 April 2014

Lies, damned lies, and Cardiff University academics



Mark Twain popularised the killer quote, "There are three kinds of lies: lies, damned lies, and statistics" - though he credited it to Disraeli, and nobody really knows who coined it.

I've always felt that to be deeply unfair to statistics. 

Statistics are invaluable. It's how they are used - and misused - that's the problem. 

Think of it this way, if you like. People (though not Mark Twain) sometimes say, "There's no such thing as a bad dog, just a bad owner." Well, I'd say there's no such thing as a bad statistic, just bad users of statistics.

As someone who (in the past) based an entire BBC bias-related blog around statistics, I'm bound to say that, of course. But many people (besides me) cling to statistics, with hope in our ever-optimistic hearts, as the means of finally nailing the BBC down over its strangely slippery bias.

That's why high-profile statistical reports about BBC bias always create such a stir (in certain circles).

Not that they usually seem to do much good. 

However watertight they may be, people who dislike their findings are free to dismiss them (and they'll find plenty of like-minded people cheering them on as they do so).

Similarly, it's a simple fact of life that reports that come to conclusions about BBC bias which confirm our personal opinions tend to be the ones we find most persuasive, and about which we are least critical. 

Thus, my world-famous interruption stats proved popular with UKIP and certain Conservative MPs, plus right-leaning readers of blogs about BBC bias. Left-wingers ignored them. The BBC dismissed them. 

Can you guess what I found?

Similarly, a Centre for Policy Studies report investigating the BBC News website's coverage of right-wing and left-wing think tanks - gauging each think tank's leaning by its citations in the right-wing Telegraph and the left-wing Guardian - found an enthusiastic response at the Telegraph and at right-wing blogs but got trashed at the Guardian and left-wing blogs, and the BBC actively rubbished it.

Can you guess what it found?

Plus (and from a different perspective), when Labour activist Phil Burton-Cartledge published findings about BBC bias on Question Time on his A Very Public Sociologist blog - along with his own interpretations of those results - the left-wing Twittersphere and the New Statesman were delighted. Right-leaning blogs like Biased BBC weren't impressed and couldn't believe what they were reading. 

Can you guess what Phil found?

Next: when John Robertson, an academic from the University of the West of Scotland, provided detailed statistical evidence of pro-union/anti-independence bias in the BBC's coverage of the independence debate, pro-independence/pro-SNP people danced a ceilidh while anti-independence/pro-union types cocked a deaf 'un. The BBC went on the warpath.

The most influential BBC bias-related statistical report of recent times, though, came from Cardiff University. It provided the statistical backbone for the BBC's own 'independent' study into the BBC's coverage of immigration, the EU and religion - the Prebble Review - before developing a life of its own (thanks to the efforts of the left-wing academics behind it) and finding great favour with left-wingers like Owen Jones. It appalled the Right.

Can you guess what it found? 

We've dealt with all of these before at Is the BBC biased? and, I hope you'll agree, dealt with them fairly - except, of course, as regards my interruption stats, which are an unassailable paragon of objectivity and statistical rectitude and, thus, completely beyond criticism!

As a result, the CPS findings were found (by me) to be flawed, even though they accord with my own findings and right-wing intuitions, and Phil's findings on Question Time were given one-and-a-half cheers (not three cheers and not no cheers), despite running counter to my own suspicions and despite Phil's left-wing gloss on them. The Scottish independence report was only touched on briefly, and I can't really give a proper opinion on it without wasting hours and days investigating it (which ain't gonna happen), though it looked pretty convincing to me, even though I think a 'yes' vote in Scotland would be a very bad thing.

That's me plugging my attempts to be fair-minded there. (You'll have spotted that, of course, being shrewd types.) It's me saying I 'dis' the Right and 'give respect' to the Left if I see the evidence inclining their way.

I can't help it. That's the way I am. So gimme a medal for it.

But even I was far from respectful towards the most influential report of all, and joined in the robust rubbishing of it by many on the Right.

This was the Cardiff University report, which I regarded as a crock of shite right from the start.

It has a curious history. It was commissioned in part by the BBC itself, for the 'independent' Prebble review into BBC bias. The Prebble Review used it to advance the view that the BBC was, by and large, impartial - much as you'd expect from an official BBC 'independent' report. Then the Cardiff University academics behind it began giving it a new life, using it to advance the startling view that the BBC is pro-Right in its bias.

The Left loved it (some, no doubt, with tongue firmly in cheek). We right-wing bloggers were aghast. Alan at Biased BBC bashed it. I bashed it here at Is the BBC biased?

And we were right to do so.

One of our lines of attack on it smacked somewhat of the ad hom fallacy, but - as fallacies go - this one had legs, and it was worth using them to run with. Let's sprint through it again.

We noted the BBC Trust's direct funding for the report. 

We noted that Cardiff Uni prof Richard Sambrook (former director of BBC News, the BBC World Service and BBC Global News) wrote part of it. 

We also noted that the lead author charged with publicising it, Mike Berry, is associated with the far-Left Glasgow Media Group (a group that tends to see blues under the bed). 

Plus, we noted that the other listed authors included the far-Left Red Pepper writer (and European Commission-commissioned academic) Karin Wahl-Jorgensen, Kerry Moore (who has written about the British media's ill-treatment of Muslims), Lina Dencik (who describes herself as an 'activist') and Arne Hintz (who also describes himself as an 'activist')...

...in other words, as fine a collection of disinterested academics as you could possibly hope to find - as I'm sure you'll agree!

The report got another media boost (in the left-leaning press) when Cardiff Uni media department boss-man Justin Lewis accused the Beeb of downplaying the report it funded/commissioned and of suppressing its findings that the BBC is pro-Right. We noted that Prof Lewis is a dyed-in-the-wool leftie whose take on the BBC comes from the far Left.

We were suspicious, to say the least, of a BBC-backed report which the BBC used ('independently') to back its contention that it is an unbiased broadcaster of news. 

We were even more suspicious - given how convenient it is for the BBC to be able to advance one of its favourite arguments, "We get complaints from both sides; therefore, we must be getting it about right" - that the BBC-backed academics behind it (including a VERY senior former BBC news supremo) then trotted out the remarkable claim that the BBC is actually right-biased.

Moving away from this line of argument, I complained that the Cardiff report used "a very small sample, the exhaustive concentration on which could lead to some very skewed results", contrasting that (modestly) with "my own intensive period of research", noting that "a massive sample such as this is surely preferable to a small sample, such as that used by the Cardiff team. Small samples are more prone to lead to statistical errors."

Well, blow me down with a feather, but there's a new report out today from the right-leaning think tank Civitas that rubbishes the Cardiff report along much the same lines as Is the BBC biased? and Biased BBC.

It's an easy-to-read 21-page report (and that even includes the footnotes). It may even turn out to be shorter than this post.

I both recommend it and give my thanks to its authors, David Keighley and Andrew Jubb, for agreeing with me over everything I've ever said on the subject. I only wish they'd dedicated their report to me. [For any passing non-British people, this is an example of self-deprecating British humour. I don't actually mean it.] [[Or do I?]]

David and Andrew find that the BBC's Prebble report is "seriously flawed". They discuss Stuart Prebble's links to the BBC, the links between the BBC and Cardiff Uni's media department [another senior member is Richard Tait, former BBC trustee and Newsnight editor], and the links between the university project director and the EU, saying the independence of the project is "severely compromised". They note (something we didn't) that the report was commissioned by BBC trustee David Liddiment, an intimate of the self-professed "liberal progressive" Mr Prebble.

Excellent, compelling ad homs, as I'm sure you'll agree, but as a stats lover it's the critique of Cardiff Uni's methodology that really interests me...

...especially as it backs up my own criticism here at Is the BBC biased?... [sorry for the break, but I've had to move my big head off the keyboard to type again]... that the Cardiff team based their study on what I at the time called "a very small sample, the exhaustive concentration on which could lead to some very skewed results" (contrasting it, oh-so-modestly again, with my own nine-month, exhaustive study of every interview with a party politician - some 2,200 of them - on all of the main BBC current affairs programmes over that period), adding that "A massive sample such as this is surely preferable to a small sample, such as that used by the Cardiff team. Small samples are more prone to lead to statistical errors."
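
To see why sample size matters so much, here's a minimal sketch in Python. All the numbers are invented for illustration - the 'true rate' of 50% and the Cardiff-ish sample of 40 are my assumptions, not figures from either report. It runs the same survey over and over and shows how wildly small-sample estimates swing compared with large-sample ones:

```python
import random

random.seed(42)
TRUE_RATE = 0.5  # an invented 'true' rate of some measurable behaviour

def observed_rate(sample_size):
    """One simulated survey: the fraction of sampled items showing the behaviour."""
    hits = sum(random.random() < TRUE_RATE for _ in range(sample_size))
    return hits / sample_size

# 1,000 simulated surveys at a small (Cardiff-ish, assumed) size,
# then 1,000 at a 2,200-interview size.
for n in (40, 2200):
    rates = [observed_rate(n) for _ in range(1000)]
    print(f"n={n:>4}: estimates ranged from {min(rates):.1%} to {max(rates):.1%}")
```

With n=40 the estimates can easily stray ten or fifteen percentage points either side of the truth; with n=2,200 they barely budge. That's all "small samples are more prone to lead to statistical errors" means.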

David and Andrew's criticism of the Cardiff report focuses on several areas of concern that those familiar with statistics will be well aware of. 

The Cardiff team studied the Today programme, understandably. But they only studied part of it - the same one and a half hours each day (i.e. only half of its air time) - and missed an entire eight editions out of their one month's worth of programmes in 2012. That's something I would never have countenanced when I did my nine-month survey of BBC interruptions. (A single missed edition, I felt at the time, would have scuppered it.) Cardiff covered too short a period and missed too much.
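
As a back-of-envelope check on how much output that fixed window throws away (assuming Today's then schedule of 6.00-9.00am on weekdays and 7.00-9.00am on Saturdays - my assumption, not a Civitas figure):

```python
# Weekly Today airtime in minutes, assuming 6.00-9.00am Mon-Fri
# and 7.00-9.00am Saturday (schedule assumed, not taken from the report)
weekday_airtime  = 5 * 180          # Mon-Fri: three hours a day
saturday_airtime = 1 * 120          # Saturday: two hours
weekly_airtime   = weekday_airtime + saturday_airtime   # 1,020 minutes

# What a fixed 7.30-9.00am window (and no Saturdays) captures
sampled_weekly = 5 * 90             # 90 minutes, weekdays only

print(sampled_weekly / weekly_airtime)   # ≈ 0.44 - well under half the output
```

And that's before knocking off the eight editions they missed altogether.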

This also provides an example of what are known as 'constant errors'. By omitting the first hour and a half of the programme and the Saturday editions, they left out such things as the (pro-EU-inclined) business slot at 6.15-6.30 and, even more importantly, the mass of BBC interviewer-BBC reporter interviews that dominate the early stages of every Today programme. Why didn't they monitor Today over the entirety of the programme, or over different periods of the programme each day? Why be so rigid?
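
The point about 'constant errors' is that, unlike random sampling noise, they don't shrink as you gather more data. Here's a minimal sketch - every rate is invented purely for illustration - of what happens if pro-EU items really did cluster in the omitted early slots: a fixed 7.30-9.00 window stays skewed however many days you monitor.

```python
import random

random.seed(0)
P_PRO_EU_EARLY = 0.6   # invented: chance an item before 7.30am is pro-EU
P_PRO_EU_LATE  = 0.2   # invented: chance an item after 7.30am is pro-EU

def simulate(days):
    """Tag six 'early' and six 'late' items per day as pro-EU or not."""
    early = [random.random() < P_PRO_EU_EARLY for _ in range(6 * days)]
    late  = [random.random() < P_PRO_EU_LATE  for _ in range(6 * days)]
    return early, late

for days in (23, 230, 2300):        # more days never cures the skew
    early, late = simulate(days)
    window_rate = sum(late) / len(late)                     # fixed-window estimate
    true_rate   = (sum(early) + sum(late)) / (len(early) + len(late))
    print(f"{days:>5} days: fixed window says {window_rate:.0%}, truth is {true_rate:.0%}")
```

The gap between the window's answer and the truth is the constant error - extra days just pin the wrong number down more precisely. Rotating the monitored periods, as suggested above, would have averaged it away.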

Then there's 'convenience sampling', where the Cardiff researchers reached back, grabbed and recycled a study from 2007 (something 'at hand') to compare with 2012, pegged to the BBC Trust's 2007 Bridcut Report into impartiality. This was presented as a 'before and after' but, as they had no study at hand from before Bridcut, they had to use one from just after it - making both samples post-Bridcut.
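
Why that matters is easy to show with invented numbers (all three figures below are made up purely to illustrate the design flaw): if both snapshots sit after the supposed turning point, the comparison can only measure post-Bridcut drift, never the effect of Bridcut itself.

```python
# All three figures are invented, purely to illustrate the design flaw.
pre_bridcut = 30.0   # some measure of coverage BEFORE the 2007 Bridcut Report
post_2007   = 24.0   # snapshot taken just AFTER Bridcut (the study 'at hand')
post_2012   = 22.0   # the 2012 snapshot

bridcut_effect  = post_2007 - pre_bridcut   # -6.0: what a real before/after would measure
cardiff_can_see = post_2012 - post_2007     # -2.0: all a post/post comparison reveals
print(bridcut_effect, cardiff_can_see)
```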

David and Andrew's research is far more extensive and comprehensive. They recorded everything over the period Cardiff covered (luckily for us). They spotted 21 guests in that month in 2012 who spoke in favour of the EU/EU legislation. Fully 12 of them would have been missed by the Cardiff team because they appeared before 7.30 am or on Saturday. 

Worse, due to the Cardiff team's extremely restricted choice of criteria as to what constituted a relevant EU discussion, all but one of the remaining nine speakers also fell out of the Cardiff survey's field of sight. Only one of those 21 pro-EU guests, therefore, registered on the Cardiff survey. Wiping 20 pro-EU speakers out of a survey of pro-EU BBC bias certainly strikes me as a good reason to dismiss the Cardiff team's findings. As the Civitas researchers say, 
this methodology [by Cardiff] would have identified only 4% of the total number of pro-EU speakers who appeared on Today between 15 Oct and 15 Nov 2012.
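
The arithmetic behind that figure is easy to reproduce. The guest counts below come from the Civitas figures just quoted; the rounding down from roughly 4.8% to the report's headline "4%" is my reading of how Civitas got there:

```python
total_pro_eu    = 21   # pro-EU guests Civitas identified in the month
missed_schedule = 12   # appeared before 7.30am or on a Saturday
missed_criteria = 8    # all but one of the remaining nine, excluded by topic criteria

captured = total_pro_eu - missed_schedule - missed_criteria   # = 1
print(captured / total_pro_eu)   # ≈ 0.048 - the report's "only 4%" once rounded down
```
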
They then note what will sound remarkable to many of us - the way the Cardiff researchers blatantly conflated ALL UKIP and Conservative right-wingers as holding identical views on EU withdrawal - despite the fact (as we righties - unlike those Cardiff lefties - know very well) that there's a huge range of opinion on the Right about Europe, from UKIP 'withdrawalists' to Conservative 'stay-in-the-EU-and-renegotiate' types and Conservative Europhiles.

Even though the Cardiff team spotted a fall in UKIP voices on the BBC between 2007 and 2012, they (and Stuart Prebble) dismissed that because UKIP-type views were expressed elsewhere by Conservatives (many of whom didn't share UKIP's views on withdrawal). 

David and Andrew say
...in 221 full weeks of analysis of the Today programme, between 2005 and 2013, Newswatch’s research shows that Conservative withdrawalists appeared on only 14 occasions (equating to less than four times a year) and UKIP appearances outnumbered those by the Conservative withdrawalists by a ratio of almost six to one. Conservative ‘come out’ supporters made only 0.4% of the total EU-related speaker appearances, compared to 2.2% from UKIP. 
...which throws another spanner into the cogs of the Cardiff media machine.
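
Those Newswatch figures hang together arithmetically, for what it's worth - a quick check using only the numbers in the quotation:

```python
weeks = 221
years = weeks / 52                          # ≈ 4.25 years of monitored output

conservative_withdrawalists = 14
print(conservative_withdrawalists / years)  # ≈ 3.3 a year - "less than four times a year"

print(2.2 / 0.4)                            # 5.5 - "almost six to one", near enough
```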

The lesson of all this is that, in the absence of official BBC statistics on bias/impartiality (something the BBC never publishes), we shouldn't give up attempts to be statistical, but we should, nonetheless, be rigorously sceptical about any such statistics-based surveys that come our way...

...and, I suppose, any attempted rebuttal.
