I spent a lot of today in Stories From Numbers, a day of talking about data-driven journalism and linked data from the BBC’s Future Now project.
It was a dense day full of interesting talks. Here’s one of the highlights.
More or Less
Richard Vadon, Richard Knight and Olly Hawkins are from the team at Radio 4’s More or Less. They led us on a fascinating trip through the world of statistics. Debunking bogus numbers is one of the things the show does best. And they’ve got plenty of targets, like this video:
It’s alarmist and discouragingly popular (over 10 million views so far!). More or Less debunked the stats with some good primary research, but sadly their video hasn’t been viewed nearly as many times as its xenophobic instigator.
They shared some tips on how to deal with figures:
1. If a number seems bogus, it probably is.
No one is immune from this. Richard Vadon, the show’s editor, showed us some examples of BBC News at 10 stories based on data that turned out to be completely spurious. Where did the journalists get their data?
The Home Office.
Even official government material can contain errors, dubious analysis, or . . .
2. Beware of Chinese Whispers
Just because a number is repeated often doesn’t mean it’s right. “Bad stats are like zombies,” said Vadon. “They never die and they’re hard to kill.”
Doubly dangerous are studies that quote other studies. The Home Office figures that led to a false News at 10 report were quoting an independent study, which itself quoted a previous study. But statistics are sensitive. With each quotation, distortions can creep in along the way, and before you know it even government policy can be based on completely wrong information.
3. Read the Original Research
It’s oft-repeated that the UK is the most surveilled nation on earth, with 1% of the world’s population but 20% of the world’s CCTV cameras. The average Londoner is supposedly filmed 300 times a day.
These numbers originally come from a 2002 study. A look into that study’s methodology shows us that the statistic is highly questionable.
There are a lot of CCTV cameras in the UK, but they’re not on a central database anywhere. They’re put up by local councils, private landowners, local governments, companies and even private individuals. The only way to find out how many there are is to actually go and count them.
That’s what this study did. The researchers went along Putney High Street and Upper Richmond Road in southwest London, counted the number of cameras they could see, and marked that down. Then they extrapolated to all of London, and from there to the whole UK.
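To see how fragile that kind of extrapolation is, here’s a rough sketch. Every figure in it is invented for illustration, not taken from the 2002 study:

```python
# Illustration of how a street-level count blows up when extrapolated.
# All figures below are invented for the example, not from the 2002 study.

def extrapolate(cameras_counted, streets_sampled, streets_in_city, city_share_of_uk):
    """Scale a small street sample up to a city estimate, then a national one."""
    per_street = cameras_counted / streets_sampled
    city_estimate = per_street * streets_in_city
    uk_estimate = city_estimate / city_share_of_uk
    return city_estimate, uk_estimate

# Suppose we count 100 cameras on 2 streets, assume 50,000 streets in London,
# and assume London holds 60% of the UK's cameras:
london, uk = extrapolate(100, 2, 50_000, 0.6)
print(f"London: {london:,.0f} cameras, UK: {uk:,.0f}")

# If the sampled streets happen to have twice the typical camera density,
# halving the count swings the national estimate by about two million:
london2, uk2 = extrapolate(50, 2, 50_000, 0.6)
print(f"London: {london2:,.0f} cameras, UK: {uk2:,.0f}")
```

A couple of unrepresentative streets, multiplied out twice, and the error multiplies right along with them.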
There are some big problems with that logic: a busy shopping street has far more cameras than a typical road, so scaling up from it inflates the total. That means the BBC News story I linked to above is just plain wrong.
4. Ask: Is it a Big Number?
In 2006, one of the top stories was that the NHS, the UK’s state health provider, had a budget deficit of between £800 million and £1 billion.
That sounds like a lot of money, and by most counts it is.
But it’s only about 1% of the total NHS budget.
For comparison, the UK Treasury applies a margin of error of 2% when it assesses the budget of the entire UK government. So if the NHS can accurately count a shortfall of 1% on a scale of £1 billion, they’re actually doing really well.
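The “is it a big number?” check is just a quick division: scary figure over relevant total. A minimal sketch, where the total NHS budget is an assumed round number for illustration, not an official figure:

```python
# Put a scary-sounding deficit in context of the total budget.
# The budget figure below is an assumed round number, not an official one.

deficit = 1_000_000_000          # £1 billion shortfall (the upper estimate)
nhs_budget = 90_000_000_000      # assumed total NHS budget for the year

share = deficit / nhs_budget
print(f"Deficit is {share:.1%} of the budget")

# The Treasury applies a 2% margin of error to the whole government budget,
# so a measured shortfall of around 1% sits inside normal accounting noise.
print("Within the Treasury's 2% margin:", share <= 0.02)
```

Two lines of arithmetic, and a headline number turns into a rounding error.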
One of the things I found most heartening about this talk is that the guys from More or Less were quite candid about their qualifications. “We’re all arts graduates, like most journalists,” said Richard Vadon. It doesn’t take a mathematician to do this kind of journalism – just a bit of applied intelligence. That’s good news for an arts grad like me.
“Journalists as a whole aren’t very good at spotting bad numbers, but we should get better,” Vadon said. Amen to that.