It has often been noticed that people are happy to admit they are bad at maths. There could be an interesting study into the difference between things where people are happy to confess incompetence and those where their reputation must remain unstained. In category one, you will find dancing (both Michael Portillo and Danny Baker have confessed to having “two left feet” in the past week), speaking in public (a surprisingly common feature of nightmares) and cooking (“I can burn cornflakes, me”). Category two will include driving ability, sense of humour (in both of these, we are lucky that the entire population is above average) and love-making. In this latter category, our smirking Lothario will tell us “I’ve had no complaints”. Well, maybe not, but it isn’t an area where a sophisticated independent quality-assurance feedback system has been developed.
What do people mean when they say they’re bad at maths, and does it matter? Surely it cannot be a matter of mental arithmetic? It may be useful for the dart player to know straight off that you can get a 144 finish with a treble 18, double 20 and bull, but is that skill needed when we move away from the oche? In today’s world, we have an array of calculators and spreadsheets to check calculations. I can remember a comic hero – was it Dennis the Menace or Lord Snooty? – who took a magical calculator into an exam on his wrist. It seemed science fiction then, but now that facility is in most phones. No one these days would calculate the stresses on a bridge, or the costs of a project, on the back of an envelope. OK, splitting a restaurant bill is no time for calculating to the second decimal place, but it is worth doing a digital check even for things as simple as ordering floor tiles or sharing holiday costs.
I am inclined to feel that arithmetical accuracy from mental calculation is not the most important matter. What matters is having a good understanding of approximations and relativities. If you can get a rough idea of how large a sum is, or of its relation to other comparable magnitudes, you are close to the truth. It will also tell you whether your use of a calculator or spreadsheet is right (and stop you ordering 40 sq metres of vinyl floor for the shower room). But it is also a help when looking at news items and policy decisions.
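A rough check of that kind takes a couple of lines. The sketch below is purely illustrative – the room dimensions and the waste allowance are my assumptions, not figures from anywhere in particular – but it shows how an approximate calculation flags an order that is out by a factor of ten:

```python
# Illustrative back-of-envelope check (room dimensions and waste allowance are assumed).
room_length_m = 2.2     # a typical shower room, roughly
room_width_m = 1.8
waste_allowance = 1.10  # order about 10% extra for cutting

rough_area = room_length_m * room_width_m * waste_allowance
quoted_order = 40       # sq metres supposedly needed

print(f"Rough estimate: {rough_area:.1f} sq m")                          # about 4.4 sq m
print(f"Quoted order is about {quoted_order / rough_area:.0f}x too big")
```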
Take the lazy connection that allegedly explains inequality by the fact that it’s natural – just as some people are cleverer or taller than others. The problem with this genetic explanation of inequality lies in the proportions. Genes certainly have a role in determining matters like height and intelligence. A professional basketball player is about 50% taller than someone with restricted growth; someone with learning difficulties has an IQ maybe half that of a genius. Yet chief executives now earn about 120 times more than ordinary workers: it is difficult to attribute that to DNA. For those who want an amusing but devastating illustration of this, see Jan Pen’s parade.
And then there is the almost routine confusion of millions and billions. Maybe this isn’t that important: clear errors rarely are. The newsreader who looks solemn as she announces a monthly government deficit of £1.7 million is just reading a misprint. It’s a simple cock-up when today’s Times converts the $5bn US spending on drones to £3.14m. Anyone interested will see that this is out by a factor of a thousand. Probably the bigger offence here is the spurious accuracy, another feature of everyday media innumeracy. I doubt the US spends precisely $5bn, so converting to the second decimal place is rather silly. A related nonsense is ascribing precise values to the effects of specific events or policies. Nassim Nicholas Taleb’s first book, Fooled By Randomness, rightly jeered at media reports like “Wall Street was down five points on fears about the Japanese yen”, when the movement is so small that it is just noise. Think of that the next time a charity says how many deaths would be avoided by following its advice, or how many hours the nation spends on a particular activity. On other occasions, governments just make silly claims because they sound good. In The Times (3rd March 2012) the government claimed to have increased a 90% tax return rate by 15%, which gives support to their earlier worries about Britain’s standards of numeracy (The Times, 2nd March 2012).
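The currency check is trivial. The exchange rate below is an assumption (roughly the rate implied by the figures quoted above); the point is only the order of magnitude:

```python
# Order-of-magnitude check on the drones figure (exchange rate assumed, about $1.59 per £).
usd_spend = 5e9          # $5bn
usd_per_gbp = 1.59
gbp_spend = usd_spend / usd_per_gbp

print(f"£{gbp_spend:,.0f}")   # about £3,144,654,088 – i.e. £3.1bn, not £3.1m
print(f"The reported figure is out by a factor of {gbp_spend / 3.14e6:,.0f}")
```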
Being a sad man, I sometimes check the calculations for project costs given by news outlets (often taken straight from the PR sheets of the beneficiary organization). Here’s a recent one. The South Yorkshire Police tells us (and so the BBC repeats) that it has to charge charities for support at their events because it was forced to spend £500,000 policing two marches by the extreme right. Well, £500,000 at £20 an hour buys 25,000 hours of police time, which, if each demo lasted 5 hours, means 5,000 officer-shifts across the two marches, or 2,500 officers on duty at each. Er, really? The whole force has fewer than 3,000 officers. This sort of error is not rare. The Blair Government launched a childcare initiative to great enthusiasm from a media that seemed unable to work out that the proposed subsidy came to 30p per hour.
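For what it’s worth, here is that check written out. The £20 hourly rate and the 5-hour demo length are the rough assumptions used above; the force strength is the approximate figure already quoted:

```python
# Back-of-envelope check on the £500,000 policing claim (rate and durations as assumed above).
total_cost = 500_000        # £ claimed for policing two marches
cost_per_hour = 20          # £ per officer-hour (rough assumption)
demo_length_hours = 5       # assumed length of each march
num_demos = 2

officer_hours = total_cost / cost_per_hour            # 25,000 hours
officer_shifts = officer_hours / demo_length_hours    # 5,000 shifts in total
officers_per_demo = officer_shifts / num_demos        # 2,500 officers at each march

print(f"{officer_hours:,.0f} officer-hours, {officers_per_demo:,.0f} officers per march")
print("Force strength: fewer than 3,000 officers in total")
```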
And then there is statistical illiteracy. This is rarely as bizarre as the claim that no college (Ed Balls at the Association of Colleges conference) or police force (Philip Collins in The Times, 16th November 2012) should be allowed below-average performance. There are books to be written (and some have been) on how we understand or misunderstand statistics, good and bad, and on how the public do not understand risk. There is even a Professor of the Public Understanding of Risk at Cambridge University, but to little effect. Did you know that more people were killed after 9/11 by their decision to travel by (risky) road rather than (safe) air than actually died in the World Trade Center atrocity on the day itself? The distribution of rare events is easily misunderstood. Fatal traffic accidents fall at a junction where a safety camera is installed, but they also fall at the junctions where one is not. They happen in clusters, almost randomly, and after a clutch of them in one place there is likely to be a period when none happen.
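That clustering point is easy to demonstrate. The sketch below is a toy simulation – the accident rate and the number of junctions are invented for illustration – in which accidents occur purely at random at identical junctions, yet the sites that were worst in year one improve in year two with no camera, no intervention, nothing but regression to the mean:

```python
# Toy simulation of regression to the mean (all parameters invented for illustration).
import random

random.seed(1)
num_junctions = 1000
mean_accidents = 2.0   # every junction has the same underlying accident rate

def simulate_year():
    # Crude Poisson-like draw: sum of many small Bernoulli trials per junction.
    return [sum(random.random() < mean_accidents / 100 for _ in range(100))
            for _ in range(num_junctions)]

year1 = simulate_year()
year2 = simulate_year()

# Pick the "blackspots": the worst 5% of junctions in year one.
threshold = sorted(year1)[int(0.95 * num_junctions)]
blackspots = [i for i, n in enumerate(year1) if n >= threshold]

avg_before = sum(year1[i] for i in blackspots) / len(blackspots)
avg_after = sum(year2[i] for i in blackspots) / len(blackspots)
print(f"Blackspot average, year 1: {avg_before:.2f}")
print(f"Blackspot average, year 2: {avg_after:.2f}  (no camera installed)")
```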
Sometimes, when there is enough information in an article, you can see that the interpretation being put upon it is wrong. In December 2009, The Times published an article headlined “Extra billions fail to raise school standards”; yet it featured a diagram showing that results and attendance were by then 35% better than in 1996. The extra spending plainly had raised standards. It was true that productivity (that is, output per unit of input) had not risen – but that is a different thing. If we have smaller class sizes and better-paid teachers, productivity falls – just as it does in the NHS when we have more nurses on a ward.
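The distinction between output and productivity is just a ratio, so a worked example makes it plain. The spending figure below is hypothetical (no figure is given above); the 35% improvement is the one quoted from the diagram:

```python
# Output can rise while productivity falls (the spending increase here is hypothetical).
output_index_1996 = 100
output_index_2009 = 135        # results/attendance 35% better, as the diagram showed
spending_index_1996 = 100
spending_index_2009 = 180      # hypothetical: spending rose faster than results

productivity_1996 = output_index_1996 / spending_index_1996   # 1.00
productivity_2009 = output_index_2009 / spending_index_2009   # 0.75

print(f"Output up {output_index_2009 - 100}%, "
      f"productivity down {100 * (1 - productivity_2009 / productivity_1996):.0f}%")
```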
Maybe the overall problem is that people rarely look at any kind of evidence when forming their social or economic views. If you like low taxes, then low taxes improve the economy. If you like high public spending, then that will pull us out of the recession. Agree with capital punishment and it deters murderers; disagree with it, and it has no effect. The government established NICE (the National Institute for Health and Clinical Excellence) some years ago to make calm assessments of the effectiveness of medical treatments. Maybe we need similar bodies in economics, crime and social policy.
P.S. A couple of sources of sense about maths and stats: More or Less, from BBC Radio 4 (also available on the World Service), and The Joy of Stats on BBC TV – cheesy name, fine programme. Paul Krugman, a Nobel Prize winner, has recently penned an article about how much maths you need to be a top-flight economist. His view is, er, not a lot. Which is a relief to me, as I gave up reading economics journals when they started with the word “If”, followed by a page of equations, and then the word “Then”: I think it was the algebraic mob who declared that a collapse of US house prices was less plausible than a planetary collision.