Lightbox Loves: Don’t Get Your Statistics In A Twist…


       There is more money going into schools than ever before, £43.5 billion by 2020¹

       60% of the time, it works every time²

       Ali Dia was as good as Steven Gerrard³

These stats come from various sources: government, film, this article’s author. But they all share one thing in common: none of them is really true. Herein lies the problem when dealing with statistics: the line between truth and fallacy is often blurred. The first statement has been inflated by the Department for Education by including university students paying their tuition fees¹. The second is statistically impossible. And the last compares one footballer who played less than an hour with one who played for years, winning several trophies. But if the stat is based on Premier League titles, neither man won one, so the statement is arguably true.

Once the domain of research teams, statistics are now something an increasing number of people deal with daily. This is great: people have access to more information than ever before, allowing for more informed decision-making. There is a downside, however; the need for speed, the lack of budget for bespoke research, little training, and authors’ agendas often mean the statistics presented don’t mean what we’re led to believe.

So how do you avoid falling foul of misleading statistics? Firstly, there are industry bodies designed to support us: the UK Statistics Authority, the Market Research Society, even the ASA. Closer to home, ask your research team; we’re increasingly asked to sense-check research from outside the agency. There are also a number of things you can do yourself. Here’s a checklist, with examples, to help you judge how genuine a given statistic is:

        Who’s been asked? When gauging the public’s opinion about a new car, should we be asking cyclists, or would a group of people with valid UK driving licences be more relevant?

        Have enough people been asked? You often see in adverts that 86% of 83 people agreed with X. Is this enough to be trusted? Probably not.

        Is the sample representative? If 1,000 people have been asked about alcohol consumption, do they match the wider drinking population in terms of age, gender, region, etc?

        What’s included in the definition of the stat’s subject? The Department for Education statistic above is a great example of how including or excluding elements can distort the figures.

        What’s the context? When looking at campaign effectiveness, knowing awareness was 82% after the campaign sounds brilliant. But what was it before? Without this piece of info, the 82% is near meaningless. Awareness could have been 90% prior to the campaign!

        Who wrote the statistic? The ease with which stats can be shaped facilitates author bias, as seen with my Steven Gerrard example.
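On the sample-size point, a quick back-of-the-envelope check makes the problem concrete. This is only a sketch, using the standard normal-approximation margin of error for a proportion (the z = 1.96 figure is the usual 95% confidence level, not something from the advert itself):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p
    observed in a sample of n people (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# The advert's claim: 86% of 83 people agreed.
moe = margin_of_error(0.86, 83)
print(f"86% +/- {moe:.1%}")
```

With only 83 respondents, the true figure could plausibly sit anywhere from the high 70s to the low 90s (a margin of roughly ±7.5 percentage points); a sample of 1,000 would shrink that to around ±2 points.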

It’s important to approach all statistics with a suspicious but open mind. Not all statistics are disingenuous, and your gut feel isn’t always correct. Asking these simple questions of every statistic you come across is a quick and easy way to make sure you can trust the numbers!

1. BBC, “School funding ‘exaggerated’ by ministers, says watchdog” (8th October 2018)
2. Anchorman (2004)
3. Me!


