Post-truth, post-Brexit statistics

The recent political comings and goings (the EU referendum, the arrival of a new Prime Minister and Labour’s travails) have coincided with a period of unusual attitudes to facts. More people seemingly want information, and yet the (accurate) use of facts by politicians, some elements of the media and quite a few people has fallen to new lows. Experts are being rubbished, institutions’ reputations are being damaged, and the media is accused of bias, prompting discussion of a post-truth society. There is much talk of a fractured Britain as technology and globalisation hasten economic disruption affecting many livelihoods.

This note sets out a few steps – go local, kill the average, be open, do good research, un-spin and tell good stories – that the statistics world might take to help people reconnect with reality and help policy makers understand what might be needed if we are to establish a more sensible approach to debate and policy. It has much in common with the Data Manifesto published by the Royal Statistical Society two years ago. 

Here are six themes that could be adopted and then progressively implemented.

1 – Make data local and personal  

The use of national macro data such as GDP does not engage people. Existing national data covering topics of interest to (real) people should be presented by drilling down to regions, local authorities and neighbourhoods. This idea is not new. A decade ago, when national crime figures were not trusted, a review was commissioned. The main conclusion of the 2006 Smith report on crime data was that “The focus must shift from the publication of the aggregate national picture to a system of communication which encompasses local data at local level between police and their neighbourhood communities.”

The desire for local area data was very apparent in David Cameron’s Prime Minister’s Questions sessions. Some days it felt like every answer included the rate of unemployment (and how it had fallen!) for the constituency of the MP who had asked the question. It was right that he tried to connect with MPs and voters by using constituency figures, but sad that all he had to play with was social benefits data – the only local data available in a timely fashion. (The House of Commons Library’s good work on social and economic statistics for UK parliamentary constituencies sets out the data available at constituency level, and shows that hardly anything timely exists.)

Beyond geography, the drive for broken-down national figures could include data for different demographics – young and old, married and single, and so on – who all have different experiences.

The experience with crime data should be replicated across all data on big issues affecting people. Consider these examples:

  • migration – there’s little understanding of headlines saying that last year’s immigration equalled the population of a town most people have never visited. Instead people want to know what’s happening in their own town.
  • labour market – an unemployment rate nationally of 5% or whatever is meaningless when most employment for most people is reasonably local. So, there is a need to explain the local labour market dynamic.
  • inflation – an inflation rate around zero for a long time does not resonate when people can see lots of rising and falling prices each week as they go about their business. So why not publish local prices – changes and levels?

In some cases, local area data will be relatively easy to deliver; in others it will require considerable investment in time and money. The desire for more local data should drive data priorities and initiatives. Speeding up the drive to use more admin data in the census would be a start.

2 – Think about distributions not just the “average”

Averages work well with certain questions and in certain contexts. If you want to know how far the moon is from the earth, then the average distance over, say, a year is a reasonable and useful answer. But it does not work as well for an individual who wants to know how their income compares to average earnings in the UK. The latter involves too many variables – region, job type, sex, qualifications, hours worked etc – that render the answer of little use in this context. The number exists, of course, but is it any use? It is better to focus on the range for different family types – the inter-quartile range, say, or, in the case of earnings, the figure that puts you in the top or bottom ten percent. The median – the midpoint of the observations when ranked – should be used more often than the mean as it is distorted less by extremes.
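To make the contrast concrete, here is a minimal sketch (in Python, using made-up weekly earnings figures, not real data) of how the median, the inter-quartile range and decile cut-offs describe a skewed distribution better than the mean alone:

```python
import statistics

# Hypothetical weekly earnings (pounds), skewed upward by a few high earners
earnings = [250, 310, 340, 360, 400, 420, 450, 480,
            520, 560, 610, 700, 900, 1500, 4000]

mean = statistics.mean(earnings)      # pulled upward by the extremes
median = statistics.median(earnings)  # midpoint of the ranked observations

# Quartile cut points and the inter-quartile range
q1, q2, q3 = statistics.quantiles(earnings, n=4)
iqr = q3 - q1

# Cut-offs for the bottom and top ten percent
deciles = statistics.quantiles(earnings, n=10)
bottom_ten, top_ten = deciles[0], deciles[-1]

print(f"mean £{mean:.0f}, median £{median:.0f}, IQR £{iqr:.0f}")
print(f"bottom 10% earn below £{bottom_ten:.0f}, top 10% above £{top_ten:.0f}")
```

With this sample the mean sits well above the median, because one very high earner drags it up – exactly the distortion the median avoids.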

In some cases the mean has been discredited, or at least complicated, by redefining the calculation. It might be a trimmed mean that excludes the very highest and lowest figures, or a move from one calculation to another, for example from an arithmetic to a geometric mean. One recent example is the story told by the newly released house price index from the ONS. The average price fell from £301k to £218k in one month under the new methodology. How many ordinary people would be interested in a set of statistics that appears so odd, given the plethora of data – from individual homes upwards – available on Zoopla?
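The gap between these different “averages” is easy to demonstrate. A small sketch with hypothetical sale prices (illustrative only, not ONS data or methodology) shows how an arithmetic mean, a geometric mean and a trimmed mean can diverge sharply when a few extreme values are present:

```python
import statistics

# Hypothetical sale prices in one month: mostly modest homes plus one mansion
prices = [150_000, 180_000, 210_000, 250_000, 320_000, 1_500_000]

arith = statistics.mean(prices)            # sensitive to the £1.5m outlier
geom = statistics.geometric_mean(prices)   # down-weights the extreme value

# A simple trimmed mean: drop the highest and lowest before averaging
trimmed = statistics.mean(sorted(prices)[1:-1])

print(f"arithmetic £{arith:,.0f}, geometric £{geom:,.0f}, trimmed £{trimmed:,.0f}")
```

All three are legitimate calculations, yet they give very different “average prices” – which is precisely why a change of method can move a published figure by tens of thousands of pounds overnight.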

Focusing on the extremes is often more helpful. It’s better for policy-making where you need to identify and help certain groups such as the poor or disabled, and need to make sure the wealthy are contributing. There is a danger of making too much of (minor) fluctuations or differences around an average.

A shift away from the near obsession with the mean will be increasingly possible with big/admin data and enhanced data visualisations. While an average will always be calculated, producers need to be clear about what a mean means, and its limitations.

3 – Be open

Being “open” applies to both data producers and data users.

  • Data producers should publish as much data as they can and as much detail about methodology as they can. In the case of government departments they are already meant to be doing this as per the government’s code of practice. The principle is understood but the practice is not delivered and the UK Statistics Authority does not press the point home in the areas it regulates. Freedom of information responses and other bespoke requests must be published too. Of course individuals’ responses to surveys or personal records need to be safeguarded but any aggregated data that is anonymised should be published. People can also manipulate disaggregated data and recalculate sub-totals that suit their interests.
  • We should also expect policy makers to publish the data (or research) they used or considered in preparing policy, alongside the Green or White Paper, as evidence or supporting material. There is a benefit to trust in showing the workings, not just the conclusion that falls from the black box. Perhaps the media too could be encouraged to be clear about their sources. When they write a story about data, it would be easy to hyperlink to the press release. This is not uncommon among bloggers and some news providers, but the major ones still don’t do it. Academics should publish all the data they use.

4 – More academic and think tank research focused on the real world

It is possible to characterise most research as either small-scale qualitative work or macro reworking of aggregate statistics. More research funding and effort should be directed to bringing the two together – in other words, to relating the human experience to the national figures. For example:

  • productivity – many firms are doing well yet the national figures are poor. Why? Firms’ actual performance needs to be compared with the data they report when filling in forms. The micro performance of firms from different countries should also be compared to see why some are more productive than others.
  • government accounts – there is a huge disconnect between an individual’s experience of a public service and the national debt. There is no link between councils’ itemised spending (now published in many places in Excel sheets) and the impact on families of public spending cuts or “austerity”.
  • immigration – there is a need to relate the aggregate numbers of arrivals to the changes that occur to individuals. This could cover all elements including lifestyle, housing, access to services etc, not just wages.
  • globalisation and technology – weigh up the plusses and minuses for families and nations. Generally “progress” boosts performance as measured by country macro statistics (access to cheap labour, more imports and exports), but can be less good for individuals (less secure and more rapidly changing employment and lower wages, set against cheaper products and a greater range of them).

Such research would not be easy but it would be interesting and offer far more value than much of what gets funded currently. It would also have the effect of improving the quality and understanding of data as it gets used and tested.

5 – Present data un-spun

All data producers need to be encouraged to present the methodology and background to their data, and to present the results separately from the commentary. There are protocols for this in government, but they do not cover all data. Most data is produced commercially with an agenda (clear or hidden). Trade associations, businesses and charities do not collect, collate and present data without hoping that it’ll serve a purpose, and be to their benefit. This raises all sorts of issues about data quality or spin. Can you trust a headline if the underlying methodology and detailed data are not published? This is less of a problem with data published by, say, the Bank of England or the Office for National Statistics, but most users of the data, or people who read media reports, will not have a clear idea of which number is more trustworthy than another. And many people simply do not trust statistics. If data are presented by so-called experts who are perceived not to be independent, then the value of the information in the data will be less in the eyes of the reader. And the media will sometimes highlight different elements of a release to suit their prevailing narrative.

6 – Tell a different story with the data

Most data publishers, including the ONS and others in the GSS, are already starting to tell a story with data rather than just publishing it with basic commentary. Too often this takes the form of dumbing down the narrative. Instead they could try harder to answer the questions that people might want answered, or to link data sets and new administrative data sources to give a richer interpretation or allow readers to compare and contrast. Data producers need to be realistic about the accuracy of the data – hopefully it’s good, but it is unlikely to be gospel. Even then it might not be measuring what people want measured, or things that are relevant to them.

Next steps:

There are some positive signs that the nation is heading in the right direction:

  • There was a very useful event at the Institute for Government yesterday asking if the public had had enough of experts. The sensible comments of the panel members from the RSS, Full Fact, Britain Thinks and The Times, and the enthusiastic support of those attending, showed that much thinking is already going on.
  • Kath Viner, in her Guardian article of 12 July 2016, “How technology disrupted the truth”, explained how an established truth used to be “fixed in place” by the establishment and handed down. This arrangement is now broken, as anyone can challenge much of what is presented as fact – particularly if the facts in question are uncomfortable, or out of sync with their own senses – and spread their own “truth” on social media. The “death of journalism” caused by a fall-off in funding, she said, makes it harder to establish something close to consensus on facts.
  • A speech by Andrew Haldane, chief economist of the Bank of England, in June asked awkward questions about the nature of the economic recovery. Positive economic aggregates did not translate to improvements in the lives of many: “For many, the economic recovery is visible and tangible – in sales, in jobs, in investment. But for others it is barely visible and for some non-existent. How to reconcile the macro data with these micro accounts? Were these stories outliers? Or was I neglecting an important missing ingredient in the UK’s economic fortunes? Put differently, whose recovery were we actually talking about?”
  • A joint project to identify what information is needed to inform public decisions over the next five years and make sure it is both available and well-communicated in advance is shortly to be announced, involving among others Full Fact.

That’s all very good mood music but the statistics community, led by the UK Statistics Authority, must redouble its efforts to encourage government to fund statistics and use the data openly as inputs to its policy. (Its current business plans are a bit tame.) The changes above would help the facts to connect with people – GDP does not, but a version of “my income” and “my prices” would. Statistics are one way to help an informed public be a larger voice than the misguided or ignorant mob. UKSA must step up.

Facts are mostly about the past. Facts about the future are few and far between and a distinction needs to be made between the past and forecasts. This was a major issue in the EU referendum campaign. Full Fact wrote a piece about the possible pitfalls in modelling the future. That said, if we have an agreed, believable, vibrant description of where we are now, we are half way to having a basis for debate about the future.


There is nothing in this agenda that could not have been done before Brexit, but the weaknesses exposed and amplified during the debate make the changes more pressing for the statistics community if it is to play its part. The fact of Brexit has disrupted how the nation’s institutions work and people’s expectations, and so gives an excuse – or a need – to change society’s approach to statistics. Consider:

  • The new PM, like her predecessors, will want to be able to show that she is heading in the right direction and is more likely to be able to achieve that in a sceptical era with a revitalised approach to data.
  • Post-Brexit, ministers and policy-makers will be more accountable, being unable to hide behind EU-related excuses. The buck stops with them. They can rebuild trust and develop consensus by being open and promoting a new evidence base when setting out white papers and policy announcements.
  • Without the need to accept EU-related compromise and “group think” there is a chance to promote free thinking, and help the UK to develop its own solutions – and be a real world leader in data.
  • Any freedom from EU rules about data definitions, and being able to pick which common standards to follow, can only be helpful at a time when admin data is likely to replace traditional census and survey sources.

The steps sound simple enough but they do require something of a revolution for them to be taken. The country needs it.

