This blog proposes a change in the way the UK Statistics Authority operates. Pretty much anyone (outside UKSA) who has a view on the matter feels that it is not operating as envisaged at the time of the legislation. At very modest cost (and without revisiting the Act), the UKSA board could implement a structure with clearer responsibilities and greater effectiveness. Reinvigorated stakeholder relationships, with enhanced transparency and trust, would set up the government statistics machine to play a central role in producing data both for policy making and for the public good. Statistics should be about so much more than producing the same figures as last year, and the Bean review can point the service in that direction.
The 2007 Statistics Act contained a fundamental flaw: it asked a single “Board” to be both producer and regulator of statistics. (See, for example, sections 20 and 8 of the Act.) It is hard to imagine why some in government fought so hard to impose such ambiguity. It is almost as if they wanted the system to fail. Alas, UKSA has chosen to interpret the Act in a way that limits its role.
The resulting issues facing UKSA include:

- an internal rather than outward-looking focus;
- a low profile;
- a lack of transparency about UKSA activities (the only way to cover up the ambiguities of the chosen path);
- a failure to build public awareness and trust;
- a weak relationship with Parliament;
- passive and under-resourced non-execs;
- an uncertain sense of direction and achievements;
- chopping and changing of plans and strategy; and
- an uncertain degree of independence in regulation and assessment.

This is mostly not the fault of the present incumbents, as they inherited an ONS that does not have the expertise, culture or contacts to innovate. The proof is the Treasury’s commissioning of the Bean review – had the organisation been self-confidently running itself, with the right checks and balances, such a review would not have been needed. The Bean review will, one hopes, force the hand.
There is experience all over the public sector to prove that confused responsibility between a board and an executive does not work. One need look no further than the BBC. Rona Fairhead, Chairman of the BBC Trust, speaking earlier this year, said: “a fault line continues to lie in the blurred accountabilities between the Trust and the Executive board”. She continued: “At a minimum, we would want to propose some reform of the current model. To keep the Trust as part of the BBC but to be much more specific that its responsibilities were focused more clearly on regulation and accountability, with strategy and oversight left to the Executive Board.” That’s more or less what I am proposing for UKSA.
As Ms Fairhead said: “None of this is easy. There is no perfect or obvious answer.” Yet the truth is that the statistics framework has been far less effective than it could have been. From the outside one gets the feeling that the board has not yet defined its role in a way that is any less confused than the BBC’s. There is a lack of transparency about both UKSA’s role in the public sector and its own internal structure. Without clarity on both, it is hard to operate effectively.
For a long time the UKSA/ONS organogram was the one below. (The image dates from 2010 yet was retrieved from the UKSA site in mid 2013 via the Wayback Machine.) It was not very detailed given the organisation’s size and expenditure, and was not even totally accurate, but it was all there was for three years!
After some pressure from me and others, the latest version is much better, though still not totally accurate. There is also another (inconsistent) version on data.gov.uk. This all matters because the organisation has to understand itself before it can be effective outside, and before others can understand it. It clearly has not yet fully embraced, in a workable way, the dual roles required under the Act.
So far as I am aware, the authority has never explained its place in the wider government machine. I set out an attempt to do so below. The main impression is of an organisation that is unclear about how to handle the “dual roles”, sees itself as central, and struggles with links to the world beyond, perhaps unsure of what is expected of it.
UKSA has to be part of the executive (it mainly employs civil servants and produces government data) but, thanks to its non-executives and assessment team, it should also be a public-facing guardian of the parliamentary and public interest in statistics. It needs to span the line between Parliament and the executive. As the diagram shows, the organisation does have a foot in both camps, but few reporting lines are strong and clearly defined.
As a result, the links with outside bodies are poor. The nature of the links with government departments is unknown to us outsiders, but the limited information in the UKSA board minutes and the modest progress made on some strategic statistical issues suggest that the quality of relationships is mixed and leverage is modest.
In the wider world, it appears that only one in eight people (see below) know something about UKSA. That is according to a survey conducted earlier this year, and is far lower recognition than for the ONS, which nearly one in two know something about. The relationship with Parliament (and its nominated committee, formerly PASC and now PACAC) is also limited.
So, how could it be improved?
The diagram below shows how UKSA could manage its dual role much more effectively: assessment and the non-execs on one side, and the National Statistician and ONS, rightfully, on the other. The UKSA board would be the body that gave the National Statistician the independence to work within government to achieve change.
There are five changes that could be made (numbered “New 1” etc in the diagram below):
1. Create a Public Policy Evidence Team. Currently there is no horizon scanning, no looking around for potential problems, no place to make serious innovation in the quality or quantity of data and other evidence that government uses. This relatively small team would be drawn from across government, from many departments and various disciplines. Weaknesses in data could be spotted (picking up on ministerial concerns, parliamentary committee reports, news stories, users’ or trade association worries, etc.), new data, techniques or partnerships evaluated, and solutions proposed. It would deliver (see arrows on the chart): improvement to the government statistics machine, challenge to the policy-making process, and innovation in the raw material used by analytical teams across government. It would not be inward-looking like the ONS, studying only its own figures, but outward-looking at all data and evidence. The creation of this team should – if done properly – obviate the need for rapidly-initiated one-off reviews of UKSA or of data or processes (of which there have been many).
2. External expertise. The PPET would include outsiders too. The use of external expertise is currently haphazard, unstructured and far from transparent. This is a chance to reform that.
3. PPET Reports. Reports from the PPET would be public and delivered to Parliament. They would highlight weaknesses in the evidence base while remaining positive and forward-looking in tone. They would bluntly propose the changes needed to make the evidence base fit for use. Proposals would be presented to Parliament, and it would then be for Parliament to decide what to do. There would often be a cost associated with proposed improvements. If Parliament did not want policy makers to have better data (to help guide them to better policy), or did not want Parliament or the public to be able to hold the executive to account more effectively, so be it – at least the issue would have been aired.
4. Reinvigorated assessment and regulation in UKSA. As a counterpart to the PPET, UKSA’s own staff would be given more powers to horizon scan. The non-execs would meet more often to discuss the issues being worked on by the PPET. They need to play a real role in support of the NS and the work being planned by the PPET. At the moment, the contribution of non-execs is not designed to stretch much beyond turning up for formal meetings. They need more time and more domain expertise.
5. Stronger relationship with Parliament. The PPET reports would give a boost to the (currently barely-existent) relationship between the NS and Parliament, probably via the medium of PACAC. The new dynamism in the system would also boost the role of the UKSA board – instead of trying to quietly sort out problems under the radar, it would be the audit group reassuring Parliament that the (enhanced, more effective) processes are being correctly applied. Issues like data sharing for statistical purposes could be dealt with openly and honestly. The status and independence of the NS would be enhanced.
To conclude, the new structure could deliver:
- Transformative changes with widespread gains.
- A mutually supportive structure to allow all stakeholders, in and out of government, to work for a common good.
- Happier users, better policy-making and improved trust.
- Clearer separation between executive and assessment (the “dual role” in the legislation). UKSA and the NS/ONS could each then have clear, workable and sensible strategies.
There is no need to legislate, though some changes or clarifications would be welcome if the opportunity arose. The plans involve very modest, if any, additional cost, relying instead on machinery-of-government changes and budget transfers.
Footnote. The objectives of UKSA, from section 7 of the 2007 Act, are quoted below and show how the proposed operation would be a closer fit to the intention of the legislation:
(1) In the exercise of its functions under sections 8 to 21 the Board is to have the objective of promoting and safeguarding the production and publication of official statistics that serve the public good.
(2) In subsection (1) the reference to serving the public good includes in particular – (a) informing the public about social and economic matters, and (b) assisting in the development and evaluation of public policy.
(3) The Board is accordingly, in the exercise of its functions under sections 8 to 21, to promote and safeguard – (a) the quality of official statistics, (b) good practice in relation to official statistics, and (c) the comprehensiveness of official statistics.
(4) In this Part references to the quality of any official statistics includes – (a) their impartiality, accuracy and relevance, and (b) their coherence with other official statistics.
(5) In this Part references to good practice in relation to official statistics includes ensuring their accessibility.