For anyone unable to get through 250 pages of the Bean Review report on government statistics, this blog highlights the main issues raised (as I see them) from Chapters 4 and 5, regarding ONS's effectiveness and governance. It's not nice reading. There were positives, but they were overshadowed by the negatives. It's a sad story about a lost decade.
I was surprised to learn so much from the review. When you think a situation is bad and you're criticised for thinking as much, it is shocking to find that it's even worse. Anyone reading public sources in recent years, seeing the honours and plaudits being handed out, had, it seems, very little sense of what was going on in the country's statistical system. The UKSA board chose a regime of secrecy and surely that must now end. My future blogs will look at options for change.
It is important to note that the current National Statistician and his new leadership team have earned the confidence of the review team. They have set a new course and are making the quality of statistics a priority. That is the future; this blog draws a line under the past.
This blog lists the main issues in the Review, which I summarise as follows.
From Chapter 4 on ONS effectiveness:
- Staff managed inefficiently.
- Weak at innovation.
- Significant finance and accounting weaknesses.
- Weak project and programme management (PPM) capability and poor track record in project delivery.
- Insufficiently strategic.
- Slow to adopt new methodologies, including big data.
- Slow to get access to admin data, especially micro data.
- Failure to sense-check prior to publication.
- Operating in silos.
- A website subject to widespread criticism and ridicule and giving poor access to data.
- Too many errors and corrections reflecting poor quality assurance procedures.
- Fragmented technology infrastructure and outdated systems.
- Poor staff reporting and retention.
- Insufficiently inquiring and self-critical about the statistics produced.
- Poor user engagement – failure to share work-in-progress and explain innovation.
- Failure to shape, not just comply with, international standards.
- Risk of marginalisation if data do not offer the insights required by users.
- Definition of a ‘public good’ (to be satisfied for access to micro data) is too narrow.
From Chapter 5 on governance:
- Misdirected and narrow remit for the UKSA Board and regulatory function.
- Failure to be proactive in generating pre-emptive action when issues arise.
- Ineffective engagement with users and key stakeholders.
- Failure to ensure the separation between the production and assessment functions is well understood outside the organisation.
- Too little professional independence of departmental heads of statistics.
- Statisticians spend too little time (10%) identifying potential quality improvements.
- Inadequate systematic assessment of the limitations of statistics and failure to use such information to inform bids for extra resources to resolve issues.
- Too many statisticians “are quietly left to carry on turning the handle”.
- Poor quality review processes in ONS, and almost non-existent in other departments.
- The assessment process has become a bureaucratic “tick-box exercise” and does not cover all dimensions of ‘quality’.
- The regulatory function is inadequately proactive – most de-designations followed errors picked up by users.
- Too much focus on the binary nature of the “National Statistics badge” and inadequate commentary to explain what is happening.
- Detailed and comparable information on the quality, relevance, and costs and benefits of individual statistical outputs is virtually non-existent.
- A lack of a central process for staff to feed in ideas and solutions to senior leadership.
- Little sense of team work and sharing between departments.
- Some resource decisions are determined by which Deputy Directors “shouted the loudest”.
- There is a long-established tendency to make large resource commitments without sufficient testing and costing of different options.
- There is a lack of a good evidence base on the costs of delivering ONS’s various statistical outputs and the gaps relative to user needs.
- An inspection of material provided to the Board pointed to a number of issues warranting attention.
- There was insufficient direct interaction between users and the non-executive members of UKSA with the Board relying on information mediated via ONS.
- Failure of the Board to get a reliable picture of the concerns of users and key stakeholders.
- Little evidence of actions being stimulated by the UKSA seminars.
- UKSA’s focus has been narrowly on “the quality of official statistics” rather than on user needs and fitness for purpose.
- A near absence of routine reporting mechanisms which support the identification and correction of capability issues within the ONS.
- Board decisions not followed through thus failing to effect change.
- Many of the objectives in the UKSA/ONS strategy document are nebulously defined, leaving room for debate over whether they have been achieved.
Paragraphs from the Review that contain key comments on effectiveness or governance are reproduced below (the number shown is the paragraph number from the report).
4.23 ONS attributes most of the additional staff to the implementation of new international standards in 2014. If staff working on this programme are excluded, UK national accounts resourcing is in line with that of other countries. However, the majority of NSIs who responded to the Review’s survey managed this transition with few or no additional staff. With the available data it is difficult to identify why ONS has found this such a challenge, yet it has been slow to roll out the changes and had to bolster its baseline capability of 123 FTE with 46 temporary staff to do so.
4.24 ONS is characterised by a relatively weak capacity to innovate and improve.
4.27 In 2013, the Chartered Institute of Public Finance and Accountancy (CIPFA) were asked by ONS to undertake an assessment of ONS financial management. It found significant failings, including: a lack of appropriate financial management capability, ownership and accountability beyond the central finance team; an absence of basic financial discipline in programme management; inadequate medium-term financial planning; limited integration between financial and business planning; insufficient focus on securing value for money; and a culture that militated against the finance function supporting transformational change.
4.29 In June 2014, ONS commissioned the consultancy Atkins to conduct an internal review of ONS’s project and programme management (PPM) capability and capacity to deliver the current and future project portfolio, including the 2021 Census. In their conclusions, Atkins raised some general concerns about the state of ONS PPM capability, in particular its ability to deliver on time and within budget.
4.37 Some respondents were quite critical. Some referred to its sluggishness in embracing new developments. ONS had fallen behind in adapting and improving its methodologies to reflect changes in the economy and was insufficiently strategic when deciding its statistical priorities. Many argued that more use should be made of administrative data, though current legislation was recognised as being a barrier. There was also scepticism as to whether ONS had grasped the transformational opportunities of big data more generally and had sufficient ability in data science techniques to exploit them. Some also noted ONS data collection methods were out-dated.
4.38 There was also some criticism of ONS behaviours and capabilities. Many respondents said ONS needed to do more to engage with users, both within and outside government. A number of users commented on the failure to sense-check some statistics before release, arguing that greater use of economic expertise could help prevent embarrassing errors. Some respondents saw a need to invest in improving systems and skills, while ONS was also criticised for operating in silos.
4.39 The ONS website also attracted a lot of comment from users. While a new website has since been launched, there was consensus that the website at the time of the Call for Evidence was very poor. Others made wider comments on the general accessibility of official statistics: key data is not well prioritised and is difficult to find and access. Users would like to be able to access and manipulate underlying micro data as well as have better access to real-time data sets.
4.43 It is important that ONS produces statistics that are of high quality and error-free if users are to have confidence in them. Since March 2012, ONS has issued on average close to two corrections a month to its data and has also been criticised for its handling of erroneous statistics. Over the course of this Review ONS has had to correct processing errors in its labour productivity statistics, as well as its experimental price indices based on web-scraped data, one of its most cutting-edge outputs.
4.48 Another criticism of ONS has been that a lack of expertise has led to the publication of erroneous data, particularly in the wake of relocation to Newport.
4.52 In early 2015, the UK was reviewed on its compliance with the European Statistics Code of Practice by a team from other European NSIs. …
4.53 The review did find some areas of weakness and made a number of recommendations, several of which are echoed in this Review. One particular finding is, though, worth recording in full. This was to seek greater use of administrative data for statistical purposes, subject to appropriate safeguards. The peer review noted that: “In recent years, many European countries have purposefully increased their use of administrative data. As a result, the availability of source data has increased and the NSIs have succeeded in augmenting their existing survey data or even replacing their own surveys with the use of administrative data. The combined effects have been increased data supplies for statistical purposes, reductions in response burden and cost by businesses and household and cost reductions and increased efficiencies for the NSIs. Such developments have only taken place to a limited extent in the UK where there are substantial cultural and legislative obstacles to utilising administrative micro data for statistical purposes.” (p.14-15)
4.56 ONS asked questions about NSIs’ systems and data sources, as well as staff retention and reporting processes. The survey concluded ONS was one of the weakest performers in all of these areas, though this was in part due to the relative complexity of ONS’s systems compared to the less integrated systems used in some other countries. ONS’s choice of technology was largely consistent with that used by other NSIs, but it was the only statistical institute that reported major concerns about systems performance. The UK scored lowest on overall self-evaluation of the agility and flexibility of its systems. ONS was also one of only two countries with major concerns about the coherence of internal data sources and data quality. Twelve NSIs reported a tightening of their budgetary restraints, with only four stating no concerns in this area.
4.62 Providing economic statistics that are relevant, timely, accessible and of high quality not only requires the right skills, methods and systems – it also requires a pro-active, open and creative approach that keeps pace with developments in the modern economy and understands and responds to the changing needs of statistics users. There was widespread agreement among respondents to the Call for Evidence that ONS needs to be more inquiring and self-critical about the statistics it produces.
4.65 Three inter-linked ingredients are needed to help meet this objective of building a ‘curious’ ONS that is more responsive to changes in the economic environment and better meets evolving user needs:
- Improved understanding of the ways and context in which its economic statistics are used. This could be facilitated by building up the economics capability of existing staff through training, shadowing and secondment opportunities at HM Treasury, the Bank of England and other relevant organisations, and by the recruitment of more economic analysts, including at a more expert level. ONS should also seek to strengthen its engagement with the economic statistics user community; regular events such as the ONS ‘Economic Forum’ are helping and this section includes further recommendations to foster collaboration and the exchange of ideas.
- Raising staff knowledge of the systems, methods and data sources for the production of economic statistics. An environment of continual improvement requires a good knowledge of the limitations of existing approaches and the opportunities presented by new developments and technologies. It appears that while some training is offered, it mostly takes place within directorates. A broader range of career paths and training opportunities would both help. This could be complemented through more interchange of staff with other NSIs and relevant organisations. Rationalising the complex and aging range of systems used by ONS would also make it easier for staff, especially new recruits, to get a fuller understanding of processes. Such in-depth knowledge would make it easier to ‘sense-check’ outputs by comparing them to information available from other sources.
- Strengthening ONS’s quality-assurance processes and analytical capacity to spot mistakes and inconsistencies, including building in sufficient time for meaningful and rigorous internal challenge.
4.66 ONS’s approach to international standards is defined by the need to comply with them, rather than an ambition to shape them. One user mentioned to the Review that while international comparability is important it should not be an excuse to avoid developing innovative methods and approaches.
4.68 The modernisation programme of the mid-2000s has cast a shadow that persists today and little has been done to counter the perception that change programmes do not mean real change, and certainly do not mean change for the better. The most recent survey of ONS staff showed less than half believed their managers would take any action in response to the results of the survey.
4.69 There is a strong likelihood that opportunities for innovation are missed because, regardless of the potential of their ideas, staff do not have the confidence to question existing practices unless they believe that they will be listened to by managers.
4.77 As an organisation, ONS is at times overly cautious when it comes to sharing work-in-progress, testing new methods, and drawing on data from elsewhere.
4.87 It is the judgement of this Review that the loss of statistical expertise which resulted from the relocation decision has had a significant – though not necessarily permanent – detrimental effect on the capability of ONS and the quality of its outputs over the past decade.
4.91 The first of these drivers was outlined earlier in this chapter – that ONS needs to move beyond focusing largely on the production of statistics and instead use data and statistical expertise to help users answer their questions about the economy. Official statistics risk becoming increasingly marginalised if statistics producers cannot offer the insights demanded by policy makers and market commentators. Analytical expertise is needed to be able to respond effectively to users, and relates closely to each of the three ingredients of curiosity listed above – it is the sine qua non of a modern NSI.
4.92 The second driver – the need to embed the skills necessary to exploit administrative data sources – is discussed in more detail later. But it is clear that ONS lags several steps behind some other NSIs that have been heavy users of administrative data for a while. However, if the legal and other barriers can be removed, working with administrative datasets would become an integral part of the day-to-day operation of ONS, just as in many other NSIs. The structure, provenance and application of administrative data are different to those for survey data, and the tools and techniques in ONS teams will need to adjust accordingly.
4.99 The established gateway for recruiting most staff into ONS feeds directly into data-gathering roles within its survey operation. However, simply exposing new entrants to the traditional production process merely perpetuates the ‘factory’ model and does not really provide the skills needed to challenge and change those processes. In future, ONS should seek to bring in a greater proportion of its staff through the various Fast Stream programmes, or else set up a similar scheme of its own for graduates with analytical aptitude.
4.105 It seems clear that ONS is not only much less dominant as a centre of statistical expertise than it once was, but that it is also lagging as a centre of economic expertise.
4.140 Given the ubiquity of electronic data today, it is incongruous that the production of ONS economic statistics still relies so heavily on the posting of paper forms and knocking on doors.
4.144 ONS today has access to many tools and techniques for producing economic statistics that Rickman could not have even dreamt of. It is somewhat remarkable, therefore, how little use is made of such administrative data. But this problem is not new – the 1989 Pickford Review, for instance, recommended that greater use be made of administrative data, particularly information available to the tax authorities.
4.145 The 2007 Act was in part designed to facilitate increased access to departmental administrative micro data in order to support statistical production. Yet just two micro data sets have been shared with ONS for the purpose of statistics production under the Act’s provisions. The first was VOA data, used in the construction of the House Price Index. The second was HMRC VAT data, whose potential is presently being explored. While ONS has access to aggregate administrative data, it only has very limited access to the micro data. The aggregated information is certainly useful, but it is the richness of the underlying micro data that really carries potential. This can be used to clarify the source of puzzles in the aggregate data and, through the use of linked data sets, allow a far more detailed perspective on economic developments.
4.146 Many other NSIs make far more use of administrative and alternative data sources in the production of economic statistics than is the case in the UK.
4.150 In particular, there seem to be three obstacles to ONS making greater use of administrative data. Each individually limits progress, but taken together they constitute a significant barrier to the effective exploitation of such information in the production and interpretation of UK economic statistics:
- Legislative framework.
- Reluctance to provide access.
- Insufficient ambition in exploiting new data sources.
4.160 The UKSA strategy ‘Better Statistics, Better Decisions’ explicitly recognises the need to build greater data science capability in ONS, but only a few high-level actions were set out in its business plan, covering the period up to March 2018. Reflecting the difficulties of getting access, ONS lags behind many other data driven organisations, including several other NSIs, in terms of its capability to exploit administrative and other big data sources. Access to HMRC VAT administrative data was secured in 2011, but ONS has made slow progress in scoping out the full potential of this data since. While good examples of exploratory data science work can be found, these have tended to be ad hoc projects and isolated, one-off experimental work, with limited prospects of being operationalised soon.
4.170 The need for clear communication is equally important for ONS’s experimental outputs and its one-off studies. ONS will need to strike a balance between promoting and show-casing its experimental work effectively, while being fully transparent about the uncertainties and limitations of the data sources and techniques employed so as to maintain trust and integrity. This will mean engaging users much earlier in the process – at the initiation and development stages of exploratory work – to manage expectations pro-actively and stimulate an ongoing dialogue on techniques and methods.
4.180 The complexity of ONS systems has probably also been a contributory factor to some of the recent statistical errors and corrections. The internal ONS review into the 2014 error in the International Passenger Survey uncovered a wider range of issues with the systems for collecting and processing that data. The review found that researchers could not directly interrogate data in the processing system and checking routines had not been incorporated because they slowed processing to an unacceptable degree.
4.181 The legacy of a fragmented technology infrastructure and outdated systems is fundamentally contrary to this Review’s vision of a flexible and agile NSI. ONS’s technology infrastructure needs to be transformed if it is to get the best out of the data it collects now and the large volumes of administrative data it may have access to in the future. ONS’s new senior leadership is determined to turn things around and has started to implement a technology transformation plan running up to 2020 through which it plans better to meet GDS standards and reduce the number of different platforms to fewer than ten.
4.182 These are important and much-needed developments. However, ONS historically has a poor track record in project delivery and must avoid repeating past mistakes, such as those that afflicted the Statistical Modernisation Programme, which sought to revolutionise the ONS technology estates in the 2000s. Looking back at that programme in 2009, Stephen Penneck, who was Director of Methodology at the time, concluded that ONS had lacked the core skills needed to deliver the modernisation programme – in project and programme management, in business analysis, in IT architecture, and in development and testing – reflecting a lack of investment in such skills over many years. He also noted a lack of accountability, an initial approach that had been far too ambitious, and poorly thought-through requirements.
4.188 Currently, the data on businesses in the UK is incomplete and of relatively poor quality; linking different data sources and formats usually requires statisticians to put in place complex matching processes. Clear definitions of what constitutes a business and a unique identifier for businesses that is fit for all administrative and statistical purposes across Government would bring great benefits to economic statistics as well as public service delivery and policy development more widely.
4.192 The ONS website is the principal channel through which users access ONS economic statistics. A clear, user-friendly website is therefore a pre-requisite for an effectively functioning ONS. Yet for several years, the ONS website has been the subject of widespread criticism and ridicule. There have been several unsuccessful attempts to rectify this, until a totally new website came on line on 25 February, just as this Report was being finalised.
4.193 Having been re-launched towards the end of August 2011, the old website had been live for less than a month before ONS issued its first statement apologising to users for its performance. It transpired these were not teething problems. Criticism of the poor accessibility of statistics online continued to dog ONS for years to come. The website was, for example, raised on numerous occasions during the last parliament by the (as was) Public Administration Select Committee. Following their 2013 inquiry, the respected economics commentator and author Tim Harford branded the website “a national embarrassment” in the Financial Times.
4.195 The final straw for the existing website came after a catastrophic failure in early 2014, the result of introducing improvements to the site’s taxonomy. Though ONS deserves some credit for its handling of the crisis including through social media, it led to the commissioning of a review by experts from an external company – Thoughtworks – who identified wider issues with the website and digital capability at ONS. The instability of the web platform and consequent risks of any further improvement work meant that all future development was scaled back to business-critical updates only.
4.208 Approved Researcher status is project, person and time specific. ONS ran a public consultation in early 2015 regarding the criteria, process and safeguards used in this scheme, which identified that the current scheme no longer fully met the needs of the research community. The consultation found that 80% of respondents wanted ‘on-going’ access to datasets for an agreed time, as reapplying for access to the same data every quarter or year is a significant burden. Moreover, there was a need to clarify the definition of a ‘public good’ that needs to be satisfied for a project to be approved. Many respondents felt that the current criteria are too narrow, a view corroborated by this Review. ONS is currently finalising improvements and plans to launch the improved scheme in mid-2016.
4.210 Currently, ONS does not have the authority to permit access to micro data it has received from other departments without their explicit permission. Within the VML, any data that a researcher wishes to access needs the approval of the data owner, which may be a team within ONS or another government department. There is also no consistency between ONS and other public bodies on the requirements to access micro data. For example, legislation requires that access to ONS data must deliver a public good, whereas to access HMRC data, a researcher needs to serve an HMRC function.
4.213 Statistics New Zealand have a dedicated website which shows past and current projects and what datasets have been used for them. Case studies are also presented, showcasing where research project outcomes have been used within government departments and in academia. ONS should be doing the same; it would promote knowledge sharing both within the organisation and beyond, while improving the methodology of the statistics produced.
4.214 Users of the VML, such as the Bank of England and Institute for Fiscal Studies, told the Review that there were often significant issues regarding the usability of the micro data that is available. This has led to some researchers needing to spend months cleaning the data prior to using it for research. A lack of documentation and clear labelling of the contents of the data set, naming of variables, history regarding series breaks, etc, is also quite common.
4.215 The VML facility, as a micro data dissemination point, is an example of good practice and a model which is being imitated in other countries. The ultimate aim of the VML and similar facilities used by researchers, within government and otherwise, should be to provide access to the underlying data in order to allow official statistics and the methodology used to create them to be challenged, validated or critiqued. This is a necessity for an NSI that aims to be open and transparent. Transparency exposes it to criticism and challenge and it is important that ONS is open to this. The best way to engender trust in ONS’s statistics is to enable researchers to work from the same data to try to replicate and improve ONS’s findings. This has to be seen as an opportunity to learn, improve methodologies and reach consensus among experts.
5.3 Since 2014 steps have been taken to broaden attention to aspects of quality. But the UKSA Board and regulatory function could have paid more attention to ensuring that economic (and other) official statistics are of the highest quality in the broadest sense of not only being accurate and coherent but also relevant to user needs. For a variety of reasons, ONS’s quality assurance processes have proved less effective than users might expect. And while the UKSA Board has intervened when there have been significant errors in published statistics, it could have been more proactive in generating pre-emptive action. In part, a lack of relevant, timely and digestible information is to blame, but ineffective engagement with users and key stakeholders is also an issue.
5.10 In response, during 2013 and 2014, the UKSA Board agreed a number of organisational changes to strengthen and streamline its governance structures and to enhance the separation of production and assessment functions. The ‘Assessment Committee’ which had been responsible for considering draft reports and making recommendations on National Statistics designations was reformed as a ‘Regulation Committee’ consisting solely of non-executives and the Head of Assessment (and therefore with no executives involved in the production of statistics). It was given a broadened remit to shape the regulatory strategy, oversee the programme of assessment, and carry the delegated responsibility for adjudicating on National Statistics designations. It thus seeks to address PASC’s concerns about the need for greater separation between the production and assessment functions. It is clear from the Review team’s engagement with users, however, that these changes are not yet well understood outside the organisation.
5.13 Taken together, these changes have resulted in a much more sensible senior management structure, with more clearly defined and demarcated roles, responsibilities and reporting lines. As such, it now looks similar to the sort of structure that is seen in many other private and public sector organisations.
5.17 When asked about the value of acquiring National Statistic status, many HoPs said it gave ‘credibility’ to a statistic and could help defend it against criticism. But several HoPs gave reasons for not seeking assessment against the Code (even a statistic that might be fully compliant with the Code does not have to be badged as a National Statistic). In some cases, departments did not wish their statistics to be assessed for the National Statistics badge because the statistics were of insufficient quality; indeed one HoP suggested that rejection might result in adverse media coverage. Where there is a high profile or important official statistic, UKSA should have power not just to suggest, but to require assessment against the Code for National Statistics status.
5.18 Furthermore, in its 2013 review PASC flagged its concern that UKSA lacked the power to “prevent departments from circumventing the obligation to meet the standards in the Code by publishing data that is not classified as a statistic”. Releases under alternative designations, such as ad hoc ‘administrative’, ‘management’ or ‘research’ data, are not subject to the Code because they are not classified as statistics. In 2011, the National Statistician published updated guidance on the ‘Use of Administrative or Management Information’, which provides guidance on regular and recurring use of administrative and management information. However, UKSA does not have the power to enforce this guidance. At the heart of this issue lies the departmental prerogative to decide how a new data release is classified. Consequently, departments can decide whether or not data is subject to the standards of the Code. That can result in some information being released on an ad hoc basis to support positive news stories and withheld at other times when its release would be problematic.
5.19 While it would be unduly burdensome to require that the release of all management information be subject to the same safeguards as official statistics, UKSA should be able to insist that information be deemed to be an official statistic if it judges that departmental discretion is being used inappropriately to subvert UKSA guidance. And in cases where the information relates to high profile issues of interest to the wider public, UKSA should also have the power to require that the release be treated as an official statistic and therefore be subject to the Code. These changes could be accomplished via a change in the Ministerial Code.
5.23 While most HoPs saw the exercise of such independent judgement as a matter of professional integrity, the Review team discovered some questionable behaviour within a minority of departments. One participant suggested that HoPs who followed the Code too rigidly were “doing themselves a disservice” and suggested it was important to give departments options on how to “flex within the Code”. Another admitted to occasionally subverting the spirit of the Code. Such instances were certainly not widespread, but they do illustrate the system’s reliance on the strength of character of the individuals involved.
5.24 Moreover, most HoPs recognised that they were ultimately employed by their departments, not UKSA or ONS, and that created a tension between maintaining statistical integrity and supporting the department’s or minister’s priorities. A majority of HoPs stressed that repeatedly refusing departmental requests on the grounds that they breached the Code could both lead to their being side-lined and compromise their subsequent careers. But, in general, HoPs said they were able to manage this tension successfully and few felt significant pressure to compromise their professional standards.
5.26 Tensions are bound to arise from time to time between departmental and ministerial priorities on the one hand and maintaining the integrity of official statistics on the other. While the present arrangements largely seem to work in managing those tensions, some further reinforcement of HoPs’ independence would be valuable, including routine public reporting to highlight abuses and poor practices. This is in line with UKSA’s recommendations in 2010.
5.38 Output managers in ONS production teams carry the initial burden of ensuring the quality of the statistics they are responsible for. While the vast majority of statistical releases are published without mistakes, as discussed in Chapter 4, there have been several recent high-profile errors, initially identified by users. A paucity of economic expertise, together with cumbersome systems, has meant that inadequate sense checking takes place before data are released; quite basic checks, such as comparing implied deflators to actual price indices, are sometimes absent.
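To make the deflator check concrete: the implied deflator is simply the nominal series divided by the chained-volume (real) series, and a basic sense check compares its growth against the published price index. The sketch below is illustrative only; the series, tolerance and threshold are invented, not ONS practice.

```python
# Illustrative sense check of the kind 5.38 says was sometimes absent:
# compare the deflator implied by nominal and real series with the
# published price index, and flag quarters where their growth diverges.

def implied_deflator(nominal, real):
    """Implied price deflator (index) from nominal and chained-volume series."""
    return [100.0 * n / r for n, r in zip(nominal, real)]

def flag_divergence(implied, published, tolerance_pp=1.0):
    """Return (period, gap) pairs where implied and published price growth
    differ by more than tolerance_pp percentage points."""
    flags = []
    for t in range(1, len(implied)):
        implied_growth = 100.0 * (implied[t] / implied[t - 1] - 1.0)
        published_growth = 100.0 * (published[t] / published[t - 1] - 1.0)
        gap = implied_growth - published_growth
        if abs(gap) > tolerance_pp:
            flags.append((t, round(gap, 2)))
    return flags

# Hypothetical quarterly index numbers (base period = 100)
nominal = [100.0, 101.8, 103.5, 106.9]
real    = [100.0, 100.6, 101.2, 101.5]
prices  = [100.0, 101.1, 102.2, 103.3]   # published price index

implied = implied_deflator(nominal, real)
print(flag_divergence(implied, prices))  # flags quarter 3, gap ~1.9pp
```

A check of this sort is cheap precisely because it reuses numbers already in the release; the point of 5.38 is that even such low-cost cross-checks were not always run.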
5.39 A consequence of recent mistakes, including those resulting in de-designations by UKSA’s regulatory function, is that production teams have been encouraged to put more effort into avoiding mistakes. While that may seem entirely appropriate in the circumstances, in the absence of the necessary analytical support it has resulted in additional rounds of burdensome mechanical checking. An internal audit of quality assurance processes found, for instance, that one team had no fewer than ten rounds of quality assurance but that only the first two rounds added any value. An important lesson from lean manufacturing processes is that multiple layers of checking that largely duplicate earlier rounds are typically less effective than a single round in which the checker knows he or she carries sole responsibility.
5.41 The Review has found that, on average, a production team devotes just 10% of its time to the identification of potential quality improvements. There are some pockets of laudable practice, e.g. embedding ‘development mini teams’ within production teams that have the capacity and expertise to look at the issue from a broader perspective. But the general approach appears to be to focus on small, iterative and contained improvements that are easier to identify and implement within the production team, such as better presentation of data.
5.42 Deputy Directors are particularly important for quality assurance and improvement as they are responsible for setting the overall approach of the production teams beneath them. While some have sought to bolster their analytical capability and to encourage more innovative thinking, too many appear content simply to see their teams continue to produce the same statistics – with the exception of changes that are legally mandated – month after month, quarter after quarter.
5.43 In sum, insufficient effort has been expended in identifying and addressing the limitations of statistics of the sort discussed in Chapters 2 and 3 of this Report, even when there is already strong user interest in an issue. The Review team found little evidence that teams either systematically assessed the limitations of their statistics against user needs or used such information to inform bids for extra resources to resolve issues. One senior civil servant in ONS described it as a “Pandora’s box of development backlog work that ONS are taking a big risk by not always being on top of what is in it”.
5.45 NSQRs are a more established tool. These range from small in-house reviews to large externally-run reviews, such as the Barker-Ridgeway review. There is widespread agreement in ONS and elsewhere that these reviews serve a useful purpose. While production teams claim to have already been aware of the need for many of the recommendations in these reviews, most of the major improvements to economic statistics that were not already legally mandated have stemmed from NSQRs. In part this reflects the scarcity of other concrete proposals for quality improvement work, and in part the higher profile of NSQRs, which means they are prioritised for additional resources.
5.47 In part as a result of the limited focus on ensuring the quality of statistical production, practice is very variable across departments. Some do focus on producing good quality statistics, but in others, statisticians are quietly left to carry on turning the handle. While ONS may be no paragon of quality assurance, it does at least have the backstop of the NSQR process. The lack of any analogue in departments poses the risk of a two-tier system developing if departmental statistics producers fail to keep up. For that reason, the NSQR programme should be extended across the whole statistical estate. This is set out in Recommended Action 2. Moreover, some departmental producers appear to place the needs of their own ministers ahead of wider user needs. One producer told the Review team that they were not permitted to meet with a wide range of users, given the views of their ministers. But without engaging with users to understand their needs, how can producers ever expect to meet those needs?
5.49 Throughout the Review, it has been clear that the present senior leadership of ONS shares an ambitious and progressive vision for putting the organisation at the international frontier of the production of high-quality economic statistics. Nevertheless, ambition can sometimes outstrip the ability to deliver. For instance: a recent UKSA regulatory report highlighted the limited progress made in implementing many of its earlier recommendations regarding the statistics on income and earnings; VAT microdata was obtained from HMRC in 2011, yet has still to be incorporated into the production of statistics; and little effort has been put into securing access to scanner data of the sort Statistics Netherlands are now using for the production of price statistics. ONS still has a long road to travel.
5.53 Despite recent progress, greater emphasis on quality issues – in their broadest sense – is needed. The Review team was told repeatedly by producers of statistics that the current assessment process did not fully cover all dimensions of ‘quality’, as described in the five dimensions contained in the ESS. Production teams described instead a bureaucratic “tick-box exercise” evaluating compliance with specific associated practices rather than with the spirit of the Code. One senior civil servant in ONS described the current assessment process in scathing terms, asserting that it did not engage with methodological issues. Producers described the recommendations as “distracting limited time away from actually improving statistics”. In the 60 conversations the Review team held with producers, only those working on re-designating statistics believed that the assessment process reflected quality in substance as well as process.
5.54 This view of producers is supported by recent assessment reports. Looking at all the recommendations and requirements contained in the assessment reports of six economic statistics in 2015 and 2016, although there was some coverage of quality issues, the emphasis was primarily on ensuring better communication with users, internal processes to avoid mistakes and greater transparency of documentation. The Review team found only around 10-20% of requirements in these recent assessments were focused on improving the substance of the underlying statistic itself.
5.55 Looking forward then, it seems clear that a further strengthening of emphasis on assessing quality – in its broadest sense – is called for. In addition, it is important that the regulatory function seeks to become more proactive. Some de-designations have been the result of problems highlighted by the assessment process. But the evidence shows that in the majority of cases (see Chapter 4), the de-designations followed in the aftermath of data and processing errors that were first picked up by users, not the UKSA regulatory function. Production teams are ultimately responsible for the quality of the statistics they produce and the regulator cannot be expected to identify all risks before they crystallise. But a more proactive regulatory function might have brought the underlying problems to light sooner.
5.56 Given the limitations of several of the statistics discussed in Chapters 2 and 3, it may be somewhat surprising that so few statistics have been de-designated, despite the National Statistics badge requiring “the highest standards of trustworthiness, quality and public value”. It is hard, for instance, to understand – though unsurprising given the genesis of the process and function – how the provision of some economic statistics currently maintains the National Statistics badge. For example, Chapter 2 outlines the substantial limitations of regional statistics that users, such as Diane Coyle, described as “lamentable”.
5.58 While the binary nature of the classification – a statistic either warrants the National Statistic badge or not – may achieve clarity, the reality is more nuanced: there are fifty shades of grey twixt white and black. It would serve users better if that were reflected in the classification process. That could be through the use of a scorecard that rated a statistic on each of the several dimensions of the Code. Or it could be encapsulated in concise commentary accompanying the statistic. Either way, it could alert users when there were concerns about a statistic without having to resort to the rather blunt weapon of de-designation.
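The scorecard alternative to the binary badge can be sketched in a few lines. The dimension names and ratings below are invented for illustration and are not the Code's actual assessment categories; the point is simply that a graded, per-dimension view can alert users to specific concerns without full de-designation.

```python
# Illustrative sketch of the scorecard idea in 5.58: rate a statistic on
# several dimensions of the Code rather than granting or withholding a
# single binary badge. Dimensions and ratings are hypothetical examples.

def summarise(scorecard):
    """Collapse a per-dimension scorecard into a short user-facing note,
    listing any dimensions that fall short of a 'green' rating."""
    concerns = sorted(dim for dim, rating in scorecard.items()
                      if rating != "green")
    if not concerns:
        return "No concerns flagged."
    return "Concerns flagged on: " + ", ".join(concerns)

# Hypothetical assessment of a regional statistic of the kind criticised
# in Chapter 2
regional_statistic = {
    "trustworthiness": "green",
    "quality": "red",
    "coherence": "amber",
    "timeliness": "green",
}

print(summarise(regional_statistic))
# -> Concerns flagged on: coherence, quality
```

Either the scorecard itself or a concise commentary generated from it could accompany each release, giving users the shades of grey the binary classification suppresses.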
5.62 The Review team identified several weaknesses in ONS’s current systems for prioritisation and resource allocation:
5.63 Management information. Detailed and comparable information on the quality, relevance, and costs and benefits of individual statistical outputs is virtually non-existent: the “Pandora’s box” remains tightly shut. In its 2013 report, the Chartered Institute of Public Finance and Accountancy noted, “In terms of unit costs, activity costs, benchmarks and other financial performance ratios available being used to inform decisions to maintain or change current services we see significant difficulties at ONS.” While annual self-assessment by production teams using the Value Engineering Tool provides some information, it is unclear how this is then used to provide a meaningful input into prioritisation and resourcing decisions.
5.64 Priorities only tend to shift if an emerging issue is identified externally or through the formal mechanism of an NSQR. Even then, many staff believe the organisation is slow to respond and reallocate the necessary resources. The large majority of production staff spend around nine-tenths of their working time on standing responsibilities, so there is also little scope for shifting resources within teams. The lack of a central process for staff to feed in ideas and solutions to senior leadership was also frequently lamented.
5.65 Focus. Priorities are heavily influenced by the need to meet legislative requirements in order to avoid fines or else in response to statistical errors and the subsequent reputational damage. Given the lack of detailed management information, that is perhaps no surprise, with resource allocation set top-down based on headline issues, rather than built bottom-up based on a comprehensive understanding of business needs, costs and trade-offs. Some junior staff also complained that resource decisions were often determined by which Deputy Directors “shouted the loudest”.
5.66 Project management. Historically, an overly complex governance framework and a lack of sufficiently clear business case protocols have often led to failed or notoriously costly project delivery within ONS. While the Portfolio Committee has recently simplified governance arrangements and scrutiny processes in an attempt to improve the management of projects within ONS, there is a residual risk that a long-established tendency to make large resource commitments without sufficient testing and costing of different options could persist.
5.67 Co-ordination across the statistical system. The fact that UKSA is not always well-sighted on departmental statistical resourcing decisions and has no formal leverage to influence them, has led to a somewhat incoherent statistical landscape. Several HoPs and senior civil servants noted that improving the quality of statistics generally came at the bottom of their departments’ list of priorities. The 2013 PASC report also noted the lack of a strategy for the statistical system as a whole and recommended that UKSA “should coordinate data on resource requirements and plans for statistics across government departments, so that, where appropriate, resources can be pooled and the UK’s statistical needs met as efficiently as possible”.
5.68 There is one noteworthy example of a departmental decision that illustrates the limitations of UKSA’s coordination powers. In April 2013, the Chair of UKSA publicly wrote to the Secretary of State for Communities and Local Government, highlighting concerns about a decision to cease the publication of regional statistics. UKSA suggested that the decision “might be seen to raise questions about whether the decision was based on statistical or political considerations” and asked that it be reviewed by the department. Despite this intervention, the statistical series was nevertheless stopped and UKSA had no formal powers to intervene beyond raising its concerns publicly.
5.72 The previous discussion suggests the need for improvements in the way ONS’s activities are prioritised and resources allocated. The need for improvement is recognised by the current senior management team. A key deficiency is the lack of a good evidence base on the costs of delivering ONS’s various statistical outputs and the gaps relative to user needs. Better information would allow a more strategic approach to the allocation of resources and reduce the tendency to focus the organisation’s effort on just the most high-profile (but possibly ephemeral) issues.
5.74 Responsibility for the oversight of ONS performance resides squarely with the UKSA Board. It is the Board’s responsibility to ensure that ONS operates efficiently and produces high-quality statistics meeting user needs. And any failure by ONS to meet those standards also represents a failure of oversight by the Board. Ideally, these governance arrangements provide a mechanism through which performance issues are identified and corrected promptly without the need for external intervention.
5.75 Interviews with users and producers, together with inspection of material provided to the Board, pointed to a number of issues warranting attention.
5.76 Engagement with users. Discussions with a wide range of users of economic statistics made it clear there was insufficient direct interaction between users and the non-executive members of UKSA. Better engagement with users of economic statistics would provide the Board with a fuller perspective on current and emerging statistical limitations and whether user needs are being met. In the absence of this, the Board has to rely on information mediated via ONS or else users’ public comments.
5.77 The primary channel for users to provide feedback is the annual ONS customer satisfaction survey. ONS describes using this feedback to understand “how and why [its] statistics and analyses are used, and what [its] customers think about the quality of them and the statistics [ONS] provides.” In its 2014-15 survey, ONS advised respondents for the first time that submissions would be published on the ONS website.
5.78 While publication is valuable for transparency, it does appear to inhibit the frankness of some users’ responses. Certain key users (HM Treasury; Bank of England; Office for Budget Responsibility; Department for Business, Innovation and Skills), whilst acknowledging that ONS has made some progress in recent years, raised numerous concerns with the Review team about the quality and relevance of ONS statistical products. Yet one would not guess this by looking at their responses to the survey. Looking specifically at the 2014-15 survey, for instance, reveals a marked reluctance to mark down ONS’s performance. On the question of how they felt about the “quality of ONS statistics, analyses and advice”, three users said they were satisfied, with only OBR saying they were neither satisfied nor dissatisfied. On the question of whether ONS was innovative, two agreed while two neither agreed nor disagreed. Of course, UKSA/ONS can hardly be criticised if respondents pull their punches, but it does suggest that key users do not find the survey a very effective route for expressing reservations about ONS performance.
5.79 As a result, ONS interpretation of the feedback was largely positive. Looking at feedback from government departments and other key stakeholders, 91% of respondents expressed satisfaction with ONS performance, whilst 82% of respondents were either satisfied or very satisfied with the quality of ONS’s statistics, analyses and advice. In its public report on the results of the survey ONS identified three areas for improvement. These were the need to: improve ONS’s website and access to ONS outputs; be more consistent in the presentation of data in spreadsheets and ad hoc statistics; and improve communications with stakeholders. Commentary on de-designations or quality considerations was conspicuous by its absence. Only 59% of respondents thought ONS was innovative, but this was not identified as a priority area in the summary report.
5.80 It appears, therefore, that the annual customer satisfaction survey does not provide ONS and the UKSA board with an altogether reliable picture of the concerns of users and key stakeholders. A mismatch between the perception of producers and the experience of users also emerged in a small survey carried out by the Review team of over 70 users and producers. Users and producers were asked to evaluate the extent to which statistics met user needs. On average, producers estimated that 45% of their statistics were entirely fit for purpose and fully met user needs, while users put it at little more than a third of that, with only 17% of statistics fully meeting their needs. Moreover, producers estimated that only 4% of their data was of poor quality and did not support informed decision making, while amongst users this figure was over four times higher, at 17%. Moreover, these average responses conceal some even more contrasting individual results, with some producers believing that all their statistics fully met user needs, and some users feeling that almost all statistics required significant improvement!
5.81 Each month, the Board receives an update on ONS performance against the strategy laid out in ‘Better Statistics, Better Decisions’, covering achievement against key performance indicators, progress against planned activities and mitigation of strategic risks. In January 2016, as part of this report ONS reported generally positive feedback from key users, high levels of customer satisfaction with ONS performance and that work to ensure key statistics remained relevant was on track with sufficient mitigating actions in place. In effect, user views, ONS effectiveness and the quality of statistics were all rated as ‘green’ (though relationships with HM Treasury and the Bank of England were rated as ‘amber’). That was despite the challenges identified in the Interim Report. The minutes of the meeting do suggest, however, that the Board realised that there was a need for better engagement with stakeholders to understand their concerns.
5.82 The Board has recently instituted a series of seminars to improve engagement, holding sessions on the opportunities and challenges relating to increased use of government data, and on the ‘productivity puzzle’, as well as a joint session with the HMRC Board. The Review welcomes these efforts by the Board to become better informed about key measurement issues and emerging challenges. However, so far there has been less evidence of actions being stimulated by these seminars; it is obviously important that such sessions help to drive subsequent actions to improve statistics and do not just become a distraction.
5.84 Monitoring quality and performance. UKSA is charged with promoting and safeguarding “the quality of official statistics”. But its attention to the quality of the whole statistical estate and ONS performance appears to have been rather narrow, with the Board focussing on reliability, and the regulatory function concentrating on trustworthiness, rather than the broader issue of whether statistics adequately meet user needs and are fit for purpose. Moreover, attention to the issue often appears to have been reactive rather than proactive.
5.85 The Board has intervened when there have been important errors in published statistics. For instance, public concern about quality in the wake of the errors in the construction and GDP statistics in 2011 led the Board to make a strategic priority of “ensur(ing) that the macroeconomic statistics meet user need and best inform public debate and economic decision-making.” To achieve this, the Board committed to “independently reviewing the governance and future development of inflation statistics in 2013”, leading to the Johnson review of consumer price statistics. The Board also committed to review economic statistics to ensure that they “best meet user needs in the future”, leading to the restarting of the NSQR process, starting with the National Accounts. This process has supported proactive identification of quality issues with particular economic statistics.
5.86 However, due to the in-depth and resource-intensive nature of NSQRs for both producers and the teams conducting the reviews, only one NSQR of an economic statistic is conducted each year. For meaningful routine exploration of quality issues, there is a need for more frequent monitoring by the Board of issues relating to the quality and coherence of statistics.
5.87 Since July 2015, there has been an increasing recognition of the need to focus on quality, particularly of economic statistics. There was an acknowledgement by the National Statistician in July 2015 that the “statistical system is skating on quite thin ice and there are significant vulnerabilities”, which required a balancing of everyday production tasks against “the imperative to prepare for the future.” Similarly in October 2015, the Board considered “Challenges and opportunities in economic statistics”. The minuted discussion covered the increased use of administrative data, limited capacity to provide analysis and advice due to technological limitations, and challenging misuse of statistics publicly. This shift is encouraging, and demonstrates that the National Statistician, his new leadership team and the Board are identifying and prioritising the quality of economic statistics.
5.88 On organisational performance, there are far fewer routine reporting mechanisms which support the identification and correction of capability issues within the ONS. It is commonplace for boards to rely on external assessment of performance to support scrutiny and the UKSA Board is no exception, having commissioned several reviews into aspects of ONS performance, such as those by CIPFA, Atkins and Thoughtworks. For this approach to work, the UKSA Board must be aware of organisational limitations in order to commission reviews into them. The Review team found few effective self-assessment processes in ONS which could support proactive identification of organisational performance issues, limiting the UKSA Board’s ability to take early corrective action. Other independent organisations have had recourse to independent evaluation offices to address informational lacunae at Board level about organisational effectiveness.
5.89 In order to address this issue, the Review recommends a broadening and deepening of UKSA’s regulatory function. This should include not only the assessment of the consistency of official statistics with the Code but also the execution of rigorous independent assessments of the accuracy, reliability and relevance of statistics, i.e. quality in the broadest sense, and of the organisation’s ability to deliver them. Assessing both the statistical estate and ONS’s effectiveness is likely to improve the regulator’s ability to identify mistakes pre-emptively.
5.90 To reflect the expanded remit, the operation is referred to in this Report as an ‘Independent Regulation and Evaluation Office’ (IREO). As well as providing the UKSA Board with digestible and relevant information, the IREO would be expected to publish an annual public report on the performance of ONS and provide an independent assessment of the quality of the whole statistical estate. This should also include the ability to recommend the creation of new statistics or the modification of existing ones to address lacunae and ensure greater coherence in the statistical estate. This report would also aid users, government and Parliament in holding UKSA/ONS to account for meeting its statutory responsibilities.
Recommended Action 24: The UKSA regulatory function should be subsumed within a new Independent Regulation and Evaluation Office (IREO) charged with assessing the trustworthiness and quality of official statistics as well as ONS’s effectiveness; the head of the IREO would report to the UKSA Board and publish an annual assessment of ONS performance and the whole statistical estate.
5.91 In order to meet its expanded remit, the IREO would carry out reviews either on its own initiative or at the behest of the UKSA Board. It would clearly need to have sufficient resources at its disposal to do this, including statistical knowledge, and would be expected to draw on resources from within ONS or outside the organisation as appropriate, including commissioning reviews by external experts. As the Head of the IREO may sometimes need to tell uncomfortable truths to the UKSA Board, this person should be widely seen to be a strong and externally credible individual.
5.92 It is a moot point whether the IREO should be placed outside UKSA altogether. On the one hand, placing it outside bolsters independence and makes it less subject to the ‘marking one’s own homework’ critique. On the other hand, leaving it inside facilitates scrutiny of ONS; it will be harder for a completely external organisation to understand what is going on in ONS than for a unit within UKSA/ONS. The internal model is also the one chosen for both the IMF’s and Bank of England’s Independent Evaluation Offices. On balance, the Review favours the latter, at least in the first instance. But for it to be a success, it will need the UKSA Board and ONS staff to be open to criticism.
5.93 Implementation. Quality and performance issues need not only be identified but also addressed. Several individuals who spoke to the Review team expressed scepticism about the ability of the Board to effect change, citing only partial implementation of the recommendations in past reviews. One former Board member said that, at least in the past, actions had sometimes been agreed at Board level but not followed through on. One factor appears to have been inadequate monitoring of progress in implementation.
5.94 After the discovery of the errors in the construction and GDP statistics in 2011, ONS conducted two internal reviews (Brand reviews one and two) to diagnose the causes. In July 2013, after errors were discovered in Business Investment Statistics and GDP, the ONS Board commissioned a third Brand review. This review found that recommendations made in the first two reviews, that might have prevented the latest errors, had not been implemented. The subsequent Board discussion noted that “under the present circumstances it would not be a surprise if further errors occurred”. The ONS Board commissioned a response to the Brand review “as a matter of urgency”, stipulating that this must include deadlines by which all recommendations would be implemented. Clearly, failure to implement recommendations for two years is unsatisfactory, unless there were reasons why remedial action was impossible.
5.95 While NSQRs and RQRs may help to isolate weaknesses in statistical outputs and scope for quality improvements, the UKSA Board does not routinely see progress against any associated recommendations for action. Recommendations are placed on a central risk register that contains hundreds of other individual risks, but are only monitored by the central quality team and the production teams responsible for delivering the recommendations. So it is not surprising that implementation has yet to begin on several of the recommendations of the Barker-Ridgeway review. Of course, the Board should not be expected to track the implementation of every single recommendation. However, it is important that the UKSA Board ensures the past failures of the former ONS Board, as in the case of the Brand reviews, are not repeated.
5.96 Strategy. In 2015, the UKSA Board launched ‘Better Statistics, Better Decisions’, laying out an ambitious five-year strategy for UK statistics structured around five key qualities needed for a world-class NSI in the 21st century: “Helpful, Professional, Efficient, Capable and Innovative”. That strategy is consistent with the vision for the future provision of statistics that underlies this Report. The associated business plan would, though, benefit from a comprehensive corresponding set of SMART (specific, measurable, assignable, realistic, time-limited) objectives against which the Board can hold ONS and departmental producers to account. At present many of the objectives are somewhat nebulously defined, leaving room for debate over whether they have been achieved. In her response to the Call for Evidence, Professor Diane Coyle noted that “there is a big gap between UKSA’s: ‘Statistics need to keep pace with a fast changing world. We need to be constantly attuned to developments and respond rapidly when new issues arise where the evidence base is absent or contested,’ and the specific challenges of measuring the digital economy”.
5.97 A comparison with Statistics Sweden is instructive. There, some objectives are set by ministries, such as reducing burdens to businesses, addressing declining response rates to household surveys and improving measurement of the impacts of globalisation and digitisation. In its annual report, Statistics Sweden describes progress against those specific objectives, together with a wealth of information on what statistics are produced, their cost, staffing and user views. UKSA should seek to emulate such a detailed description of its activities and its progress against its strategic objectives, both for ONS and for departmental producers.
5.98 The most important ingredient in ensuring that the UKSA Board is effective is the quality of the people involved and their commitment to delivering a statistical system fit for a 21st century economy. But the Board also needs to be supported by effective processes and here the Review believes there is scope for improvement. In particular, the information flows regarding the quality of statistics and their costs, user views and needs, and the implementation and monitoring of change all leave something to be desired.
5.99 Several of the recommendations in this Report seek to close this gap. These relate to: identifying shortcomings in economic statistics across the whole statistical estate (Recommended Action 2); an effective and transparent prioritisation process (Recommended Action 22); the establishment of a high-level stakeholder group to improve awareness of user views (Recommended Action 23); the creation of an IREO to increase scrutiny of the quality of statistics and ONS performance (Recommended Action 24); and the technical analysis coming out of the Centre for Excellence in the Measurement of the Economy (Recommended Action 4). Together these should significantly expand the evidence base underpinning Board and senior management decisions.
5.100 The architecture enshrined in the Act was designed to prevent government interference in the production and publication of official statistics. Many respondents to the Call for Evidence reiterated the central importance of maintaining that independence. One respondent neatly captured the overall sentiment: “whatever changes are made to the governance arrangements, it is essential that the independence of the production of statistics continues to be guaranteed by statute and is seen to be free from political influence.” The Review strongly endorses that sentiment.
5.101 Independence alone is, however, clearly insufficient to guarantee the provision of high-quality statistics that are fit for purpose. In response to the question “do you think the current governance arrangements for economic statistics support their effective production?” Professor Diane Coyle noted, “while the arrangements broadly safeguard the independence of official economic statistics, the user dissatisfaction means the answer to this question has to be no. The statistics produced are not at present effective in answering the questions users want to address.” Much of this Report has been focussed on explaining why this is so and what can be done to improve matters.
5.102 In a democratic society, accountability should be the handmaiden of independence. The UKSA Board is responsible for ensuring that ONS delivers statistics that meet user needs (though its ability to achieve that for the wider statistical system is rather more limited). So the Board should also be accountable – to government, Parliament, users and the public more generally – for achieving that objective. The 2007 Act sought to establish this accountability by making UKSA accountable to Parliament.
5.103 In most advanced economies, the Ministry of Finance is the ‘parent’ department of the NSI. This may seem a natural assignment, as the Ministry of Finance is invariably a major user of economic statistics, as well as holding the purse strings. One consequence of the Act was that residual ministerial responsibilities were given to the Cabinet Office. The thinking was that this would buttress independence precisely because of the Cabinet Office’s ‘lack of a particular subject interest’ in statistics. As a department which both produces and uses relatively few statistics, the Cabinet Office was seen as offering an impartial home. Moreover, its role as the co-ordinating department across Whitehall meant that it could support “effective planning of statistical work … to meet future statistical requirements right across government”. Alongside that shift in the identity of the parent department, responsibility for Parliamentary oversight of UKSA and ONS shifted from the Treasury Select Committee (TSC) to the then Public Administration Select Committee (PASC) – now Public Administration and Constitutional Affairs Committee (PACAC).
5.104 While it is primarily the responsibility of the UKSA Board to hold ONS to account for the delivery of high-quality statistics, both the parent department and the relevant select committees should be engaged in ensuring that happens. Several factors appear to have contributed to this engagement being less stringent than might have been expected in view of the number of recent errors and the extent of user concerns:
- As noted above, the information on ONS performance and user satisfaction, including on an internationally comparable basis, is not all it could be.
- Reticence on the part of government and key stakeholders to voice their criticisms loudly in case it was seen as infringing UKSA’s independence.
- The relative lack of interest within the Cabinet Office in statistics, with insufficient officials’ time allocated to the oversight of UKSA. Although there has been extensive engagement on data access legislation, routine meetings with senior UKSA/ONS leadership focussed on how ONS could better support government implementation taskforces rather than on oversight of UKSA.
- While PASC/PACAC has been very active, viz. its wide-ranging inquiry on statistics launched in 2012, its focus, not surprisingly, has been on issues where it has a comparative advantage, namely constitution and governance, rather than the quality and delivery of economic statistics themselves. Although recognising that there were some user concerns about data quality, its 2013 report on the Operation of the Act focussed on issues such as: the lack of clarity over UKSA’s committee structure; the need to separate UKSA’s production and regulatory functions; co-ordination across the statistical system; and confusion over the meaning of the ‘National Statistics’ badge.
5.105 Now it may seem that the obvious solution to this problem is simply to transfer departmental responsibility back to HM Treasury (and prime responsibility for Parliamentary oversight to the Treasury Select Committee). Certainly that would put in charge a department with a high stake in ONS producing high-quality economic statistics. Moreover, the annual spending review process means that the Treasury should be well sighted on UKSA/ONS’s objectives and the resources needed to achieve them.
5.106 Such a re-assignment of responsibility would, however, also reintroduce the concerns about ensuring independence that the 2007 Act was supposed to solve. Moreover, while HM Treasury has a stake in the provision of high-quality economic statistics, it has less of a stake in other ONS statistical products, such as population, crime or health statistics. There is no perfect solution.
5.107 Accordingly the Review does not recommend changing the current assignment. Instead, the expectation is that other recommendations in the Report will mitigate some of the problems associated with the current arrangements, while allowing the benefits of the present arrangements in buttressing independence to be maintained. These include: establishing effective and transparent processes for prioritisation (Recommended Action 22); the high-level stakeholder group for economic statistics, acting as a conduit for key users to make their concerns felt (Recommended Action 23); and the role of the IREO in providing an additional and public evaluation of ONS performance (Recommended Action 24).