The ONS published a welcome note yesterday updating the progress being made with statistics on the migration patterns of international students coming to the UK. It gave a fairly upbeat impression but really only laid bare how little we as a nation know about these students. New statistics are needed – and requiring all students to get a National Insurance Number would be a good start and might even be part of the post-Brexit changes. Let’s hope that the ONS and UKSA Board are in there arguing for such changes. The article also had a graphic that was misleading and below the standards that we might expect from the ONS. Continue reading International student migration – ONS update
The ONS and its governing board, UKSA, do not lie, but on migration their statements stretch credibility and do nothing to boost public trust in data. The ONS needs to change its stock press briefing that the passenger survey is “the best available source” for migration data. It is true, but only because it does not seek a better source. It’s like telling a cancer patient that an x-ray is best when we know the (more expensive) scan is better – it might help at the margin in some cases but it’s not what’s needed. It is damaging to give such a deeply misleading impression of the data’s veracity. For the same reason, the ONS needs to stop publishing confidence intervals on migration data.
The UKSA needs to tell it as it is – migration numbers are not fit for purpose and a new system (and more money) is needed to collect decent numbers. Brexit will require better data, so UKSA would no longer be saying anything radical. Pretending that the figures are OK is not only a failure of its regulatory duty but provides cover for a Home Office that really needs to get on with what’s required – the not so tricky job of counting people crossing borders. Continue reading ONS: Time to be frank on migration
Those broadsheets that wanted to “remain” are looking for every scrap of bad news following the Brexit vote. For many stories that seems fair enough; newspapers always have their own take on events. Surely, though, it’s a step too far when the reporting of official statistics “facts” deliberately falls below a certain threshold of quality. Such was some of the reporting of Tuesday’s inflation figures. More reporting of events (and less speculation), a bit of perspective (not focusing on the latest month’s figures) and looking at the detail of the release would be good. Continue reading “Inflation soars” OMG
The recent political comings and goings (the EU referendum, the arrival of a new Prime Minister and Labour’s travails) have seen a period of unusual attitudes to facts. More people seemingly want information, and yet the (accurate) use of facts by politicians, some elements of the media and quite a few people has fallen to new lows. Experts are being rubbished, institutions’ reputations are being damaged, and the media is accused of being biased, prompting discussion of a post-truth society. There is much talk of a fractured Britain as technology and globalisation have hastened economic disruption affecting many livelihoods.
This note sets out a few steps – go local, kill the average, be open, do good research, un-spin and tell good stories – that the statistics world might take to help people reconnect with reality and help policy makers understand what might be needed if we are to establish a more sensible approach to debate and policy. It has much in common with the Data Manifesto published by the Royal Statistical Society two years ago. Continue reading Post-truth, post-Brexit statistics
The gender pay gap is one of the most misunderstood areas of British public policy statistics. The only question is the extent to which this is accidental or deliberate obfuscation by pressure groups. The UK Statistics Authority needs to step in to do its part in getting better data, better explaining the existing data it publishes and correcting those who misuse it. It is a shame that the respected IFS has added to the deluge of confusion with its latest report published today.
So far as the statistics are concerned, the pay gap is the average amount of money paid to men in work versus the average paid to women. So far as legislation is concerned, the pay gap is the difference between the pay of an equally qualified and experienced man and woman doing exactly the same job. Sadly, the rhetoric swings happily between the two, helping no one. Every time this blows up I simply wish for better data so that we can really understand the issue and put ourselves in a position where we can develop policies that will put an end to discrimination. Instead we get (mostly) ill-thought-out hot air. Continue reading The scandal of the gender pay gap
The BBC and Wikipedia probably beat the official Rio website in terms of the data offering and presentation but it was also a feast for others interested in providing numbers, including the media, or browsing them. Here are some links, plus my summary table of the sequence of medal awards – this was valuable in tracking the rate of GB medals so as not to be blown off course by media during the event that (predictably) swung from gloom to over-hyped optimism. Continue reading Team GB, Olympics and stats
This is a story about how I tried – and failed – to get some data about top performing GCSE students and girls doing STEM-related A levels. The story highlights weaknesses in the Department for Education but also in government statistics and their regulation systems more widely. The public deserves better. A report from the UK Statistics Authority on this quest was published today.
Exam success is key for a school pupil who wants to go to a leading university, on their way to a top job. As AS levels will soon be a thing of the past, GCSEs are vital in that journey. Yet information about what sort of pupils from which type of schools in different parts of the country get the all-important top grade GCSEs (or study combinations of STEM-related A levels) is largely a mystery, as the government denies access to the full set of school-level exam results. Continue reading School exam statistics – state secrets?
The rise of migration to the UK has been one of the extraordinary stories of the last 20 or so years. A majority of Britons want migration to be lower and think (by six to one) that the Government’s policy towards it has been unsatisfactory. High migration was surely one of the main reasons why the referendum happened and why the “leave” camp won. Perhaps curiously, Mrs May, who has been Home Secretary for six years, is the odds-on favourite to be the next Prime Minister, despite having overseen a migration failure. She did not get to grips with the mismanagement of e-borders, promoted the conceptually ridiculous net immigration target, signed up to the “tens of thousands” manifesto pledge, and then missed it by a factor of around ten. Under her watch, there has been net immigration of 1.8m and gross immigration of nearly 3m despite a myriad of mini-adjustments to migration rules. Conservative party members who are concerned about migration will want to see a clear commitment from Mrs May to end free movement in Europe and introduce a new immigration system and work permits. Continue reading Mrs May’s record on immigration
The UK Statistics Authority issued a statement today saying that it was “disappointed” in the way the £350m figure (the cost of the UK’s EU membership) is being used and that the usage “undermines trust in official statistics”. I am sure that much is true, but it is also true that had UKSA presented the numbers “better” in the first place, the Leave campaign might never have used the figure in the way it has. UKSA could also have moved more quickly to clear up the ambiguity. Instead it left this statement to the 11th hour and risks making itself look political. A lesson should be learnt by all who feel that such numbers are important to democracy, accountability and good policy making. The reputation of numbers has taken a hit in this debate and every effort should be made to make sure they are presented properly in the future.
Pretty much everyone thinks it’s a good idea to have more economists (code for analytical capability) at the ONS, but opinion divides when there’s discussion of what they should be doing. There is a need for people who can acquire and probe existing data assets to make them sweat, in the spirit of the Bean Review. In contrast, there is no need for the ONS to produce any more descriptive writing and (sometimes dumbed-down) publications that serve some unspecified need. That would be a wasted opportunity. There is a risk that hiring large numbers of economists in a hurry, mostly in their early careers, as opposed to curious souls with experience, will lead the ONS down the wrong path. Meanwhile, economists outside government need to start making the case for better statistics. Continue reading Economists at the ONS