Average earnings aren’t really average earnings

The earnings figures are very important and politically sensitive, yet the trends are highly uncertain and poorly presented by the ONS. That poor presentation centres on the failure to remove compositional changes and present earnings growth on a like-for-like basis. This has not only led to an overly pessimistic view of what’s happening in the real world but has also diverted political and media attention from the real issues in the labour market.

The bulk of this blog appears in appendix 2. It looks at the presentation of the earnings numbers in the main ONS publications and shows the paucity of “health warnings” that the ONS gives in its press releases. It reveals a simplistic, confused and ultimately misleading presentation of the earnings data. The ONS fumbles its guidance as to whether AWE or ASHE ought to be used, failing to address which of the various measures available might best be used when. It publishes just a few measures, without saying why it has chosen them, and ignores many others. This has allowed a distorted view of the labour market to take hold and be widely believed with a certainty that’s not justified. Earnings growth for most people is not as weak as widely believed but the ONS seems unwilling to explain the full story.

The headline average earnings figures (AWE) produced every month are the total earnings in the economy divided by the number of workers. It’s an average, but only one of several averages that could be calculated. It tells you what average earnings are across the whole country in any month, in £s. This is very different conceptually from telling you what is happening to the growth in earnings of any person in the labour market, or even what is happening, over time, to the existing group of people in work. Yet that is what everyone except a few economists wants to know. To get that number the ONS would need to produce a like-for-like average, including someone’s earnings only when there is a corresponding figure for that person in the previous period. The ONS refuses to produce like-for-like numbers. It produces numbers for its key stakeholders (the Bank of England and the Treasury) while ignoring its public duty to give taxpayers the numbers they need to understand their place in society.
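
To make the distinction concrete, here is a minimal sketch of the two calculations in Python, using invented figures rather than ONS data: a crude average of everyone counted in each period (the AWE approach) against a like-for-like average that includes someone’s earnings only when there is a matching figure for the same person in the previous period.

```python
# Toy illustration only: invented weekly pay figures, not ONS data.
# It contrasts the "crude" average (total pay divided by headcount)
# with a like-for-like average over people present in both periods.

pay_p1 = {"a": 400, "b": 500, "c": 600}
pay_p2 = {"a": 412, "b": 515, "c": 618,   # the same three people, each up about 3%
          "d": 250, "e": 260}             # two new entrants on low pay

crude_growth = (sum(pay_p2.values()) / len(pay_p2)) / (sum(pay_p1.values()) / len(pay_p1)) - 1

matched = [p for p in pay_p2 if p in pay_p1]                       # like-for-like sample
matched_growth = sum(pay_p2[p] / pay_p1[p] - 1 for p in matched) / len(matched)

print(f"crude average growth:         {crude_growth:+.1%}")       # negative, because of the entrants
print(f"like-for-like average growth: {matched_growth:+.1%}")     # about +3%, what incumbents actually saw
```

On these toy numbers the crude average falls even though every incumbent’s pay rose – the composition effect discussed below.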

The difference between a crude average and a like-for-like average matters, especially at times like this. If lots of people enter the job market on low earnings and lots of people on high earnings leave the world of work (this is what has been happening in the last decade) then the composition change among those counted in the average will be considerable – and much greater than normal. (See a worked example in appendix 1.) The composition change does not matter to a macro economist but it distorts the figures so far as individuals are concerned. This subtlety is not widely appreciated. The failure of the ONS to present more numbers or present the existing numbers with full integrity has led to a grave misrepresentation – the trajectory that most workers have experienced in recent years is not reflected in the statistics.

It is not clear to me what the ONS could do to alleviate the problem with AWE (by calculating different statistics from the same raw data set), but amending the presentation of ASHE would not be hard. As Section 4 of the ASHE release from October 2017 states:

“The composition of the workforce collected in the ASHE sample changes from year-to-year, reflecting changes in the structure of the workforce. This can affect changes in median earnings. …… One approach to removing this compositional effect is to look only at jobs in which the employee has been in the same post for at least one year (termed “continuous employment”). The change in median earnings for this continuously employed group shows how much a “typical” continuously employed person (that is, a person at the middle of the continuously employed distribution) in the current year is earning compared with the “typical” employee from the same group in the previous year. This analysis shows consistently higher growth rates over time for the continuously employed group compared with all employees …….”

[Chart from the ASHE release: annual growth in median earnings for all employees compared with the continuously employed.]

It is often – almost universally – stated in the media that earnings growth (variously in nominal or real terms) of people is the lowest since some point in the distant past, and that the recovery from the recession a decade ago has been very slow. It is easy to see how the blue line in the above chart (deflated by CPI) would lead to that conclusion. The yellow line, which allows for composition change, shows something different. The misunderstanding has set the tone for much political and economic debate. Had average earnings growth been presented as being two percentage points higher on average each year in the last decade, the media coverage would have been very different. But as 80-90% of people in work are in the same job year to year, that’s closer to what most people have experienced.

The ONS knows about this and the failing has been pointed out to the UK Statistics Authority (for example, in this letter from 2015). While UKSA has valiantly explained the many statistics, it has failed properly to clarify the issue, or the offering from ONS to users. It has not set out clear expectations to the ONS, with deadlines. It’s been going on for years and the ONS is clearly not budging, presumably seeing no need to present better statistics more clearly to users. This letter from 2015 about clarifications in one-off ONS analysis shows the messiness of the situation. The description and data in the letter are helpful, but the figures in it are not part of the normal publication cycle – they should be. The closing line of many UKSA letters on this issue – along the lines of “that, given the range of important and interesting statistics in this area, it is especially important that users describe them precisely” – is amusingly hand-washing in style given the complexity of the numbers and the reluctance of the ONS to help.

There’s then a question as to whether the data can actually be trusted. You don’t have to look far into the spreadsheets to find numbers that simply don’t look right. Is it really plausible that average earnings across the whole finance sector fell by almost 10% (from £550 a week to £495) in one month in 2009? If apparently odd numbers are right, the ONS ought to explain the story behind them. The screenshots below come from the ONS Excel sheet “EARN01: Average Weekly Earnings”, where breakdowns of the data in the monthly release can be found.

[Two extracts from the EARN01 spreadsheet showing the finance sector weekly earnings figures discussed above.]

The differences in the annual growth rates for earnings comparing AWE and ASHE need some explaining too. Every year bar one in the 2000s the difference between the two was less than one percentage point. Yet three of the last six years have seen a gap greater than that. These are turbulent times and the ONS ought to be explaining the difference between the data from AWE and ASHE. (See the end of the appendix for more details.)

It is surely time for UKSA to act as these issues have not been addressed by correspondence or in their assessment reports. AWE was only given its National Statistics status in 2009 (Assessment report no 19). It reads as if that report was rushed out to allow ONS to stop producing the predecessor index, the average earnings index (AEI). As the report explained, the AEI was a measure of the growth in average earnings, presented as an index, whereas AWE is a measure of the level of average earnings, presented in pounds and pence. It is time for OSR to reassess AWE and decide if an index expressly calculated to show change is required. This would not be the first relatively new National Statistic to need a review early in its life.

ASHE was assessed in 2011 in Assessment report no 138. The remaining labour market data were assessed in 2014 in a separate assessment report. The latter raised a number of presentational issues and asked ONS to “Make clear the distinctions between the various labour market statistical outputs so that users are better able to identify and access information relevant to their needs, and ensure the coherence of the statistics presented within them.” The lack of coherence has not been addressed and is the occasional source of slightly tetchy correspondence between one arm of UKSA, the OSR, and the other, the ONS. The last letter (from April 2017) lays bare the lack of progress.

So, if I’m even half right and the earnings picture for most workers is not so dire as presented, what is going on? I suggest there’s a loss of jobs at the top of the earnings scale and a disproportionate creation of often part-time and insecure jobs at the lower end of the scale. Other data is entirely consistent with this narrative even if the actual data to prove it does not seem to have been released from the ONS computers. This change in the labour market and its consequences (on public policy and public finances) are far more pressing for the country’s future than the (not really happening very much) decline in real earnings. It’s time for better figures, more reasonably presented.

This blog is published ahead of an ONS event (on 7 February) about earnings statistics.

I last wrote on this topic in December 2015 – not much has changed since then!

Appendix 1

A worked example of the impact of the composition change

The table below (copied from a blog I wrote in 2015) shows the impact of composition change. In this case, 100 existing workers got a 4% rise and 10 new workers were taken on at the lowest wage. The result is a decline in both average and median earnings – neither fairly reflects what’s going on! (A short calculation reproducing the arithmetic follows the table.) Imagine:

  • 100 workers in period 1
  • earnings across the spectrum
  • in period 2, 10 new jobs are added at the lowest wage
  • all other workers (the 100 from period 1) see a 4% rise in wages
  • yet in period 2, both average earnings and the median wage fall
  • headline stats look terrible yet 100 workers have a 4% increase and 10 workers now have jobs that they didn’t have before!

[Table from the 2015 blog setting out the wages in period 1 and period 2 and the resulting average and median.]
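
For anyone who wants to check the arithmetic, the sketch below (Python, using an invented even spread of weekly wages rather than the exact figures from the original table) reproduces the mechanism: every existing worker gets a 4% rise, ten new jobs are added at the lowest wage, and both the mean and the median still fall.

```python
# A sketch of the worked example above, with invented wages (£200 to £992 a
# week, evenly spread). Only the mechanism matters, not the exact numbers.
from statistics import mean, median

period1 = [200 + 8 * i for i in range(100)]                    # 100 workers, wages across a spectrum
period2 = [w * 1.04 for w in period1] + [min(period1)] * 10    # a 4% rise for everyone, plus 10 new jobs at the lowest wage

print(f"period 1: mean £{mean(period1):.0f}, median £{median(period1):.0f}")
print(f"period 2: mean £{mean(period2):.0f}, median £{median(period2):.0f}")
# Both the mean and the median fall, even though every existing worker's pay
# rose by 4% and ten people have jobs they did not have before.
```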

Appendix 2

A detailed description of what appears in ONS releases about earnings

The following are quotes taken from the statistical bulletin “UK labour market: January 2018”, the latest monthly labour market release. There are dozens of other labour market releases but all bar a few users will rely on this alone.

Section 1. “Main points” does not mention composition change:

  • “Latest estimates show that average weekly earnings for employees in Great Britain in nominal terms (that is, not adjusted for price inflation) increased by 2.5% including bonuses and by 2.4% excluding bonuses, compared with a year earlier.
  • Latest estimates show that average weekly earnings for employees in Great Britain in real terms (that is, adjusted for price inflation) fell by 0.2% including bonuses, and fell by 0.5% excluding bonuses, compared with a year earlier.”

Section 8. “Average weekly earnings” has a section headed “Things you need to know about average weekly earnings”, and says:

“Average weekly earnings measures money paid per week, per job to employees in Great Britain in return for work done, before tax and other deductions from pay. The estimates are not just a measure of pay rises as they do not, for example, adjust for changes in the proportion of the workforce who work full-time or part-time, or other compositional changes within the workforce. The estimates do not include earnings of self-employed people.”

I emphasise the word “just” as the statement would be more accurate if that word were omitted. This is arguably not even a measure of pay rises. The other caveats in that paragraph are ignored in the wider debate about earnings and need to be more prominently displayed in the release. Otherwise, huge headlines and political spats will continue on the basis of a small (statistically insignificant) rise or fall.

Under the commentary section it continues:

“For November 2017 in nominal terms ….. average regular pay ….. was £480 per week before tax and other deductions from pay, up from £469 per week for a year earlier. …… Between September to November 2016 and September to November 2017 …… regular pay increased by 2.4%, little changed compared with the growth rate between August to October 2016 and August to October 2017 (2.3%).”

The risk is that wording like “regular pay”, “up from” and “increased” could easily give the impression that the data are indicating pay rises for individuals, as opposed to a rise in economy-wide earnings.

This misrepresentation is compounded in the following section of the release:

“Looking at longer-term movements ….. For November 2017 in real terms (constant 2015 prices) average regular pay ….. was £459 per week before tax and other deductions from pay, £14 lower than the pre-downturn peak of £473 per week recorded for March 2008.”

At this point – when a comparison is being made over a decade – it is necessary for ONS to mention the composition effect. It is negligent not to do so.

The ONS compounds the error of not mentioning composition effect by presenting just one chart – a rather negative one showing earnings in real terms.

[Figure 9 from the release: average weekly earnings in real terms.]

After sections headed “Where to find data about average weekly earnings” and “Where to find more information about earnings”, there is “Notes for: Average Weekly Earnings” which includes, under note 2:

“As well as pay settlements, the estimates reflect bonuses, changes in the number of paid hours worked and the impact of employees paid at different rates joining and leaving individual businesses. The estimates also reflect changes in the overall structure of the workforce; for example, more low paid jobs in the economy would have a downward effect on the earnings growth rate.”

So, eventually here, in a footnote after the “end notes” on page 18 of the PDF, there is the most gentle of warnings. I don’t think that is sufficient.

The final section of the release is section 17, “Quality and methodology”. Earnings does not get an explicit mention but this paragraph is interesting:

“In general, changes in the numbers (and especially the rates) reported in this statistical bulletin between three-month periods are small, and are not usually greater than the level that is explainable by sampling variability. In practice, this means that small, short-term movements in reported rates should be treated as indicative, and considered alongside medium-and long-term patterns in the series and corresponding movements in administrative sources, where available, to give a fuller picture.”

I agree with it but surely, if the “small, short-term movements in reported rates should be treated as indicative” and need to be “considered alongside medium-and long-term patterns in the series”, the composition effect needs to be addressed in the release as it will affect the medium term patterns.

This difference between short and long term data is worth exploring.

There is a hyperlink in section 8 of the monthly release to the “Guide to labour market statistics – Explanation of the major concepts that exist within the labour market and their relationship to each other.” Although it is hidden away it is surely where the composition effect ought to be addressed.

It doesn’t have a long section on earnings, but section 2 includes:

“Earnings statistics can be classified into two categories: structural statistics and short-term indicators. Structural statistics tend to be more detailed and are used to analyse trends in earnings over long periods. The Annual Survey of Hours and Earnings (ASHE), and before it the New Earnings Survey, is the recommended source of employees’ pay levels. Data are published on an annual basis for the UK, and also broken down by industry, occupation, region, small area, gender and full- or part-time status. ASHE data are published annually in October, using data from the previous April. Average Weekly Earnings estimates appear monthly in the labour market statistics releases.

For estimates of short-term pay or earnings growth, the Average Weekly Earnings (AWE) statistic is the source we recommend, providing estimates of monthly and annual change for the main industrial sectors. AWE was accredited a National Statistic in November 2009, and replaced the Average Earnings Index (AEI) as the lead measure of short-term earnings growth in January 2010.

It measures the changes in employees’ average weekly earnings. AWE is based solely on the Monthly Wages and Salaries Survey (MWSS), which covers employees working in businesses with 20 or more employees in all industrial sectors in Great Britain (an adjustment is made for smaller businesses).

As well as tracking changes in earnings, AWE makes an explicit estimate of earnings in pounds. Separate estimates are made of bonus and arrears pay. AWE uses employment weights that are recalculated every month. This means that AWE reflects the composition of the workforce at any given time, and that changes between months capture shifts in the workforce as well as changes in earnings within industries. These are both advantages over the former AEI measure, and two of the main reasons for the switch between the measures.

AWE is used to produce figures for the economy as a whole, and by sector and industry. The AWE uses the number of employees on an employer’s payroll as its denominator, so changes in the number of paid hours worked (assuming the pay rate per hour stays the same) will show as an increase or decrease in average earnings as appropriate.”

Once again, the composition effect is barely mentioned. Indeed, as the passage on the advantages over the AEI shows, the fact that the figures include the composition effect is seen as an advantage! This might well be true when viewed from the perspective of a macro economist looking at earnings in the economy at large in the context of GDP, but it is most certainly not the case when thinking – as most users do – about what’s happening to individuals or people. I will write more about the takeover of the ONS by economists and the impact on data in future blogs.

The other main insight from that section is the distinction between short- and long-term uses of the data. It is obvious that a monthly index (AWE) will be the preferred measure for looking at short-term changes – it has to be better than an annual series (ASHE)! But the note makes it clear that ASHE is to be preferred when it comes to analysing “trends in earnings over long periods”. But what is “long”? The ONS does not say. Given that ASHE has only been around since 2004 (see this ONS note) and its predecessor (NES) since 1970 (see some history here), it seems clear that analysis over, say, the period since 2004 should not be based on AWE data. Despite this, the ONS itself has Figure 9 in its monthly release (reproduced above) covering a 12/13-year period.

I might be wrong but I cannot see a link in either the guide to labour market statistics or the monthly release of labour market data to the ONS paper comparing the AWE with ASHE. This important paper, entitled “An overview of and comparison between Annual Survey of Hours and Earnings (ASHE) and Average Weekly Earnings (AWE)” was published in September 2017.

In section 2, “Overview”, the report says:

“The ASHE headline measure is not solely a measure of rates of pay and can be affected by changes in the composition of the workforce. For instance, all other things being equal, an increase in the relative number of employees in highly paid industries will cause average earnings in ASHE to rise. ………

Average Weekly Earnings (AWE) is our lead indicator of short-term changes in earnings. …….. AWE, for any given month, is the ratio of estimated total paid in wages and salaries for the whole economy, divided by the total number of employees. Therefore, AWE is the mean rate, as opposed to the headline median estimate reported in ASHE. As with ASHE, AWE is not a measure of rates of pay and can be affected by changes in the composition of the workforce.”

[Chart from the ONS overview and comparison of ASHE and AWE.]

It seems likely that ASHE will be less affected by composition changes (as it is a median measure, not a mean) but the ONS is saying that both are affected. I can see no ONS analysis of the impact of composition changes on the mean versus the median.
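
In the absence of such analysis, a rough simulation can at least indicate the likely direction. The sketch below (Python, using an invented lognormal-ish distribution of weekly wages, so purely indicative) adds a block of low-paid entrants to a baseline workforce and compares how far the mean and the median move.

```python
# Rough, illustrative simulation only: invented wage distribution, not ONS data.
# It compares how the mean and the median respond when low-paid entrants join.
import random
from statistics import mean, median

random.seed(1)
baseline = [random.lognormvariate(6.2, 0.45) for _ in range(10_000)]  # toy weekly wages
entrants = [min(baseline)] * 1_000                                    # 10% more jobs, all at the lowest wage
after = baseline + entrants

for name, stat in (("mean", mean), ("median", median)):
    print(f"{name:6s} changes by {stat(after) / stat(baseline) - 1:+.1%}")
```

On these particular assumptions the mean falls by more than the median, consistent with the intuition that a median measure is less sensitive to low-paid entrants, but the size of the gap depends entirely on the shape of the wage distribution.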

In section 3, “What do ASHE and AWE measure?”, the note boldly says that neither measure shows “pay increases”. This is despite “pay increases” being the main use of the statistics in media and political debate.

“A common misconception of the headline Annual Survey of Hours and Earnings (ASHE) and Average Weekly Earnings (AWE) figures is that they are measures of pay rises. However, they are both designed to estimate earnings of all employees in the economy at a single point in time and therefore measure the change in average pay, rather than the average of changes in pay. This is the reason why average pay can increase without anyone having had a pay rise.

As noted in the overview of the two sources, both ASHE and AWE earnings estimates can be affected by changes in the composition of the workforce, for example:

  • increases or decreases in the number of part-time or full-time employees
  • changes to the number of hours employees work
  • employees entering or leaving the workforce

These changes in the workforce mean that the averages may not be measuring changes within each series on a “like for like” basis.”

There is nothing wrong with this description but it is not balanced. First, instead of saying “This is the reason why average pay can increase without anyone having had a pay rise”, it might be better to say “This is the reason why average pay can decrease despite most people in work having had a pay rise”, as that is what is happening now. Second, where it says “the averages may not be measuring changes within each series on a “like for like” basis”, it would be better to say “the averages are not measuring changes within each series on a “like for like” basis”. That said, given this statement, it is surprising that UKSA does not act when AWE is used to describe pay rises.

Section 11, Annex B, of the note gives a worked example demonstrating compositional effects. Oddly, the example given is of a five-person company where the lowest-paid person leaves. It shows that “The average weekly pay ….. is 14% higher even though everyone who was working in the company in 2015 worked the same hours and earned the same hourly pay as in 2014.” An example showing the addition of low-paid or part-time workers to the workforce would have been more representative of recent years.

Critically, it is disappointing that there is no attempt in the note to offer actual data on a like-for-like basis. ASHE can do this, though it might not be possible for AWE. The failure of the ONS to address this issue is curious. Given the awareness of the issue, it perhaps reflects a nervousness about showing that the most used measure, AWE, is not what it seems to be.

Section 15, Annex F, gives some actual growth rates in earnings as shown in the two series. They are strikingly different at times and perhaps ought to act as a warning not to make too much of the figures from one year to the next. During the 2000s there was only one year when the annual growth rates of the two series were not within one percentage point of each other. Since then the rates have been all over the place. In 2011, ASHE grew by just 0.4% while AWE rose by 2.1%. The roles were reversed in 2014 when ASHE was up by 0.2% and AWE fell by 1.7%. The picture of earnings growth is truly uncertain.
