Statistics in shambles: The crisis at the Office for National Statistics

This article examines some of the problems facing the Office for National Statistics and explains why they matter from the perspective of intergenerational fairness.

What is the Office for National Statistics?

As the UK’s largest independent producer of official statistics, the Office for National Statistics (ONS) is a very important institution. It is responsible for collecting and publishing key economic statistics such as the unemployment rate, inflation, productivity, and GDP. But it also produces datasets covering everything from migration, household wealth, and cultural identity to housing, international trade, and crime.

Why does the ONS matter?

A democratic society cannot function well without reliable, timely, and trusted statistics. Governments depend on them to design, implement, and evaluate policies. Businesses need them to make investment and hiring decisions. The Bank of England relies on them when determining whether to raise or lower interest rates. Journalists use them to hold governments to account. Researchers and academics need them to understand the shared challenges we face. When the data are wrong, missing, or late, important decisions turn into guesswork and public debate drifts away from reality.

What has happened to the ONS?

Over the last few years, the ONS has faced widespread criticism for the reliability of key statistical outputs. The Labour Force Survey (LFS) is perhaps the most widely discussed example. The LFS is a large household survey that underpins the UK’s official estimates of employment, unemployment, and economic inactivity.

The saga began in October 2023, when the ONS delayed and then suspended publication of its usual labour-market release based on the LFS. This was due to a significant decline in response rates, which had dropped to just 14.6% in mid-2023, compared with 47.9% a decade earlier. To be sure, the COVID pandemic was partly to blame, as lockdowns forced the ONS to rapidly switch from face-to-face interviews to phone calls.

But with response rates that low, the survey data could not be considered sufficiently representative. At the time, the ONS did publish some headline estimates for unemployment, employment, and inactivity. However, these estimates were based on other ‘experimental’ data sources, which many regarded as just as unreliable.

Since then, the ONS has implemented a series of changes and reforms to improve the quality of its labour market statistics. And yet, despite these well-intentioned efforts, serious concerns have persisted. In May 2024, the chief economist of the Bank of England wrote a letter to the ONS arguing that the changes to the LFS “have not yet led to an improvement” and that it “remains uncertain whether the credibility” of the LFS will improve. In December 2024, the ONS admitted that the new and improved LFS might not be ready until 2027.

There are other notable examples beyond the LFS. In June 2025, the independent Office for Statistics Regulation (OSR) decided to suspend the accredited status of the Wealth and Assets Survey (WAS). The OSR judged that the survey no longer met the standards of trustworthiness, quality, or value required for official statistics status. Indeed, its comprehensive review found that, due to declining response rates and a lack of investment, the WAS was “no longer of sufficient value or quality to meet users’ needs.”

Why does this matter? The WAS, which began in 2006, measures the economic well-being of households across the UK in terms of their assets, savings, debt, and planning for retirement. It is one of the most (if not the most) important sources of information about the distribution of wealth in the UK. So, at a time when public concerns about wealth inequality are on the rise, we currently lack reliable official statistics to accurately understand its nature or extent. Good luck trying to design an effective wealth tax without this data!

Finally, it’s worth emphasising that the LFS and WAS are just two of the most high-profile statistical failures in recent years. As the Financial Times highlighted, 79 different official data series have either been cancelled or decertified since 2010, with 25 of those occurring in the last year alone.

Why is the ONS in crisis?

The ONS was once regarded as one of the most capable and trusted statistical agencies in the world. But, as the above examples make clear, the ONS seems increasingly unable to deliver on its core functions and responsibilities. All of this raises the question: why has the ONS found itself in this predicament?

Well, the answer seems to depend on who you ask. If you ask Sir Robert Devereux, who recently led an independent review of the ONS, he would point to three main factors. First, an understandable push to develop new, cutting-edge statistics came at the expense of “less exciting but nonetheless crucial” core statistics like unemployment. Second, weak internal planning and budgeting resulted in a “divergence between what [teams] were asked to do, and the resources provided to do so.” Third, there was a “reluctance, at senior levels, to hear and act on difficult news.”

If, however, you were to ask Sir Ian Diamond, the former head of the ONS, you would hear a different story. Earlier this month, Diamond told a parliamentary committee that the ONS had essentially been asked to do too much with too little. The ONS was, in his words, “hamstrung” by Treasury funding constraints. The Treasury wanted the ONS to focus on restoring the quality of core economic statistics. However, other departments, such as the Home Office, wanted more and better-quality data on crime.

Others, including the former deputy governor of the Bank of England, Sir Charlie Bean, have argued that some of today’s problems can be traced back to the ONS’s move from London to Wales in 2007. The move was a result of the then Labour government’s plans to decentralise the civil service by shifting jobs out of London and the south-east. But this well-intentioned policy came at a significant cost in terms of organisational capability and leadership. As the Financial Times reported, almost 90 per cent of the ONS’s London-based senior staff chose to quit rather than relocate.

So, who should we believe? It is difficult to say. Perhaps it is reasonable to conclude that some combination of all these factors played a role. It is also important to recognise that some of the problems facing the ONS are not unique to the UK. Around the world, response rates to official government surveys have declined in recent years, though not quite as precipitously as in the UK.

And one final, often overlooked point: the work undertaken by ONS staff is highly complex, technical, and largely thankless. As such, junior statisticians and civil servants should not be blamed for the shortcomings of senior leadership. They are, like many civil servants, presumably doing the best they can under challenging circumstances.

Why does the ONS crisis matter for intergenerational fairness?

The reliability issues with ONS data obviously impair government decision-making and make the Bank of England’s job much trickier. But they also make it incredibly difficult for us to accurately understand the nature or extent of intergenerational inequalities in the UK.

Consider the LFS. This survey contains crucial information on how younger generations are faring in the labour market. It helps us to answer questions such as: how many people aged 16–24 are unemployed or not in education, employment, or training? Or to what extent have the government’s changes to employers’ National Insurance contributions affected labour-market outcomes for younger workers?

Likewise, the WAS underpinned much of IF’s previous research on intergenerational wealth inequalities. In particular, our widely reported finding that more than 3 million people aged 65 and over were living in millionaire households was entirely based on the WAS. As the Institute for Fiscal Studies (IFS) noted earlier this year, unsound methodological changes to the most recent WAS release also resulted in a £2.3 trillion fall in measured pension wealth for 2018–20. This, as the IFS rightly argued, is a highly implausible outcome.

Another pertinent example is the Living Costs and Food Survey (LCF), which provides vital data on household expenditure in the UK. For many years, this survey included a table estimating spending by age, allowing us to better understand how cost-of-living pressures were affecting different generations. However, in last year’s LCF, the ONS decided to remove this table, citing concerns about low response rates and “data volatility.” Once again, this only further constrains our ability to understand how younger generations are responding to changing economic circumstances.

What needs to happen?

Clearly, then, things need to change at the ONS. Sir Robert Devereux’s report contains many sensible suggestions, from restoring a focus on producing core statistics to reforming the leadership structure.

Some further recommendations are likely to flow from the ongoing Public Administration and Constitutional Affairs Committee inquiry into the ONS. But one point is obvious already: if the Treasury wants better statistics, it might need to pay for them.

In the end, better statistics won’t, by themselves, solve Britain’s problems. But we don’t stand a chance of solving these problems without them.

Help us to be able to do more 

Now that you’ve reached the end of the article, we want to thank you for being interested in IF’s work standing up for younger and future generations. We’re really proud of what we’ve achieved so far. And with your help we can do much more, so please consider helping to make IF more sustainable. You can do so by following this link: Donate.

Photo by Mika Baumeister on Unsplash.