In my last blog post, I showed that while governments are increasingly using the technology to demonstrate that services are improving, their efforts risk being undermined by “gaming” – in other words, fiddling the performance statistics to make things look better than they really are.
I focused on the problem in the UK. In this post, I look at what the UK has and hasn’t done to address it, and what we can learn from that.
The first thing the UK did to address gaming was to put the Office for National Statistics on an independent footing, reporting to Parliament rather than to the Executive. This was not a response to the problems with crime data, but to broader complaints going all the way back to the 1980s that official statistics were being manipulated. For example, the definition of unemployment changed 29 times during the 1980s. The suspicion that some of the changes were politically driven was hard to avoid.
The second thing was to give the UK statistics body a role in regulating the statistics that public agencies produce. (My earlier blog post described how the Authority withdrew its gold-standard “National Statistics” designation from police-recorded crime data.)
The third thing – again, not a specific response to the crime data problem – is that since 1981 there has been an independent crime survey, based on citizens’ self-reports. It is regarded as more reliable than police data because it includes crimes that citizens suffer but do not report to the police (sex offences, for example, are heavily underreported), and because the reports are not filtered by the police.
After that came the report into crime data by the police watchdog, HMIC, discussed in the previous post. It made thirteen recommendations, including training for police officers on how to record crimes accurately, and advice that performance targets, personal performance appraisals, and decisions about remuneration or promotion should no longer be based on police-recorded crime data.
How have the UK police responded to the HMIC report and other criticisms? One telling statistic is that although crime is reported to have dropped in the latest annual independent crime survey, it is reported to have increased in the police’s own data. It appears that the police have already taken steps to record crimes that they used to leave out. This does not mean the problem is wholly solved, however. Critics have suggested that even the independent crime survey does not give an accurate picture: crime may not have dropped as much as the statistics suggest, because criminals may have migrated from crimes like burglary to the more lucrative pastures of cybercrime, which even the survey’s authors admit they have not yet found a way of capturing.
What do we learn from all this?
- Service performance data is not a neutral arbiter. In the UK, just as in Malaysia and the US, management by numbers has created a perverse incentive to fiddle the statistics, which risks undermining the entire initiative. Any government that goes down this road without taking steps to minimize gaming is acquiescing in data fiddling. It is also shooting itself in the foot, because it is undermining the public confidence that management by numbers is supposed to increase.
- What is needed are measures to ensure the integrity of the statistics. Government statistics agencies should be independent, and should be able to regulate the statistics gathered by line agencies.
Management by numbers is not the only way to improve public services. Competition between service providers, good old-fashioned professional development through training, and other approaches all have their place. But the final message of the Malaysian, American and British experience is that governments that go down the numbers road to improve service performance will lose public confidence unless they recognize that there will be gaming – and unless they stamp on it.