PMO review: Statistical clarity is a must for successful governance
Summary
- India’s statistical system is under the PMO’s lens. Given recent data dissonance, it clearly needs a relook. The country’s view of itself is too hazy in these digital times of sharp resolution. For reforms, autonomy is key.
It is axiomatic that a country needs high-quality data to guide policy, a clear picture of itself created by statistics, and it is ironic that we have fallen behind the curve on ensuring this in a digital age of precision. In our early decades of freedom, India was known for innovation on this score, given the challenges of placing a vast land of mass poverty and tricky diversity under the lens for data collection, slicing and dicing. Today, however, we have a hazier view of our progress than we should in the 2020s, given the modern tools available, with official readings of key variables contested by critics. To be sure, no complex formula captures the absolute truth. The best we can hope for is a high-resolution snapshot, rather than one that is too blurry (or badly distorted by dodgy inputs), as an approximation of reality. Even so, statistics need to be as robust as they can be made for their utility as policy inputs to be maximized and the aims of governance to be met. Hence, it is no surprise that the Prime Minister's Office (PMO) has embarked on a review of India's statistical system, as reported by Mint on Wednesday. The initiative is based on a paper prepared by the Economic Advisory Council to the Prime Minister.
At a basic level, some data dissonance is driven by the long delay in this decade's covid-disrupted census, with projections by global agencies taken as the basis for news of our population having exceeded China's, rather than a recent headcount of our own, a figure from which other numbers could be derived more reliably. Since output-per-head is a critical variable that tracks people's economic well-being, we need accuracy on the country's gross domestic product (GDP) too. As any GDP estimate is formulaic, with its feedstock taken from multiple sources, it is subject to input-quality hazards. The government's 2015 GDP update, which adopted the 2011-12 base series, was trailed by criticism, some of it over how well our informal sector was being captured, with proxy data alleged to be vastly exaggerating its output. This critique grew sharper as expert views diverged over the impact of the 2016 currency switch on cash-reliant businesses, and it has caught fresh wind from today's debate over a V-versus-K shaped recovery from the pandemic.
Annual GDP numbers are also prone to a loss of confidence on account of revisions that can stretch over three years; so, apart from a relook at how we measure national income, faster data clarity would help, while the GDP deflator used to convert nominal data into inflation-adjusted numbers may call for a tweak to better reflect rising costs across the board. This would mean a look at our price indices, with a producer price index under due consideration. Other metrics, surveys and dashboards will be put under scrutiny too. As continuity on basic trackers would permit cleaner comparisons with the past, sharp snap-offs from old records are best minimized. Even estimates of poverty have been riddled with controversy. Recall how leaked findings of a National Sample Survey on consumption had fanned suspicions of an adversity cover-up half a decade ago. Official numbers have seen an upswell of sceptics on other counts too.
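The deflator arithmetic mentioned above is simple enough to sketch. The snippet below is a minimal illustration of how a deflator converts nominal output into inflation-adjusted output; the figures used are hypothetical, not official data, and the function name is ours.

```python
# Illustrative sketch of GDP-deflator arithmetic. A deflator is a
# price index (base year = 100); dividing nominal GDP by it strips
# out price inflation. All numbers here are hypothetical.

def real_gdp(nominal_gdp: float, deflator: float) -> float:
    """Real GDP = nominal GDP / (deflator / 100)."""
    return nominal_gdp / (deflator / 100.0)

# Suppose nominal GDP is 200 (index units) and prices have risen
# 25% since the base year, i.e. the deflator reads 125.
print(real_gdp(200.0, 125.0))  # 160.0 in base-year prices
```

The point the editorial makes follows directly: if the deflator understates how broadly costs have risen, the division removes too little inflation and real growth is overstated.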
To secure the credibility of future revisions, the system demands impeccable transparency.
The efficacy of governance eventually rests on the clarity we obtain on varied aspects of progress. For top-level leadership, a clear view of reality is crucial (and optical illusions risky). All said, a sound statistical apparatus is a must, and that’s a function of its autonomy.