Why Ease of Living Index rankings fail to inspire confidence
Updated: 23 Aug 2018, 10:16 AM IST
The recently released Ease of Living Index raises more questions about the quality of life in Indian cities than it answers
Bengaluru/Mumbai: Pune is India's best city to live in, while Delhi ranks among the worst in terms of economic prospects, according to the Ease of Living Index rankings published recently by the union ministry of housing and urban affairs. In terms of safety and security, Bengaluru is among the worst, the Ease of Living report suggests. How much do these rankings reflect the reality of urban life in India? A Mint analysis of the rankings suggests that there are good reasons to be sceptical.
Two key issues cast doubts over the rankings: the arbitrariness in constructing the index and the use of questionable or incomparable data.
The Ease of Living Index rankings are based on 79 indicators, which are grouped under four “pillars”: institutional, social, economic, and physical. Physical services, which include housing, water supply, sanitation, etc., are allotted the highest weight (45%) in determining the city rankings. Economy and employment are together assigned a mere 5% weight.
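The effect of such a weighting scheme can be sketched as a simple weighted sum. The 45% (physical) and 5% (economic) weights are from the report; the institutional and social weights (25% each) and all the city scores below are assumptions made purely for illustration:

```python
# Illustrative sketch of a weighted composite index like the one described.
# Only the 45% physical and 5% economic weights are stated in the article;
# the remaining weights and the pillar scores are hypothetical.
WEIGHTS = {"institutional": 0.25, "social": 0.25, "economic": 0.05, "physical": 0.45}

def composite_score(pillar_scores):
    """Weighted sum of pillar scores, each assumed to be on a 0-100 scale."""
    return sum(WEIGHTS[p] * s for p, s in pillar_scores.items())

# Hypothetical city A: strong economy, weak physical services.
city_a = {"institutional": 60, "social": 60, "economic": 90, "physical": 40}
# Hypothetical city B: weak economy, strong physical services.
city_b = {"institutional": 60, "social": 60, "economic": 40, "physical": 90}

print(composite_score(city_a))  # 52.5
print(composite_score(city_b))  # 72.5
```

With a 5% economic weight, city B comfortably outranks city A despite a far weaker economy, which is exactly the kind of lopsidedness the experts quoted below object to.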
Such a lopsided weighting scheme raises questions about the relevance of the rankings, according to experts.
“(Public) services are basic and important, but the absence of any indicator that speaks of jobs or productivity from the index reduces the value of the index,” said O.P. Mathur, senior fellow and chair, Urban Studies, Institute of Social Sciences, New Delhi. “This renders the survey subjective, as any weightage given to a parameter can be questioned.”
One reason for the low weight assigned to economic prospects could be the lack of data itself.
“In India, cities are not recognized as units of the economy and therefore it becomes very difficult to measure (their economic contribution),” said Srikanth Viswanathan, chief executive officer, Janaagraha Centre for Citizenship and Democracy.
The second issue is the quality of the data used. “The main source of data for the computation of the Ease of Living index involved secondary data, which was collated by city governments from various sources,” the report states. Moreover, for indicators such as access to toilets, the survey says that sample field surveys may be relied upon when data is unavailable with the urban local bodies. The reliance on data drawn from different sources and collated by different bodies raises doubts over comparability. For several indicators, such as construction of toilets, there is no way to verify the numbers beyond what the government or the local bodies report, and such numbers may be biased, said Mathur.
To compare the Ease of Living Index rankings with a more transparent metric, Mint constructed a district-level livability index based on seven key parameters—percentage of households with access to clean drinking water in a district, access to improved toilet, pucca house, clean cooking fuel, electricity, share of children stunted and share of women with more than 10 years of schooling.
The data for each of these parameters was sourced from the National Family Health Survey, which surveyed more than 600,000 households in 2015-16 to arrive at district-wise representative estimates.
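Since the article does not spell out Mint's exact aggregation method, the sketch below assumes an unweighted mean of the seven NFHS parameters, with the one "bad" indicator (share of children stunted) inverted so that higher is always better. The district values are hypothetical:

```python
# Sketch of a district livability score from the seven NFHS parameters named
# above. The unweighted-mean aggregation is an assumption; all values are
# hypothetical percentages.
GOOD = ["clean_water", "improved_toilet", "pucca_house",
        "clean_fuel", "electricity", "women_10yrs_schooling"]
BAD = ["children_stunted"]  # higher stunting means lower livability

def livability_score(d):
    """Mean of the seven indicators, with 'bad' indicators inverted."""
    vals = [d[k] for k in GOOD] + [100 - d[k] for k in BAD]
    return sum(vals) / len(vals)

district = {  # hypothetical district, values in percent
    "clean_water": 92, "improved_toilet": 70, "pucca_house": 85,
    "clean_fuel": 65, "electricity": 98, "women_10yrs_schooling": 55,
    "children_stunted": 30,
}
print(round(livability_score(district), 1))  # 76.4
```

Because every input comes from a single household survey with a uniform methodology, an index built this way sidesteps the comparability problems of data collated by individual city governments.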
The livability index rankings were then compared with the Ease of Living Index rankings for 24 of the largest cities (those with populations above one million) where the corresponding city's population is at least 40% of the district population. The analysis shows very little correspondence between the two sets of rankings.
Cities such as Ludhiana, Srinagar, Bengaluru, Delhi and Amritsar fare far better on the livability index than their Ease of Living Index ranks suggest, while cities such as Jabalpur, Gwalior, Visakhapatnam, Jaipur and Bhopal perform worse on the livability index.
A comparison of the sub-indices of the Ease of Living Index yields similar results.
When we compare the health rankings of cities with child stunting rates in the corresponding districts, we find no correlation at all. The safety ranking is more closely aligned with the murder rate recorded by the National Crime Records Bureau (NCRB), but even here there are some notable exceptions (see the health and crime scatter plots).
The murder rate, rather than the overall crime rate, has been used here since the extent of under-reporting is likely to be lower for murders than for crimes overall.
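A rank-agreement check of this kind is typically done with a Spearman rank correlation, which runs from 1 (identical rankings) through 0 (no relationship) to -1 (reversed rankings). The city ranks below are hypothetical, and the article does not state which correlation measure Mint used:

```python
# Spearman rank correlation between two rankings of the same cities,
# assuming no tied ranks. All ranks below are hypothetical (1 = best).
def spearman(rank_a, rank_b):
    """Spearman's rho: 1 - 6 * sum(d^2) / (n * (n^2 - 1))."""
    n = len(rank_a)
    d2 = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

ease_of_living = [1, 2, 3, 4, 5]  # ranks under one index
livability     = [3, 5, 1, 2, 4]  # ranks under the other
print(round(spearman(ease_of_living, livability), 2))  # -0.1
```

A value near zero, as in this made-up example, is what "very little correspondence" between two rankings looks like numerically.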
The bottom line: the methodological issues underlying the Ease of Living Index need to be addressed for it to be taken more seriously.
More transparency in explaining the rationale for the different weights, and in sharing disaggregated data, would help build a more credible index. Future versions of the Ease of Living Index could be improved if these changes are made and the reliance on questionable data sources is reduced.