Is Mysuru really the cleanest city in India?

The Union ministry of urban development has set the gold standard for how not to report the results of a survey-and-ranking exercise, in its recent release on the Swachh Bharat Rankings.

The press release, put out on the PIB website, not only omits the full rankings but is also rife with mistakes (Dimapur is included in the capital cities list, but Kohima is not, for example) and makes no attempt to explain how the rankings were arrived at.

Building a ranking system is not a trivial process. First, we need to decide on the criteria that will go into the system, and then on how each criterion will be measured. Following that, we need a framework for aggregating scores (a simple average or sum is not appropriate in most cases), taking care that we are not indulging in double counting. We might have to assign weights to the various criteria, and however carefully and thoughtfully the system is designed, there is always scope for arbitrariness and subjectivity; rankings thus produced are seldom unanimous.
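To make that aggregation step concrete, here is a minimal sketch in Python. The criteria, weights and city scores are entirely invented for illustration; none of them come from the ministry's survey. It normalizes each criterion to a common scale before weighting, precisely because a naive sum lets criteria measured on larger scales dominate the total:

```python
def rank_cities(scores, weights):
    """Aggregate per-criterion scores into a weighted total and rank cities.

    scores: {city: {criterion: raw_score}}, higher is better.
    weights: {criterion: weight}, assumed to sum to 1.
    """
    criteria = list(weights)
    # Min-max normalize each criterion to [0, 1] so differently
    # scaled criteria do not dominate the total by accident.
    lo = {c: min(s[c] for s in scores.values()) for c in criteria}
    hi = {c: max(s[c] for s in scores.values()) for c in criteria}
    totals = {}
    for city, s in scores.items():
        totals[city] = sum(
            weights[c] * (s[c] - lo[c]) / (hi[c] - lo[c] or 1)
            for c in criteria
        )
    # Highest weighted total ranks first.
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical inputs, invented purely for illustration.
weights = {"waste_mgmt": 0.5, "water_quality": 0.3, "open_defecation_free": 0.2}
scores = {
    "CityA": {"waste_mgmt": 70, "water_quality": 80, "open_defecation_free": 90},
    "CityB": {"waste_mgmt": 90, "water_quality": 60, "open_defecation_free": 70},
    "CityC": {"waste_mgmt": 60, "water_quality": 90, "open_defecation_free": 80},
}
print(rank_cities(scores, weights))  # → ['CityA', 'CityB', 'CityC']
```

Shift enough weight from water quality onto waste management and a different city comes out on top with the very same raw scores, which is exactly the arbitrariness warned about above.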

In this context, when the results of a ranking exercise are released, the least that can be done is to explain the assumptions and process behind the ranking, and how the results were arrived at. Even if the reader does not necessarily agree with the ranking, the process thus detailed can help the reader decide "how much salt" to add, and provides a lower bound for the trust the reader develops in the ranking system.

Against this backdrop, the Swachh Bharat Rankings fall short on several levels. The press release mentions that the rankings were based on a survey of "total sanitation practices covering a set of parameters, including the extent of open defecation, solid waste management, septage management, waste water treatment, drinking water quality, surface water quality of water bodies, mortality due to water-borne diseases, etc." Beyond that, however, there is little information on what the survey asked, or how the results were aggregated or tabulated.

Writing in Election Metrics last year, I had mentioned that one reason Indian pre-poll surveys are not credible is that they don't give out enough information to build the reader's confidence. In that context, I had said pollsters should put out data such as the sample size, the methodology for choosing the sample, the interview questionnaire and the overall error in the estimation of forecasts.
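To illustrate what disclosing "overall error" amounts to, here is a back-of-the-envelope sketch using the standard margin-of-error formula for a proportion estimated from a simple random sample. The sample size is invented; this is textbook arithmetic, not anything the pollsters or the ministry published:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for an estimated proportion p from a
    simple random sample of size n (z = 1.96 for 95% confidence).

    p = 0.5 is the conservative worst case, maximizing p * (1 - p).
    """
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical sample of 1,000 respondents: roughly a
# +/- 3.1 percentage point margin at 95% confidence.
print(round(100 * margin_of_error(1000), 1))  # → 3.1
```

A single line like this in a press release tells the reader how seriously to take small differences between cities; its absence tells the reader something too.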

Here, the urban development ministry has gone one worse than the worst pollsters in India: not only has it not disclosed anything about how the survey was done, it has also not published the results in full. It is like a pollster conducting a poll and reporting it simply as "Mysore wins" without saying anything about how that result was arrived at.

What makes the survey worse is that its results are not exactly intuitive. While people may not grudge Mysuru’s top spot or Damoh’s (in Madhya Pradesh) bottom spot, the presence of Mandya, Hassan and Bengaluru (all in Karnataka) in the top 10 leads to scepticism among people who have either lived in or been to one of these cities.

While it is not inconceivable for a ranking exercise such as this one to throw up unintuitive results (this is a standard "feature" of all ranking exercises), the lack of disclosure only fuels such scepticism.

One hopes the ministry follows up on this survey soon and provides the necessary details of how it was conducted and how the rankings were arrived at.

Until that is done, however, citizens will treat this survey as a mostly meaningless exercise, and it will raise further questions about the efficacy of the Swachh Bharat Abhiyaan.
