A data-rich approach to tackling Delhi’s worsening air quality
Post-Diwali, the air quality (AQ) in Delhi took a massive plunge, and people came up with many solutions to this problem. The sad reality is that many of these are unlikely to help and there is little evidence that any single method will make any real difference.
We first need to establish the data and the methods to show the costs and benefits of interventions. This can provide the hard evidence for coercive measures or out-of-the-box innovative solutions. In this article, we outline a way to move away from the data-poor, seat-of-the-pants style of decision-making.
AQ is measured by various parameters (particulate matter 2.5, or PM 2.5, is the focus most of the time, but nitrogen oxides (NOx) can be more dangerous to a pollution-sensitive individual’s health). The AQ Index (AQI) measures the quality of a city’s air and is used by the authorities to take action, such as asking some emitters (like construction companies) to reduce activity or declaring holidays for schools. Not all emitters are equivalent. Diesel cars are worse on NOx, but power plants (there are five within the National Capital Region, and NTPC Ltd’s Badarpur plant is a particularly bad emitter) are the worst on particulate matter.
AQ depends on meteorological activity and on the concentration and types of emitters. North India suffers fog during winter due to natural meteorological phenomena. AQ is not a stable or slow-changing phenomenon. It may surprise many that an AQ parameter value such as PM 2.5 of 100 micrograms per cubic metre (µg/m³) can vary by 200% or more at the same place within 24 hours, and by 600-800% across seasons. The stock market, which most of us consider very volatile, varies by less than 12% a day for most individual stocks and 5% at the level of the Nifty index on most days. Quantifying such a volatile phenomenon requires a rigorous statistical basis.
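The intraday swing described above can be made concrete with a small sketch. The readings below are hypothetical hourly PM 2.5 values, chosen only to illustrate how the peak-to-trough change at one location can exceed 200% in a single day:

```python
# Hypothetical hourly PM 2.5 readings (µg/m³) at one location over 24 hours.
# These numbers are illustrative, not measured data.
hourly_pm25 = [60, 55, 50, 48, 52, 70, 110, 150, 130, 100, 90, 80,
               75, 70, 72, 85, 120, 160, 180, 150, 120, 100, 80, 65]

lo, hi = min(hourly_pm25), max(hourly_pm25)
# Peak-to-trough change, expressed relative to the daily low.
swing_pct = 100 * (hi - lo) / lo
print(f"min={lo}, max={hi}, intraday swing={swing_pct:.0f}%")  # swing=275%
```

A stock that moved like this would be considered wildly volatile; for urban air it is routine, which is why point-in-time readings mislead.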
Unfortunately, our public policy decision-making, even at the honourable Supreme Court and the National Green Tribunal, is riddled with “expert” opinions that lack adequate data and impose a lot of pain and cost on the whipping boy of the moment but do not cure the problem. Digital India will fail to deliver on its promise unless regulators and public policy advocates abandon the data-poor, seat-of-the-pants approach and embrace a data-rich Plan-Do-Check-Adjust (PDCA) method.
We need a statistical approach that uses granular data on a grid of 1-10 sq. km, analysed by the hour across the city over two or more years, to really see what is happening and derive the root cause. There is a seasonal factor as well. There are secular factors, such as growth in the human population, construction activity and the car population, that need to be considered before assessing cause and effect or forecasting with any degree of certainty. For those interested, do read the original Indian Institute of Technology-Kanpur report or a commentary by the Centre for Science and Environment (see report). While these reach firm conclusions, they are based on a very small number of measurements (the IIT-K study used six of 21 official AQ measurement sites in Delhi, reporting 24-hour summaries) over a short period (only eight months, November 2013 to June 2014). The report makes many assumptions and reaches conclusions that could do with more rigour. We also need some estimate of costs and benefits. Please peruse the excellent website of Dr Sarath Guttikunda (see report), which explores approaches to forecasting AQI from a survey of emitters and suggests a minimum of 50 measurement centres for Delhi to make statistically valid estimates. It also has a good visualization of AQ across time (see report).
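The per-cell, per-hour baseline this kind of analysis rests on is simple to sketch. The snippet below groups hypothetical sensor readings by grid cell and hour of day (cell names and values are invented; in a real system the cell would come from mapping a sensor’s coordinates onto the 1-10 sq. km grid):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical readings as (grid_cell, hour_of_day, pm25) tuples.
readings = [
    ("cell_A", 8, 140), ("cell_A", 8, 150), ("cell_A", 20, 180),
    ("cell_B", 8, 90),  ("cell_B", 20, 110), ("cell_B", 20, 120),
]

# Group readings by (cell, hour) so each location gets its own diurnal profile.
grouped = defaultdict(list)
for cell, hour, pm25 in readings:
    grouped[(cell, hour)].append(pm25)

# The per-cell, per-hour mean is the baseline an intervention is judged against.
hourly_mean = {key: mean(vals) for key, vals in grouped.items()}
print(hourly_mean[("cell_A", 8)])  # 145
```

With two or more years of such baselines, the effect of a no-car day or a construction ban shows up as a deviation from the expected value for that cell and hour, rather than being lost in the daily city-wide average.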
In the past, the number of measurements possible was very small for cost reasons, and the mathematical models used to project from those measurements were fairly poor as well. This is also the story of weather, or meteorological, forecasting. However, with many more sensors and hourly measurements, forecasting accuracy can be greatly improved. In a three-year project, Southwest and other airlines participated in a meteorological data collection and reporting system (MDCRS), fitting their airplanes with weather stations and collecting parameters at 15-minute intervals. For the first time, weather forecasting got very granular data. The airlines were able to keep airports open and fly in and out with better forecasts than the Federal Aviation Administration (FAA) advisory (see report).
For the NCR, we should cover an 80 by 80 km area with pollution sensors and weather stations (measuring sunlight, relative humidity, wind flow, etc.), ideally one per 1-3 sq. km cell in dense high-rise areas and at least one per 6-10 sq. km in suburban areas, and collect data at hourly or shorter intervals across seasons to provide the big data needed to show the difference interventions make. The concept of reporting air and water quality at 15-minute intervals has already been put in place for the larger polluting industries by the NGT.
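A back-of-the-envelope count shows the scale this implies. The split between dense and suburban area below is an assumption for illustration; only the 80 x 80 km extent and the per-cell densities come from the text:

```python
# Rough sensor count for an 80 km x 80 km NCR grid.
total_km2 = 80 * 80                    # 6,400 sq. km overall

# Hypothetical split: assume a quarter of the area is dense high-rise.
dense_km2 = 1600
suburban_km2 = total_km2 - dense_km2   # 4,800 sq. km

sensors_dense = dense_km2 // 2         # ~1 sensor per 2 sq. km (within 1-3 range)
sensors_suburban = suburban_km2 // 8   # ~1 sensor per 8 sq. km (within 6-10 range)
total_sensors = sensors_dense + sensors_suburban
print(total_sensors)  # 1400
```

Even at these coarse assumptions, the network runs to over a thousand stations, which is why the low-cost sensor strategy discussed later matters.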
Once we have a baseline, we can say what difference an industrial holiday, a suspension of construction or a no-car day makes, where, and for how long. Is the burning of biomass for heat on winter nights a major culprit? These are vital questions to ask to justify an intervention.
In addition, we should conduct experiments to clean the air. As technologists, we favour incentives and cures, given that by the time we have an AQI alert, things could already have got very bad. The current clean-air approach is somewhat flawed. Emission norms are set by source; the city, however, suffers not from any particular source alone but from the aggregate across all sources. It is difficult to know in advance what the concentration of emitters will be. Indian cities grow very rapidly and have proven immune to top-down planning or management. A spurt in the use of diesel gensets or brick kilns on the outskirts can happen very fast. We need a method to correlate cause and effect, which the current daily summary of AQ from 6-9 points in a metropolis like the NCR simply does not provide.
The authorities need to monitor a large regional space in and outside the NCR. This is not easy to police or to ensure compliance with. However, an open public portal with a large number of data points on AQ can bring some degree of transparency and institutional checks on compliance.
What can a hospital, a school or a mall do to improve AQ?
Rain and trees are well-accepted methods of cleaning air. Biotechnology can make urban forests much more effective: a small barren patch of land can be converted into an urban forest within three years. Eco-entrepreneur Shubhendu Sharma is creating mini-forest ecosystems using an accelerated method based on the practices of Japanese forester Akira Miyawaki, as well as on Sharma’s own experience, gleaned from his former career in car manufacturing. Watch his TED talk.
CityTree, from GreenCitySolutions, is a biotechnology-accelerated freestanding enclosure intended for locations with dense foot traffic. Other researchers have been able to increase the effectiveness of photosynthesis tenfold. These efforts hold much promise and should be encouraged.
Scrubbing air clean in public places is energy-expensive. Malls are trying. Scrubbers can also be fitted in vertical wind turbines; these clean a smaller space and can be experimented with.
The IoT (Internet of Things) Forum has a proposal for a PMI (particulate matter index) Challenge which could establish this within 3-4 months using the current IoT network based on LPWAN (low-power wide-area network) technology. We propose to use a small number (20) of fixed and mobile, high-quality and expensive sensors and overlay them with a large number of cheaper, less accurate fixed sensors (600+ at light poles at road junctions) and crowdsourced measurements. The reason for the two types of sensors is that EPA (Environmental Protection Agency)-grade accurate sensors are expensive, while the lower-quality sensors often drift and give not-so-accurate results under varying environmental conditions. Co-locating the EPA-grade sensors with the cheaper ones enables software calibration and correlation of the gathered data for better overall accuracy across the city. There are a number of start-ups in this space, such as PAQS, Aurrasure, Oizom, Brisa and Yuktix. There is a growing citizen movement worldwide to measure and report AQ (CITISENSE in Europe), and open data communities (including in Bengaluru) can pitch in with analysis. Using big data analytics, we can calibrate for relative change and produce actionable conclusions. This can also engage residents in experiments and draw corporate social responsibility (CSR) funds for local actions to improve PMI. The sensor network can be expanded later to cover vehicle density, noise pollution, etc. This is the base sensor-data grid for a Smart City.
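The software calibration mentioned above can be sketched as a simple linear correction: fit cheap-sensor readings against a co-located EPA-grade reference and apply the fitted line to future raw readings. The numbers below are made up for illustration, and real calibration would also model drift and humidity effects:

```python
# Co-located readings taken at the same times (all values hypothetical).
cheap = [35, 60, 90, 120, 160]       # low-cost sensor PM 2.5 readings
reference = [40, 70, 105, 140, 185]  # EPA-grade reference readings

# Ordinary least-squares fit of reference = slope * cheap + intercept.
n = len(cheap)
mx, my = sum(cheap) / n, sum(reference) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(cheap, reference))
         / sum((x - mx) ** 2 for x in cheap))
intercept = my - slope * mx

def calibrate(raw):
    """Map a raw cheap-sensor reading onto the reference scale."""
    return slope * raw + intercept

print(round(calibrate(100), 1))  # ≈ 116.1
```

Periodically refitting this correction against the 20 reference stations is what lets a 600+ node network of drifting low-cost sensors still report citywide relative change usefully.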
Current regulation and public policy are based on a data-poor mindset and can be dramatically improved in a modern IoT world. Data-rich wireless sensor networks have delivered 12-25% improvements in world-class factories that barely eke out 2-6% with Six Sigma or ISO (International Organization for Standardization) methods. The potential benefit in data-poor city and public policy decision-making is probably even greater.
Arvind Tiwary is chair of the TiE IoT Forum, co-chair IOTNEXT and leads Work Group 3 in IoT for Smart City Forum (IOT4SCTF).