Lessons from history on climate change

The fact that the Paris agreement was ratified at a record pace of less than 12 months shows the widespread consensus behind the need for climate change mitigation today


The success of the Paris climate treaty, which has been covered in detail in these pages, is remarkable by any measure. Given the complexities, vagaries and imprecision inherent in cause-and-effect assessments of climate change, it took a Herculean effort to bring together 196 parties and have them all agree on a potential pathway to mitigate the problem. It was an effort 143 years in the making.

To truly appreciate how we got to this point in December 2015 in Paris, and in the run-up to the next round of major climate change negotiations scheduled to take place in Marrakech later this month, it may help to take a step back and understand the history of both climate change science and international cooperation around the issue. It is one thing, after all, to read about India’s commitment to reduce the emissions intensity of its gross domestic product (the volume of emissions per unit of economic output) by 33-35% from 2005 levels by 2030—which may not mean much to the average citizen—and another altogether to know how all of this came to be.

The very first climate measurement efforts began in 1873. Under the International Meteorological Organization (IMO), chartered that year, a network of weather stations was established and standards for data collection from these stations were put in place. As the number of scientists working in this area grew—though at a grindingly slow pace over a period of four decades—the military and colonial applications of this knowledge became apparent. Knowledge of weather patterns in adversarial terrain stood to give a tremendous military advantage to whichever side had access to the data. Similarly, colonial powers benefited from knowledge of weather patterns that could help them develop efficient agricultural practices suited to specific regions. Money and resources were, therefore, continually invested in the study and development of this science.

By the time of World War II, the military potential of meteorology had been fully realized. The Allied powers, led by the US and Britain, gained a decisive advantage over their German adversaries in the Normandy invasion of 1944 when Allied meteorologists identified a brief break in the stormy weather that the Germans had assumed would persist.

The military relevance of meteorology, therefore, led to the global prominence of the IMO. In 1950, the IMO was reconstituted as the World Meteorological Organization, which became a specialized agency of the United Nations the following year (and still operates under the UN umbrella today), receiving substantial financial and global political support. The flow of research dollars led to a number of scientific initiatives that furthered our understanding of the climate.

The first was the tracking of atmospheric carbon dioxide at Mauna Loa in Hawaii, in the middle of the Pacific Ocean, which laid the basis of modern carbon dioxide monitoring. The second was the setting up of research bases in Antarctica and Greenland, where the drilling of ice cores revealed long-term climate history. The third was the deployment of global satellite systems which, while primarily military in purpose during the Cold War era, let us see and monitor the Earth as a whole for the first time. And the fourth was the use of supercomputers in mathematical modelling of the climate. Together, these developments would provide the underlying basis for the thesis that our climate was rapidly changing due to human activity.

In parallel, concern over human-induced climate change was spilling over from scientific circles to the public at large. Among other factors, the killer smog that enveloped London in 1952, followed by similar smogs in Los Angeles and then in New York in 1966, made people aware that the chemicals spewed into the air by industry had fatal consequences. The Cold War-era spectre of climatological warfare—the use of atomic bombs to disrupt weather patterns—captured the popular imagination as well.

After the end of the Cold War in 1991, it seemed the world could finally prioritize managing climate change, which by this time had been soundly assessed by the scientific community and had become a matter of serious public concern. In 1992, the Rio Earth Summit was convened in Brazil and resulted in a treaty that established the United Nations Framework Convention on Climate Change (UNFCCC). The UNFCCC has since been organizing the Conferences of the Parties (COPs) to find a globally acceptable pathway to mitigate climate change. The Paris conference was the 21st such meeting (COP21).

Two diplomatic peculiarities stand out about the COPs. The first is that every negotiating party has an effective veto; the second is that any agreement must subsequently be ratified by national parliaments, or equivalent national authorities, to come into effect. So not only can a tiny country you may never have heard of, say, Belize, derail everyone else’s efforts at a COP; even if everyone agrees at the negotiating table, the failure to ratify a treaty at home—as happened with the US’s non-ratification of the Kyoto Protocol, negotiated at the third COP—can render all efforts useless.

It has taken 21 rounds of negotiations over a quarter of a century to arrive at a binding agreement. The fact that the Paris agreement was ratified at a record pace of less than 12 months shows the widespread consensus behind the need for climate change mitigation today. In light of the 140-plus-year history of this effort, we have truly come a long way.

Sumant Sinha is chief executive of ReNew Power. Silent Spring will appear every fortnight and look at issues related to the environment, climate change and renewable energy.
