Decomputerize to decarbonize: A climate debate we can’t avoid

The CoP-26 conference that concluded a fortnight ago was one of the latest desperate attempts by humankind to save our home. Our planet faces an existential threat, with climate change and global warming threatening to make it unliveable in a few short decades. All planets die eventually, consumed by their ageing stars, and so will the earth, in about 7.5 billion years, when the sun swells into a red giant. As a species, we have hastened the demise of our liveable world considerably by accelerating climate change to dangerous levels. News reports and most scientific literature blame a few common factors for how we have managed this feat: the use of vast amounts of concrete, vehicular exhaust, industrial pollution, air travel, and even cows belching out copious quantities of methane.

However, one huge factor ruining our planet seems missing from this conversation: unfettered computerization and technology adoption. Admittedly, this assertion might seem odd coming from a practitioner, and sometimes proselytizer, of technology. But it was an article by Ben Tarnoff in The Guardian (bit.ly/3HIbgeG) that set me thinking along this path, and Kate Crawford’s unputdownable Atlas of AI that drove the issue home.

Let’s look at a few facts that Tarnoff, Crawford and others, like Nathan Ensmenger, allude to. A recent UN study revealed that the manufacture of one desktop computer takes 240kg of fossil fuels, 22kg of chemicals and 1,500kg of water. A University of Massachusetts team calculated that training one model for natural-language processing, the branch of artificial intelligence (AI) that helps ‘virtual assistants’ like Alexa understand you, emits 626,155 pounds of carbon dioxide, roughly what 125 New York–Beijing round trips would produce. As models get more complex, this goes up exponentially: OpenAI estimates that the computing used to train a single AI model is increasing by a factor of 10 every year.

Then there is the cloud, where the giga-loads of data generated by our computers mysteriously waft up to be stored. This ‘cloud’ is actually the hundreds of data centres that Google, Microsoft and others have dotted our planet with, and they guzzle water and power at alarming rates. Tarnoff reports that data centres consume about 200 terawatt-hours of electricity a year, roughly as much as South Africa, and that this could grow 4-5 times by 2030, which would put them on a par with Japan, the world’s fourth-biggest energy consumer. “The cloud,” says Crawford, “is made of rocks and lithium brine and crude oil.” Even this pales in comparison with what makes up the guts of a computer, and what is in acute short supply these days: semiconductor chips. A ‘fab’ unit takes about $20 billion to build and needs 2-4 million gallons of ultra-pure water a day, roughly the needs of an American city of 50,000 people. In fact, reports Crawford, the carbon footprint of the world’s computational infrastructure has matched that of the aviation industry at its peak, and it is increasing at a faster rate.
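For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch in Python. It assumes only the figures cited above; the per-trip emission is simply what those figures imply, not an independently sourced number.

    # Back-of-envelope arithmetic using only the figures cited in this column.
    TRAINING_EMISSIONS_LBS = 626_155   # UMass estimate: one large NLP model
    ROUND_TRIPS = 125                  # New York-Beijing round trips, per the article

    lbs_per_trip = TRAINING_EMISSIONS_LBS / ROUND_TRIPS
    print(f"Implied CO2 per round trip: {lbs_per_trip:,.0f} lbs")  # ~5,009 lbs

    # OpenAI's estimate: compute for a single model grows ~10x every year,
    # which compounds to a 100,000-fold increase over five years.
    print(f"Growth over 5 years: {10 ** 5:,}x")

    # Data centres: ~200 TWh/year today, projected to grow 4-5x by 2030.
    print(f"Projected 2030 demand: {200 * 4}-{200 * 5} TWh/year")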

So, to decarbonize, says Tarnoff, we must decomputerize. And I reluctantly agree with him. Before you accuse me of being a Luddite, let me add that this does not mean getting rid of computers, but only the unnecessary ones. Think about your home and the number of computers it has: laptops and PCs, mobile phones, voice assistants, smart ‘things’. Do we really need them all? Consumers, corporations and governments are digitizing everything. Cisco estimates that there will be 29 billion networked devices by next year. Each one requires energy, minerals and water, and each produces data for an ever-expanding cloud.

Therefore, says Tarnoff, we actually need a “Luddite revolution”: “Digitization doesn’t just pose a risk to people, however. It also poses a risk to the planet. Digitization is a climate disaster: if corporations and governments succeed in making vastly more of our world into data, there will be less of a world left for us to live in.”

Thomas Watson, the longtime chief of IBM, is famously (if perhaps apocryphally) credited with saying that the world had a market for, at most, five computers. While even a Luddite would not dream of going back that far, perhaps we should return to what Bill Gates once envisioned, “a computer on every desk”, and stop there? A single computer per desk seems laughable now, with computers dotting almost every available square foot of our living spaces. It is time to debate the uninhibited growth of technologies that we consider ‘clean’ but that might instead be speeding up the demise of our home planet.

Jaspreet Bindra is the chief tech whisperer at Findability Sciences and is studying AI, Ethics and Society at Cambridge University.
