Opinion | The automation anxiety looming over the world

The possibility of a software system being held responsible for the latest Boeing 737 Max crash raises a question: Has technology been given too much control too soon?

In an algorithmic world, it is natural to feel overwhelmed by Artificial Intelligence (AI)-powered automated systems, whether in planes, cars or our homes. However, the crash of a Boeing 737 Max and the subsequent grounding of this aircraft model in India, China and many European countries, including Britain, Germany and France, have made many people wonder if we are adopting newfangled technologies prematurely, or relying too much on computer software to keep us safe. Can automation backfire in cockpits, cars or, for that matter, daily-use devices? The Ethiopian Airlines disaster of 10 March was preceded by a similar Lion Air crash in October 2018 involving an aircraft of the same make. While it may be too early to identify the exact cause of the latest crash, it’s worth recalling that India’s Directorate General of Civil Aviation (DGCA) had asked local carriers to check the Boeing 737 Max’s sensors after the earlier one.

A November 2018 article in The Air Current noted that Boeing had “quietly added” a new Maneuvering Characteristics Augmentation System (MCAS) “to help pilots bring the plane’s nose down in the event the plane’s angle of attack drifted too high when flying manually, putting the aircraft at risk of stalling”. However, the article added that the system could be deactivated if “pilots trim the aircraft manually to override the MCAS’s attempt to automatically pitch the jet’s nose down”. This system is at the core of the current anxiety over these planes. There is also speculation that Beijing swiftly grounded the 737 Max because Boeing did not respond to its demand for a full disclosure of the MCAS’s technical innards; given that China too is trying to strengthen its aviation industry and market its C919 aircraft globally, this could be a case of point-scoring in an image battle over safety. But that does little to soothe the nerves of flyers. Boeing, for its part, has acknowledged that it has been enhancing the flight control software of the 737 Max, an exercise that includes updates to the MCAS flight control law, pilot displays, operation manuals and crew training. The company claims, though, that the MCAS does not control the aeroplane in normal flight but improves its behaviour in extraordinary circumstances.
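To make the reported behaviour concrete, consider a minimal, purely illustrative sketch of the kind of control logic The Air Current describes: automated nose-down trim triggered when the angle of attack drifts high during manual flight, with the pilot’s manual trim input overriding it. All names, thresholds and structure here are hypothetical; this is not Boeing’s implementation.

```python
# Purely illustrative sketch of MCAS-like behaviour as described in
# The Air Current article. All names and thresholds are hypothetical,
# not Boeing's actual flight control law.

AOA_STALL_RISK_DEG = 14.0   # hypothetical angle-of-attack threshold
NOSE_DOWN_TRIM_STEP = 0.6   # hypothetical trim increment, in degrees

def augmentation_command(aoa_deg: float,
                         flying_manually: bool,
                         pilot_trim_input: bool) -> float:
    """Return a nose-down trim command (negative value), or 0.0."""
    # Per the article, manual trimming by the pilot overrides the system.
    if pilot_trim_input:
        return 0.0
    # The system reportedly acts only in manual flight, when the angle
    # of attack drifts high enough to risk a stall.
    if flying_manually and aoa_deg > AOA_STALL_RISK_DEG:
        return -NOSE_DOWN_TRIM_STEP
    return 0.0

# Example: a faulty sensor reading above the threshold would keep
# commanding nose-down trim unless the pilot intervenes.
print(augmentation_command(aoa_deg=16.2, flying_manually=True,
                           pilot_trim_input=False))  # -> -0.6
```

The sketch also hints at the failure mode under scrutiny: if the angle-of-attack input itself is wrong, the automation keeps pushing the nose down unless a human notices and overrides it, which is why the DGCA’s earlier instruction to check the aircraft’s sensors is worth recalling.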

Worries about ghost control by bits and bytes go back a long way. Even a decade ago, when an Air France plane crashed into the Atlantic, questions arose over whether its cockpit’s “fly-by-wire” systems were to blame. Relevant in this context is a phenomenon dubbed “the paradox of totally safe systems”: what makes us safer might also make us careless. An academic paper published in June 2017 by three researchers from the University of Edinburgh Business School, Nick Oliver, Thomas Calvard and Kristina Potocnik, argues that the very measures that make a system safe and predictable “may introduce restrictions on cognition, which over time, inhibit or erode the disturbance-handling capability of the actors involved”. Pilots may thus find it difficult to foresee complex interactions and may not know how to deal with computer failures. While software could in theory eliminate accidents caused by human error by taking charge of steering wheels and cockpit controls, we might need to keep good old human cognition in the loop, fully alert, to avoid the pitfalls of automation.