We humans often take disaster warnings far too lightly

Cyclone Tauktae, which caused India’s worst-ever offshore disaster, raises many questions. On 14 May, multiple agencies issued warnings of a severe cyclone. More than 90 vessels were in the areas likely to be impacted. While almost all of them moved to safe locations, four vessels ignored the warnings. Why? The captain of P-305, an accommodation barge with 261 people on board, stayed put in harm’s way, ignoring all the standard operating procedures (SOPs). Why?

A few days before Cyclone Ockhi hit the Indian coast in 2017, the meteorological department had issued warnings. But state authorities did not take these seriously, and more than 200 fishermen died. The Harvard Business Review article ‘Why Good Leaders Make Bad Decisions’ by Andrew Campbell, Jo Whitehead and Sydney Finkelstein examines the peculiar behaviour of Brigadier General Matthew Broderick, chief of the Homeland Security Operations Center, who was responsible for alerting President George W. Bush and other senior government officials if Hurricane Katrina breached New Orleans’ levees. Despite receiving 17 reports of major flooding and levee breaches, he ignored them all and went home, believing that the levees would hold against the hurricane. That night, all hell broke loose in New Orleans. These and other instances where dire warnings of a potential catastrophe were ignored should be a matter of public concern.

If alerts of an impending catastrophe are ignored by ordinary people, one can attribute it to a lack of knowledge. But in all these cases, the potential risk was miscalculated by extremely experienced professionals. The captain of P-305, for example, had more than 30 years of experience. What makes these professionals commit such fatal errors of judgement?

Our brains evaluate what is going on around us through pattern recognition. Patterns are formed from memories of past experiences and judgements. Brigadier General Matthew Broderick, from his experience in the Vietnam War and in handling other hurricanes, had come to believe that it is always better to wait for ‘ground truth’ from a reliable source before acting. Although he had 17 reports of flooding after the hurricane struck, he was also getting contradictory signals; the Army Corps of Engineers, for instance, said there were no reports of breaches. Broderick’s pattern-recognition process told him that these contrary reports were the ground truth he was looking for, and he assumed there was no need to worry. So he got in his car and went home; and the city was battered.

The captain of the barge P-305 would surely have received many reports of the incoming cyclone. The alerts he had encountered over his long career, and how he had reacted to them, must have formed the pattern by which he evaluated the latest warning. But Tauktae was a rare cyclone, of a kind that had not formed over the Arabian Sea in more than 20 years. So any decision based on his past experience of other cyclones would have been inadequate for this unique situation.

It is often wrongly assumed that humans are rational and consider various options before taking a final decision. But, as the research of psychologist Gary Klein shows, our brains are reluctant to consider alternatives. The authors of the HBR article remind us that humans are particularly bad at revisiting their initial assessment of a situation, their initial frame. There are reports that several colleagues did talk to the captain of P-305 about the impending cyclone, but the captain did not change his initial assessment of the situation. This inflexibility of human nature is particularly dangerous when dealing with extreme weather events. Tauktae intensified from a ‘very severe cyclonic storm’ to an ‘extremely severe cyclonic storm’ within a few hours. Even state-of-the-art cyclone models are unable to pick up such rapid intensification in advance. Managing our inflexible human nature while dealing with such dynamic cyclones is quite a task. How do we mitigate this problem?

Today, the output of predictive models is largely in the form of numbers and satellite images. These numbers are inadequate for conveying the requisite ‘feel’ of an oncoming catastrophe. Scenario planning is a tool that vividly lays out various potential outcomes and allows alternative responses to a future event to be evaluated. Combining the output of a predictive model with the rigour of scenario planning could help us better visualize the consequences of a future catastrophe, and so improve how we prepare for and respond to disasters.
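As a rough illustration of that idea, the sketch below uses entirely hypothetical ensemble wind-speed forecasts and approximate IMD intensity thresholds to show how raw model numbers could be grouped into a few named scenarios, each tied to a response, so a decision-maker sees that most forecasts point to an extremely severe storm rather than scanning a table of figures:

```python
# Illustrative sketch only: turning raw predictive-model output (here,
# hypothetical ensemble forecasts of peak wind speed in km/h) into named
# planning scenarios. The forecasts and responses are invented, and the
# intensity thresholds are approximations of IMD categories.

from collections import Counter

ensemble_kmph = [165, 170, 190, 210, 220, 185, 205, 230, 175, 215]  # hypothetical

def to_scenario(wind_kmph: int) -> str:
    """Map a forecast to an intensity category and a planned response."""
    if wind_kmph >= 222:
        return "Super Cyclonic Storm: evacuate all offshore vessels immediately"
    if wind_kmph >= 168:
        return "Extremely Severe Cyclonic Storm: move vessels to safe harbour"
    if wind_kmph >= 118:
        return "Very Severe Cyclonic Storm: suspend operations, prepare to move"
    return "Cyclonic Storm: monitor and review every 6 hours"

# Count how many ensemble members fall into each scenario, so the output
# reads, say, "8/10 forecasts -> extremely severe" instead of raw numbers.
scenario_counts = Counter(to_scenario(w) for w in ensemble_kmph)
for scenario, n in scenario_counts.most_common():
    print(f"{n}/{len(ensemble_kmph)} forecasts -> {scenario}")
```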

Playing devil’s advocate is a tool that has been used since medieval times to bring contrarian views into decision-making. Decisions that address constantly evolving and uncertain situations should not be left to an individual. They are best taken by a team of experts who can bring in data and views that run contrary to initial assessments and debate various options before taking a final call.

In the past, events of very low probability but high consequence were often dismissed as ‘acts of God’. We have since made much progress in developing sophisticated tools to better predict such rare events. But the disaster off the coast of Mumbai is a reminder that we still have a long way to go before such tools are used the way they should be.

Biju Dominic is the chief evangelist, Fractal Analytics, and chairman, FinalMile Consulting
