# Are you sure you want absolute certainty?


Let’s say an insurance company offers a natural disaster policy: losses from fire are covered fully, while losses from flooding and earthquakes are covered partially. The company creates two ad campaigns to promote this offer. In one, the company claims 100% coverage for losses caused by fire, without mentioning the partial coverage for other disasters. In the other, the message highlights the policy’s coverage for a range of natural disasters, including fire, flooding and earthquakes, and provides information on the extent of coverage for each. Which campaign do you think will be more effective? Though the second one highlights broader coverage, it will be viewed less favourably because of the uncertainty that remains with partial coverage across the range of disasters. The first one, on the other hand, provides complete certainty (though it is limited to only one disaster). The certainty of full coverage for fire is viewed much more favourably than full coverage for some natural disasters and not for others.

We love certainty. We go to great lengths to avoid uncertainty. We pick options that have a known, certain outcome over options that may be risky. Research in behavioural economics has convincingly demonstrated people’s dislike for uncertainty. What is really surprising is the value we place on eliminating uncertainty altogether (instead of just reducing it significantly). Consider the following example. Say scientists have finally come up with a preventive treatment for cancer. They have developed four different vaccines that reduce a person’s susceptibility to cancer. Three of the four vaccines reduce cancer risk by 33% each, and the fourth one reduces the risk by 1%. The four vaccines work independently of each other, and their effects are cumulative. How much would you be willing to pay for a vaccine that reduces the risk by 33%? And how much would you pay for the vaccine that reduces the risk by 1%? Based on simple utility theory, a rational person should be willing to pay roughly 33 times more for the more potent vaccine compared with the 1% one. Now imagine that you have received three of the 33% potency vaccines, so you have only a 1% risk of getting cancer. In other words, you are just 1% away from being totally risk free. How much would you be willing to pay now for that 1% vaccine? If you are like most people, you would pay far more than the “utility” price we referred to above.
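To make the arithmetic concrete, here is a minimal sketch of the utility-theory benchmark the example describes. The numbers come from the example above; the price per percentage point of risk is a hypothetical figure for illustration, and the certainty premium is whatever people pay above this benchmark.

```python
# Cumulative risk reduction from the four hypothetical vaccines.
reductions = [33, 33, 33, 1]  # percentage points of cancer risk removed

remaining_risk = 100 - sum(reductions)
print(remaining_risk)  # 0: together, the four vaccines eliminate the risk

# Utility-theory benchmark: willingness to pay scales with risk removed.
price_per_point = 10.0  # assumed price per percentage point (hypothetical)
fair_price_33 = 33 * price_per_point
fair_price_1 = 1 * price_per_point
print(fair_price_33 / fair_price_1)  # 33.0: the potent vaccine is "worth" 33x more

# After three 33% vaccines, only 1% risk remains; the 1% vaccine now buys
# absolute certainty, and most people pay far more than fair_price_1 for it.
```

The ratio of 33 is what simple utility theory predicts; the gap between that and what people actually pay for the final 1% is the premium on absolute certainty.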

Just as we prefer a certain outcome when that outcome is favourable, we become more risk-seeking and value at least a chance of recovery when the outcome is negative. In one famous study, researchers asked a group of individuals to choose the best treatment programme for an outbreak of a disease that would kill 600 people if ignored. Treatment A would save 200 lives, while Treatment B had a one-third chance of saving all 600 people and a two-thirds chance of saving no one; 72% of the group preferred Treatment A. For another group of individuals, the researchers presented the same treatment options differently: if Treatment A is adopted, 400 people will die; if Treatment B is adopted, there is a one-third chance that no one will die and a two-thirds chance that 600 people will die. Although these options are identical to the options presented to the first group, this time only 22% preferred Treatment A. Merely reframing the question to focus on the certainty of saving lives or the certainty of death produced a dramatic shift in choices. This shows that others can manipulate your choices simply by changing the “appearance” of certainty even though there is no real change in the outcome. This is referred to as the pseudo-certainty effect.
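It is easy to check that the two frames describe the same choice in expectation; a quick sketch:

```python
# Expected outcomes under the two treatment frames (600 people at risk).
population = 600

# Frame 1: stated in lives saved.
saved_a = 200
saved_b = (1 / 3) * 600 + (2 / 3) * 0   # expected lives saved under B

# Frame 2: stated in deaths.
deaths_a = 400
deaths_b = (1 / 3) * 0 + (2 / 3) * 600  # expected deaths under B

print(saved_a == saved_b)                # True: A and B both save 200 in expectation
print(population - deaths_a == saved_a)  # True: "400 die" and "200 saved" are the same outcome
```

Since every option saves 200 lives in expectation, the 72%-to-22% swing comes entirely from the framing, not from any difference in the outcomes.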

Because we crave absolute certainty, any action that converts absolute certainty into a mere probability will likely be resisted. For example, if you tell your employees they will get a year-end bonus, and later revise that by saying there is a 90% chance they will get the bonus (notice that you have only changed the probability of the bonus, not its amount), your announcement will come as a big disappointment. The disappointment will be much greater than if you had initially placed the probability of the bonus at 90% and subsequently reduced it to 80%. Though you are reducing the probability by the same 10 percentage points in both cases, the first cut hurts more because it turns a certain bonus into an uncertain one. Be careful any time you make decisions on the basis of absolute certainty: you may be the victim of framing. Check whether you have really considered all the possible outcomes and whether things are as certain as they are being made out to be. When you communicate with others, make sure you don’t position something as certain when it really isn’t. The pseudo-certainty effect suggests that retracting your statement will be a lot harder if you suggested an outcome was certain than if you left yourself open to the possibility of uncertainty. So, next time you pay to reduce uncertainty, think about the premium you place on absolute certainty and whether it is worth the price. By the same token, you can lessen people’s disappointment with your “cutbacks" by avoiding framing the outcome in absolute certainties.
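To see that the two bonus cuts are identical in expected value, here is a quick sketch; the bonus amount is a hypothetical placeholder, and the arithmetic is kept in whole percentage points so it is exact.

```python
bonus = 1000  # hypothetical year-end bonus amount (currency units)

# Cut 1: a certain bonus (100%) becomes a 90% chance.
drop_from_certainty = (100 - 90) * bonus // 100

# Cut 2: a 90% chance becomes an 80% chance.
drop_from_uncertainty = (90 - 80) * bonus // 100

print(drop_from_certainty == drop_from_uncertainty)
# True: both cuts reduce the expected bonus by the same amount,
# yet the first feels far worse because it destroys certainty.
```

Identical expected losses, very different reactions: the entire difference in disappointment comes from crossing the boundary between certain and uncertain.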