When businesses learn how consumers think

Egregious forms of manipulation, exploiting people's behavioural biases to their detriment, are equivalent to lying and deception

When is nudging ethical? Is it acceptable to exploit behavioural biases? Photo: Hemant Mishra/Mint

In recent decades, psychologists and economists have produced a flood of new findings about how human beings think and act. Those findings offer compelling lessons about how to change people’s behaviour. Governments have taken notice—and so has the private sector. There are terrific opportunities here, but also real risks.

Behavioural scientists have established, for example, that people are greatly affected by “default rules”, which establish what happens if they do nothing at all. If employers automatically enrol employees in savings plans (while allowing them to opt out), participation rates will be far higher than if employers ask employees whether they want to opt in.

Behavioural science has also shown that people have a limited attention span, that they dislike losses far more than they like equivalent gains, and that unless information is made salient, they might ignore it, even if it is quite important. In these circumstances, simple reminders significantly increase the likelihood that people will take the necessary medicines, save money, and teach their children to read.

In cases of this kind, behaviourally informed nudges deserve a big round of applause. But other uses of the same techniques, exploiting people’s behavioural biases, should make us a lot more nervous. Consider the report that Uber Technologies Inc. and Lyft Inc., the ride-hailing companies, are aggressively exploring how to use behavioural science to get their drivers to do what they want.

In a clever experiment, Lyft showed one group of inexperienced drivers that they would make a lot more money by moving their work from a slow time, such as Tuesday morning, to a busy time, such as Friday night. It showed another group how much money they would lose by sticking with the slow times.

The result was straight out of Behavioural Science 101: Because people dislike losses, the second approach was more effective in getting drivers to work more during busy times. Intriguingly, Lyft elected not to use that approach, on the ground that it would be too manipulative.

But ride-hailing companies have eagerly enlisted other behaviourally informed approaches. One of the smartest, now used by both Uber and Lyft, is called “forward dispatch”. Before the current ride ends, drivers are automatically assigned a new one. That’s good for passengers, because their waiting time is shorter. It’s terrific for the companies, because it encourages drivers to stay on the road. It’s less clear that it’s good for drivers, for whom it creates a clear default: Keep working.

In a similar vein, many companies use a strategy known as “negative option marketing”, which means that unless consumers actually take action, they will be assumed to continue to want some good or service—and to pay for it. For example, you might choose to subscribe to a magazine for a year, but the subscription is automatically renewed, so you can end up paying for it for a decade or more, even if you don’t like it much.

All this raises big questions: When is nudging ethical? When is it acceptable to take account of, and perhaps to exploit, people’s behavioural biases?

The first way to answer those questions is to ask whether people are being helped or hurt. Many of the best nudges are like GPS devices: They make it easier for people to get where they want to go. If you are reminded that a bill is due, or that you have a doctor’s appointment this week, you aren’t likely to complain.

Most employees are happy to be defaulted into savings programmes, and poor children aren’t objecting to free school meals. But we shouldn’t approve if people are being defaulted into a product or a programme that hurts them, whether it involves extra pounds, impaired health or hidden fees.

It’s also necessary to ask whether behavioural science is being used to manipulate people. Most nudges are anything but manipulative. On the contrary, they are explicitly designed to increase transparency and to boost people’s capacity for agency—as, for example, by informing consumers about late fees, health risks or the energy efficiency of household appliances. Many nudges are widely publicized, as in the case of automatic voter registration.

Uses of behavioural science become far more troublesome when they are hidden from their targets, or when their users are exploiting people’s unconscious biases for their own profit.

If companies default customers into expensive services they don’t need, and use jargon or fine print to obscure what they’re doing, there’s a big problem. The same is true if employers enlist behavioural science to trick employees into working longer hours, without giving them a fair opportunity to make that decision on their own.

We are sorely in need of an ethics of nudging. We should start by insisting that in this era of behavioural science, it is more, not less, important for private institutions to treat people with respect.

Here, as elsewhere, the government’s regulatory role should be cautious. But consumer and worker protection laws already guard against lying and deception. For federal and state regulators, it is time to consider the possibility that egregious forms of manipulation, exploiting people’s behavioural biases to their detriment, belong in the same category. Bloomberg

Cass R. Sunstein is a Bloomberg View columnist.

