Remember that time you checked a consent box in a rush to buy groceries on an app? Or when you signed up to chat with your friends for “free”? Perhaps you hesitated to accept all the terms on a loan application. Consider the number of times you checked that box anyway, because too much hung in the balance if you didn’t—that bag of groceries, the chat with your friends, your new dream house. Did you read the fine print? Perhaps you did.
But think of the person who cannot read or write. That is roughly 300 million people in India. What are their chances of making an informed choice about whether or not to give their data in exchange for a service? The answer is zero—unless they have an educated relative or a friendly bank manager who will sit them down and explain what they are consenting to. Consent today is broken. It is a mere box to be checked. It is designed to be dispensed with rather than taken seriously, and thus safeguards providers more than consumers. Its reach is all-encompassing. You can disagree with it, but if you do, you could be denied service.
Why fix consent when it doesn’t seem to cause any tangible harm? When people don’t seem to care enough about it to read the dense, lengthy forms? The answer is simple. Data is the new currency. As someone famously said, if you’re not paying for the product, you are the product. In a recent study on what data protection and privacy mean to the Indian people, a multidisciplinary team from Dalberg, Dvara Research and the Consultative Group to Assist the Poor used human-centred design methods in four regions of India to probe deeper into the notion of consent.
Participants included illiterate farmers in rural Uttarakhand, social media users in the slums of Mumbai, housewives in rural Tamil Nadu and migrant workers in Delhi, amongst others, all earning less than $10 a day. Through more than 50 in-depth interviews of 2-3 hours each, conducted in people’s homes, shops and farms, and using design research tools such as scenario cards, ecosystem trust maps and mock consent forms, the team uncovered not just what people say about privacy, but how they think and actually act in different data-giving situations.
The first discovery was that people, even poor people who could not read or write, cared deeply about consent once they were made aware of what they were agreeing to. They did not wish to be unknowing, unwilling subjects in today’s world, and wanted a fair chance to assess the trade-offs in a given situation. They wanted to know what their data was worth, who was using it and for what purposes, how it was stored and shared, and where it went to die.
The study also tested different forms of consent with a range of people across age groups, literacy levels and degrees of tech-savviness. This led to a second learning: people unanimously preferred verbal, pictorial or video consent over pages of unreadable fine print. Simple visuals that communicated at a glance what data rights one was signing away made people more aware and confident about giving the necessary data. When people knew what they were consenting to, it built trust with providers. Equally, providers that obfuscated were viewed in a negative light. Making consent understandable at a glance is one way of giving people control over their data in a world where they can often feel weak and disempowered in the face of large corporations or the government.
This also means consent needs to shift away from being binary: even if you do not agree to certain terms, you should still be able to get a minimum product or service. We prototyped “bite-sized” consent with people to see how they responded to it. To our surprise, people preferred getting a partial product or service in return for partial data. This means, for instance, that if one was not agreeable to sharing one’s location data, one might forego certain aspects of personalization that a service may offer. In other instances, one might pay a premium so that one’s data is not shared with third parties or advertisers. Currently, however, only a few providers offer this choice.
More broadly, consumers did not seem to understand why certain types of data were being demanded of them, what providers did with them, what their rights were in case of a data breach (and even whether such rights exist and are enforceable) and, lastly, who bears liability in case of tangible harm caused to them. Take, for example, the case of phone impersonators, one of the most common instances of financial fraud in India today. It was so pervasive that people believed fraud to be inevitable. Assuming you were one of the unsuspecting people who believed a person posing as a bank official, should you be made liable for your loss? Current liability falls mostly on the consumer, even when they have not knowingly consented to their phone number being passed on to third parties with porous security systems. Worse, the same people often held themselves responsible and had little knowledge of, or means for, seeking redressal.
The good news for data regulators and service providers is that people are willing to part with certain types of data, given a clear guarantee that the data they share will not be misused or cause them any harm (including mis-selling, blackmail, identity theft, financial loss, etc.).
The recent white paper by the Srikrishna Committee on data protection and privacy provides hope that India can lead the way in getting this right. But first, we must fix consent.
Priti Rao and Varad Pande are with Dalberg.