India’s data protection law may not be enough to secure people’s privacy

People rarely understand the consequences of what they have agreed to or the harms that could result from these decisions.

Summary

  • Data protection laws that assume people know what they are doing when they consent to the use of their data have a fatal flaw. To fix it, data collectors must be held responsible for the privacy harms they cause, whether or not they obtained consent.

Economists assume that individuals act rationally, always responding to incentives in their own self-interest. Many economic models further assume that they have complete information and can perfectly weigh costs against benefits.

French economist Jean-Baptiste Say called this axiomatic stereotype of the abstract individual who exercises rational choice Homo economicus.

But, as Daniel Kahneman, Amos Tversky and Richard Thaler demonstrated, humans are not always rational in the way those economic models assume.

They are driven by emotions, guided by intuition and influenced by bias, which is why, more often than not, they do not act in real life the way economists predict in their models.

I’ve recently come to realize that privacy law has a stereotype of its own. One that is, in much the same way as Homo economicus, also fatally flawed.

Data protection laws rely on consent. Data fiduciaries are required to process personal data only if they have been permitted to do so by the data principal, who has the autonomy to decide for herself just what can and cannot be done with her personal data. This approach is borrowed from contract law and therefore proceeds on the assumption that consent is provided freely.


This suggests that data protection law has a stereotype of its own—Homo privaticus if you will—for the ideal data principal who is fully capable of privacy self-management.

But the reality is somewhat different. More often than not, consent is sought from us in binary terms—in the form of take-it-or-leave-it privacy notices that we have no ability to negotiate. 

Refusing to provide consent is not really an option, considering that participation in modern society has come to depend on the online services that our consent unlocks.

Even when we do have the ability to choose what can or cannot be done with our data, we lack the information we need to make those choices in an informed way.

No one can evaluate all possible implications of providing consent in the manner sought, or the impact that it would eventually have on our privacy.

Even if the data fiduciary has listed the full details of all it intends to do with our data in its stated privacy policy, these details are often phrased in ways that render the permissions being sought unknowable—even for the most technically savvy among us. 

This gets further complicated when personal data is transferred to other entities, as the risks we then have to evaluate are dependent on the future actions of yet-unknown persons.

The decisions we make are also far from rational. We are, more often than we realize, manipulated by dark patterns—design strategies that rely on human psychology to trick us into doing things we did not fully intend.


These techniques purposely obfuscate the options available to us, leading us to forgo the protections that could have guarded our privacy. This is what Richard Thaler refers to as ‘sludge’—choice architecture that discourages us from acting in our best interests (as opposed to the more beneficial nudges that his Nobel Prize-winning work is all about).

That is why data protection laws framed using concepts borrowed from the traditional law of contract, relying on the assumption that consent is given freely and rationally with full knowledge of all facts necessary to make informed choices, could let us down in the real world of online engagement. 

People rarely understand the consequences of what they have agreed to or the harms that could result from these decisions. The consent they give is not consent at all.

India’s new data protection law also relies on the same consent framework. Data fiduciaries must obtain the consent of Indian data principals before they can process their personal data, and must provide users with the tools they need to manage their own privacy.

Which means that India has also proceeded on the assumption that Homo privaticus accurately represents the population of internet users in the country, even though this has not proven true in any other part of the world.

We need to re-imagine our approach to the protection of personal data. In 2017, I wrote a paper titled ‘Beyond Consent: A New Paradigm for Data Protection,’ in which I argued in favour of a brand new approach, one that held data fiduciaries responsible for the privacy harms they caused, regardless of whether or not they had obtained the consent of the data principal for their actions.

Ignacio Cofone makes the case for a similar approach in his recent book, The Privacy Fallacy: Harm and Power in the Information Economy. He points out that instead of holding data fiduciaries liable based on principles derived from contract law, we should use tort law to regulate privacy.


Rather than holding data fiduciaries liable for any breach of contractual provisions, we should make them accountable for the data harms they cause.

While it might seem that India is already too committed to the consent-centric approach for this to make a difference, it will be up to the (still to be established) Data Protection Board to determine just how the principles set out in the law are to be interpreted.

If we can use tort law, as Cofone suggests, to complement the statutory liability that has been set out, we may at least partly be able to strengthen data protection in a manner that serves people’s privacy needs better.
