A new paradigm for privacy
If we discard consent, we can build a privacy framework that can serve as a model for the rest of the world
Last week, the government made it mandatory to link Aadhaar numbers to tax returns and set itself a target of one year within which it would link all mobile numbers to the Aadhaar database. While the Supreme Court agreed to refer these issues to a larger bench, it seemed happy to let the government continue to incorporate Aadhaar into all aspects of our lives. So much has been said about these decisions that I don’t want to add my voice to the chorus—except to say that they bring into sharp focus the lack of a privacy law in the country.
Perhaps in anticipation of these events, a number of academic papers have been published recently, arguing the need for privacy legislation. They have broadly suggested the enactment of a law along the lines of the OECD (Organisation for Economic Co-operation and Development) data protection principles articulated in the 1980s—that personal data is the property of the data subject and cannot be used without his consent.
Most privacy laws have been built on this model, and if we go down this path, our law will be consistent with global practice. However, if we make consent the cornerstone of our privacy jurisprudence, we will have taken a conscious decision to place upon the data subject the burden of determining whether or not the use of personal data for a particular purpose is in his interest. In our present data-intensive world, this is a question the data subject is ill-equipped to answer.
Today, data is collected, processed and transferred in more ways than can be comprehensively enumerated. Our online activity is logged; our financial transactions are tracked and correlated against location, age and time of day; and our physical activity is measured using wearables and other smart devices. All this data is stored in the cloud and is easily accessible through application programming interfaces (APIs) for further processing. Databases are designed to interconnect with each other and use deep learning algorithms to find patterns in ways that even the best data scientists struggle to understand. Providing meaningful informed consent under these circumstances is impossible.
The one entity in the data processing workflow that might have visibility into the possible outcomes of data processing is the organization collecting the data—the data controller. It knows what the data will be used for, as well as the algorithms through which it will be processed. It is best equipped to assess the possible consequences—both intended and unintended—of its use. More importantly, it has the ability to consciously determine the outcome of the data processing. It makes more sense to hold the data controller accountable for ensuring that no harm befalls the data subject than to use the poorly informed consent provided by the data subject as a licence to process.
The trouble with this model is that the data controller’s interests are not always aligned with the data subject’s. There will be instances where safeguarding the privacy of a data subject runs contrary to the commercial interests of the data controller. Thankfully, there are legal constructs designed to address exactly this sort of misalignment. Directors have a fiduciary obligation towards their company that must override any allegiance they owe to individual shareholders. Company law requires directors to fulfil their fiduciary obligation even if doing so is contrary to the interests of the shareholder who appointed them. Surely we can impose a similar fiduciary responsibility on data controllers.
There could be other situations where the commercial interests of the data controller run contrary to those of the data subject. Take, for example, the use of financial information to assess creditworthiness. If the data controller is required to focus solely on promoting the interest of the data subject, it will only consider information that establishes a favourable credit rating. Doing so would run contrary to the commercial interests of the data controller, whose business depends on lending only to those borrowers who can repay. In such circumstances, the fiduciary responsibility of the data controller should extend to ensuring that the data in its possession is processed in a fair and non-discriminatory manner, and that it does not use extraneous facts in its possession to unfairly discriminate against the data subject.
In a way, it is a blessing that India took its time to enact a data protection law. Without the baggage of a consent-based privacy jurisprudence, we have the freedom to enact a law that is appropriate to our data-intensive world. While the rest of the world struggles to redesign laws built on a data protection model conceived in the 1980s—when data volumes were a mere trickle compared to today—India has the opportunity to build, from scratch, a forward-thinking privacy framework that addresses the current reality and can serve as a model for the rest of the world.
Let’s hope we seize the opportunity.
Rahul Matthan is a partner at Trilegal. Ex Machina is a column on technology, law and everything in between.
His Twitter handle is @matthan.