Last week, the government released a set of draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Once these are enacted, digital intermediaries will have to ensure that the community standards to which they hold their users answerable comply with Indian law and India’s constitutional principles. This, the government clarified, has become necessary because a number of intermediaries have taken it upon themselves to act in violation of the rights of Indian citizens.
The new amendments also propose the constitution of a Grievance Appellate Committee that will be tasked with dealing with “problematic content” in an expeditious manner. Users unsatisfied with how their complaint to an intermediary has been handled will be able to appeal the decision to this body and have it resolved within 30 days.
The proposal has been met with varying degrees of consternation. Editorial pieces, including in this newspaper, have called this yet another attempt by the government to either curtail or interfere with free speech. The remark that some intermediaries have acted in violation of citizens’ rights is being read as a snide reference to instances when intermediaries refused to take down content that did not violate their community guidelines, despite pressure from the Centre to do so. Hence, what the government sees as an escalation mechanism to provide redress to users against unfair decisions of the social media platforms they subscribe to, many members of civil society view as just another tool of government censorship.
Both sides are right. And a wee bit wrong.
Content moderation is a wicked problem. It calls for every piece of suspicious content to be evaluated against a number of different legal standards: authorship, the harm it might cause to a person’s reputation, the legality of that content given the age of its intended audience, and so on. As much as social media platforms are designed to enable free speech, they must also eliminate, or at least mitigate, the harms that unfettered speech can cause. They need to strike a balance between the rights of those who post and the rights of those whom their posts offend.
Content moderators often have complex decisions to make. They have to decide what stays up and what must be taken down. Where to draw the line between speech that is acceptable and speech that is not? Most often, the issues are so clear-cut that even lightly trained content moderators can get it right. Every now and then, however, even the most experienced among their ranks will get it wrong.
Some of the decisions they have to take in the course of a day’s work are so gnarly that even the finest legal minds would be left scratching their heads. Are disparaging remarks posted about an individual defamatory, as alleged? Do they malign his character with falsehood, or are they in fact based on truth? Is a re-mix of an existing song original enough to qualify as a new work, or does it need a licence from the copyright holder before it is posted? Is a statement expressing angst over a decision just normal human frustration, or an attempt to foment a violent agitation?
Civil society is concerned that if the proposed Grievance Appellate Committee allows the government to have the last word on questions such as this last one, it will exercise this power to quell dissent. As much as I share this concern, I have similar reservations about leaving all such decisions entirely in the hands of private enterprise.
After all, not all appeals to the Grievance Appellate Committee will be about government take-downs. Some will address illegal content, such as violations of copyright. What if an artist’s original composition is marked for take-down over some imagined infringement, and the decision is not reversed even on appeal? For most up-and-coming artists, the only path to commercial success lies in being able to impress the audiences that these social networks offer. If they are forced, without recourse, to take their content down, that could be the end of their careers.
That said, it is impossible to ignore the concern that civil society raises. Should an appeal be made by a government agency whose take-down notice has been rebuffed, is it not likely that a government-appointed appellate committee will rule in favour of its own agency? How do we mitigate the likelihood of such an eventuality and still preserve the right to appeal?
One solution could be for the industry to establish a self-regulatory appellate body to which appeals from all content moderation decisions can be referred. It could be staffed with a cross-section of experts from industry and the law, so that its decisions are sufficiently robust, informed by industry context as well as applicable laws and judicial precedents.
Ideally, this body should operate as an appellate forum for all content moderation decisions, regardless of the platform from which the appeal originates. This will keep it beyond the power hierarchy of the platforms themselves, offering the process a measure of independence that is absent in internal grievance redressal systems. Since it will not be operated by the government, it will, hopefully, have the neutrality required to remain impartial while deciding on take-down notices issued by the government.
The government has already indicated that it is open to considering self-regulatory alternatives. Now it is for the industry to get the design right.
Rahul Matthan is a partner at Trilegal and also has a podcast by the name Ex Machina. His Twitter handle is @matthan