In 1995, when the World Wide Web was still in its infancy, the pioneers of that nascent industry were seen as no different from publishers. The websites they ran were treated like magazines to which writers could contribute articles. And just as magazine publishers could be sued for what their authors wrote, websites had to be accountable for what users posted.
A series of decisions in US courts sharply underscored the exposure of this brand-new industry to third-party content. In Cubby vs. CompuServe, the court laid down a two-step test for holding online services liable for third-party content. It said that a website would only be immune from liability if it had: (i) no editorial control over the content; and (ii) no reason to know that the content was objectionable. Stratton Oakmont vs. Prodigy extended this by declaring that any website that moderated user-generated content would not be entitled to immunity—a ruling which, somewhat perversely, punished companies that were taking the trouble to remove inappropriate content from the web.
Realizing the absurdity of this outcome, US Senators Cox and Wyden set out to enact a law that would restore much-needed protection for internet companies. They inserted into the Communications Decency Act a new Section 230 that stated: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Since this was a small part of a far more controversial piece of legislation, it flew completely under the radar, with most of the legislators who approved it likely unaware of the profound impact it would have on how the internet would be governed.
In his book The Twenty-Six Words That Created the Internet, Jeff Kosseff argues that it is precisely because of the freedom of speech guaranteed by these 26 words that the internet, as we know it today, has come into existence. US courts have consistently interpreted Section 230 in a way that encourages businesses to be unafraid of third-party content, allowing them to innovate fearlessly. This is why so many successful internet businesses are based in the US, where they can flourish without having to constantly look over their shoulder.
The US Supreme Court recently heard oral arguments in Gonzalez vs. Google, a case brought against YouTube for not only making ISIS videos available on the website, but actively disseminating them through its recommendation algorithms—placing paid advertisements in proximity to ISIS-created content and allegedly sharing this revenue with ISIS.
In order to win, the petitioners will have to argue that YouTube is not entitled to immunity under Section 230. To do that, they are looking to distinguish between “recommended content” on one hand and the “recommendations” that YouTube makes on the other. Since recommended content is uploaded by the user, it would fall within the definition of “information provided by another information content provider”, which is entitled to protection under Section 230. However, the algorithmic recommendations that YouTube makes, the petitioners argue, are akin to placing a message in big bold letters next to a video stating, “You should watch this.” Since the message is not “information provided by another information content provider”, it cannot be entitled to the Section 230 exemption.
This case has become something of a lightning rod among conservative politicians, most of whom believe that tech companies have grown too powerful. They have complained about the censorship carried out by these entities, arguing that their moderation decisions have more to do with political ideology than freedom of speech. To them, this case is an opportunity to bring Big Tech companies under control, and, given that at least one of the justices on the conservative-majority court has publicly remarked on questionable precedents established over the years in relation to Section 230, they are hopeful of a favourable outcome.
If the plaintiffs succeed in convincing the court to apply a narrower interpretation of Section 230 than is currently used, the legal exposure of large tech platforms will increase so significantly that they will be forced to radically change the ways in which they operate. Online intermediaries will be forced to filter all the speech that appears on their sites—to the point where some may choose not to host user-generated content at all.
There is no doubt that this will change the way in which content moderation takes place on the internet. In many other countries, internet businesses are already required to expeditiously remove content that is allegedly defamatory or illegal. In some instances, they have to proactively screen posts to ensure that harmful third-party content does not even appear online. So far, the approach that these countries have taken to content moderation has been looked down upon as inappropriate for the internet age. If, however, this case ends up being decided in favour of the petitioners, the US could go down a similar path.
Among the many amicus briefs that have been filed before the court, there is a bipartisan one by Cox and Wyden, the original authors of the American law, urging the court to retain the protections guaranteed under the law they drafted.
The US Supreme Court is expected to issue its opinion this summer, and no matter what it rules, it will be significant. And the whole world will be watching.
Rahul Matthan is a partner at Trilegal and also has a podcast by the name Ex Machina. His Twitter handle is @matthan.