‘Content regulation lapses cast doubts on Facebook’s biz model’

Social networking platform Facebook is facing several challenges across countries. In India, it has been in the eye of the storm for alleged “right wing" bias, and its inability to take down hate content. Amber Sinha, executive director of Bengaluru-based think tank Centre for Internet and Society, talks about the shortcomings of the Facebook business model and the need to address them. Excerpts from an email interview:

Is Facebook facing a public relations crisis, considering that it has failed to manage its image in most markets?

There is no doubt that the last few years have been an ongoing public relations nightmare for not only Facebook, but all Big Tech companies. However, the problem is not one of image. Repeated data and content regulation lapses on Facebook’s part have emerged, and these have rightly raised severe questions about their business model, and led to more attention, scrutiny and questioning from several governments across the world, including India.

After the revelation about Cambridge Analytica’s use of Facebook to profile and manipulate users with political content emerged, the Indian government has been engaged in a series of ad hoc communications with Facebook and other social media companies. In July 2018, information technology minister Ravi Shankar Prasad had warned that social media platforms could not “evade their responsibility, accountability and larger commitment to ensure that their platforms were not misused on a large scale to spread incorrect facts projected as news and designed to instigate people to commit crime".

There continues to be little clarity on how social media companies should be regulated, but there is growing consensus that they operate in a relative regulatory vacuum.

Why is it important to fill this regulatory vacuum?

The platformization of the web has allowed the creation of business models, wherein several technology companies have started intermediating our access to many services and products. These platforms are structured in a way that traditional sectoral regulations often do not apply to them. Further, because these business models provide “free" services dependent on advertising revenue based on monetization of data, they also often fall outside the purview of consumer protection laws.

In several cases, such as Facebook serving as the primary platform for news consumption for many people, traditional regulations may also not fully make sense. However, instances such as these underscore the need for formulating appropriate regulation.

What explains the Facebook controversies? Where does the problem lie?

The backlash Facebook is facing right now about its regulation of hate speech arises from flaws in its own policies and processes. The first issue is of inconsistent application of Facebook’s community guidelines. These guidelines are a set of general rules, applicable globally, often with limited guidance about local contexts. This is not the first time that patterns of inconsistent application have been brought to attention.

For instance, Sarah Koslov from Georgetown University Law Center has pointed out how in some cases in Israel and Palestine, Facebook’s rules “favour elites and governments over grassroots activists and racial minorities". The second issue is that of structural problems within the organization, which have led to serious conflicts of interest.

What kind of structural issues?

The structural issues seem to arise from a lack of full appreciation of the various aspects of public policy that a company like Facebook intersects with through its platform and engagements. The platform’s legal and ethical responsibilities with respect to content regulation require it to clearly uphold internationally recognized principles and domestic laws on free speech, hate speech, thresholds for takedown, etc.

Similarly, on data protection and privacy, the platform is expected to discharge its legal obligations as per local law and, ideally, set standards in accordance with the most robust data privacy obligations globally.

On the other hand, the public policy teams at Big Tech companies like Facebook routinely need to engage with government stakeholders. It is to be expected that the objectives and interests of the content regulation teams can collide with those who work on government affairs.

There must be a very clear segregation between the two, and those working in government affairs should not be in a position to influence how hate speech or other content is regulated on the platform.

Does Facebook have an identity crisis, as it pretends to be a news platform to attract advertisers, but is free of rules that govern traditional publishers?

This situation is less of an identity crisis and more one created by regulatory oversight lagging far behind the emerging, and in this case somewhat established, business models of social media platforms. So far, platforms like Facebook have enjoyed safe harbour protections. These protections are based on the idea that intermediaries, Facebook in this case, are mere dumb conduits for the distribution of their users’ speech, and are not speakers themselves.

This argument no longer holds water. Using highly-intelligent prioritization algorithms, most, if not all, social media platforms affirmatively shape the form and substance of content posted by users in some manner.

Facebook employs techniques to ensure that each user sees stories and updates in their “News Feed" that they may not have seen on their last visit to the site. It analyses, sorts and reuses user data to make meaning out of users’ “reactions", search terms and browsing activity in order to curate the content of each user’s individual feed, personalized advertisements and recommendations.

That said, it would be a stretch to equate social networking platforms with publishing and media companies. While platforms like Facebook do significantly mediate our consumption of content, and play a role far beyond mere transmission, they do not control the content that is posted.

How do you resolve a problem such as the one with Facebook? What laws are required to regulate social media platforms?

It is important to recognise the nature of different roles that platforms like Facebook play, and shape regulation based on that. Currently, the regulation paints all internet intermediaries with the same brush.

There is a definite need for differentiating between infrastructure information intermediaries (such as ISPs) and content information intermediaries that facilitate communication (such as social media networks). It might be possible to create content-neutral standards for infrastructure information intermediaries that do not primarily focus on content transmission.

For example, a set of content-neutral standards (like common carrier regulations) could apply to infrastructure intermediaries, while separate standards that are not content-neutral would apply to content intermediaries. Given their control and power over our user experience online, intermediaries do owe us a duty of care.

The right to free speech and protection of equality and dignity are recognized by most constitutions around the world. We need a shift from content regulation and safe harbour to meta regulation, under which companies which determine and dictate online moderation are held accountable to a set of principles in line with international standards of freedom of opinion and expression.

Centre for Internet & Society has been a recipient of research grants from Facebook since 2017
