A few weeks ago, I wrote in this column that two researchers from Facebook Inc. had published a startling research note on the company’s corporate blog in mid-December. The note was titled “Hard Questions: Is Spending Time on Social Media Bad for Us?” The note meandered along, but it contained an extraordinary admission of partial culpability from what is arguably the world’s leading social media company.

Facebook has maintained for years that it is a ‘force for good’ in this world, so the partial admission of guilt in the blog was startling. The blog post tried to present a ‘balanced’ view, citing research on both sides: one set claiming that social media is bad for us and the other stating that social media can actually be a force for good. The post concluded that whether social media is good or bad for us comes down to how an individual uses it, and that mere passive consumption of the platform’s “News Feed”, rather than active interaction with friends and family, might actually worsen a person’s mental health.

The News Feed is the landing page that users find themselves on when they connect to Facebook. Facebook has in recent years been criticized for being a vector in the viral spread of “false news”, or at least of news that it wasn’t spending enough time vetting for accuracy. It has plugged on regardless, while making some attempts in the meantime to ‘flag’ fake news as inappropriate.

Facebook has been managing the fallout from revelations last year that the Russians had tried to influence the 2016 US presidential election by buying ads and planting and promoting false news stories. After initially dismissing those concerns immediately after the election, Mark Zuckerberg, co-founder and boss of the firm, later admitted that he was mistaken. Facebook eventually said that as many as 120 million people had viewed election-oriented ads that Russians had bought on the platform.

On 20 December, Tessa Lyons, one of the firm’s product managers, admitted in a blog post that using “disputed” flags against news items to identify false news was in fact leading to unintended consequences. She stated that academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs, the opposite effect to what is intended. Though Lyons said that the firm would use alternative methods to classify such news, such as pointing users toward reliable publications that carry a view with more veracity, this comes across as an admission that Facebook isn’t really capable of vetting news, and that it’s a fool’s errand for it to even try to do so.

Zuckerberg began 2018 by sharing with his audience the personal challenges he hoped to conquer during the year. The chief one he mentioned in his 4 January Facebook post was making the site a force for good. “The world feels anxious and divided,” Zuckerberg wrote, “and Facebook has a lot of work to do—whether it’s protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent.”

In a follow-up to show that Zuckerberg’s actions match his words, Facebook last week announced that it will make sweeping changes to how the “News Feed” is presented to its approximately 2 billion users worldwide. Facebook says it will prioritize what users’ friends and family share and comment on while de-emphasizing content from publishers and brands.

On the face of it, this is a fundamental change for the company. The dissemination of news on Facebook, and the gargantuan advertising revenue it generates, is big business. Because the company earns significant cash from these operations, this departure from standard practice may hinder organizations looking to capture users’ attention through advertisements on the platform. As a result, Facebook’s shares fell by almost 5% on Friday, 12 January.

The changes that Facebook is making still raise the question of whether people may end up seeing more content that is congruent with their own beliefs, if they frequently interact with posts from friends or family who share the same views. And false news may still spread—for instance, if a relative or friend posts a link to a false article that attracts many comments, the post will still be prominently displayed on a user’s News Feed.

It also remains to be seen how much will actually be done. This is not the first time Zuckerberg has said the company would change the News Feed algorithm to move away from news. He said as much in June 2016, before the US election, but it is unclear what actually changed, especially when we take the admission about the consumption of Russian-bought ads into account.

As W.H. Auden said, “In a crowd, passion like rage or terror is highly contagious; each member of a crowd excites all the others, so that passion increases at a geometric rate. But among members of the public, there is no contact. If two members of the public meet and speak to each other, the function of their words is not to convey meaning or arouse passion but to conceal by noise the silence and solitude of the void in which the public exists.”

“Occasionally, the public embodies itself in a crowd and so becomes visible—in the crowd, for example, which collects to watch the wrecking gang demolish the old family mansion, fascinated by yet another proof that physical force is the Prince of this world against whom no love of the heart shall prevail.”

Maybe Facebook should get out of the news business entirely, considering the network effect it exerts on human psychology.

Siddharth Pai is a world-renowned technology consultant who has personally led over $20 billion in complex, first-of-a-kind outsourcing transactions.
