Facebook’s experiment with emotional contagion

Facebook's 'behavioural' experiment may have caused major outrage, but it could shape the News Feed of the future

Facebook is in the news again. No, not for another acquisition or app launch, but for the “behavioural experiment” the social network giant embarked on some time back. “This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated,” Facebook’s chief operating officer Sheryl Sandberg said.

Users are angry, and the authorities in Britain, Ireland and France are investigating the legality of the experiment.

The experiment

For a period of time in early 2012, Facebook’s data scientists interfered with the news feeds of more than 689,000 unsuspecting users. This was part of a study, Experimental Evidence Of Massive-scale Emotional Contagion Through Social Networks, by Facebook, Cornell University and the University of California in the US—the idea was to conduct an “experiment on Facebook to show that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness”. Some of these users were shown more negative content than usual, while the news feeds of the rest were weighted towards positive updates. Facebook wanted to analyse behavioural changes through the posts users put up—did those positive or negative feeds alter their mindset, at least while communicating on Facebook?
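In outline, the manipulation described above amounts to a feed filter that withholds a fraction of posts containing words from one emotional category, depending on the group a user falls into. The short Python sketch below illustrates that idea only; the word lists, function names and omission probability are assumptions made for this example, not details from the study, which reportedly relied on much larger sentiment dictionaries.

# Illustrative sketch (not Facebook's code) of the manipulation the article
# describes: posts containing words from one emotional category are randomly
# omitted from a user's feed, depending on the user's experimental group.
# The word lists, function names and omission probability are assumptions.
import random

POSITIVE_WORDS = {"happy", "great", "love", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible"}

def contains_any(text, words):
    # crude tokenization, for illustration only
    tokens = {t.strip(".,!?") for t in text.lower().split()}
    return bool(tokens & words)

def filter_feed(posts, condition, omit_probability=0.5):
    """Return a feed with some emotional posts dropped.

    condition is "reduce_positive" or "reduce_negative"; omit_probability is
    the chance an emotional post is withheld for this viewing.
    """
    suppressed = POSITIVE_WORDS if condition == "reduce_positive" else NEGATIVE_WORDS
    return [p for p in posts if not (contains_any(p, suppressed)
                                     and random.random() < omit_probability)]

# A user in the "reduce_positive" group sees fewer upbeat posts.
sample = ["Feeling great about the weekend!", "Traffic was awful today.", "Lunch at noon?"]
print(filter_feed(sample, "reduce_positive"))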

Users weren’t told

Whenever a situation like this arises, the organization at the receiving end usually pulls out its Terms and Conditions (T&C) book and points to some clause written in complicated language, which most users may not have bothered to read. In this case, even that hasn’t worked out well for Facebook. Reports suggest that the word “research” was added to the T&C four months after the experiment began, to justify any such activities using user data: “For internal operations, including troubleshooting, data analysis, testing, research and service improvement”.

None of the users were asked whether they wished to participate in what is essentially a psychological study.

Did they really apologize?

We already know that Sandberg says the study was “poorly communicated”. This seems to be the kind of apology that doesn’t really apologize for the experiment itself, but for the fact that users were annoyed by the manipulation. At least Facebook has been forthcoming about the entire episode.

Why did this experiment happen in the first place?

In a public Facebook post on 29 June, this is what Facebook data researcher Adam Kramer had to say: “And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it—the result was that people produced an average of one fewer emotional word, per thousand words, over the following week. The goal of all of our research at Facebook is to learn how to provide a better service. Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it.... In hindsight, the research benefits of the paper may not have justified all of this anxiety."

In the Cornell Chronicle, published on 10 June, Cornell University had already outlined its vision on the research: “Facebook, with more than 1.3 billion users of every emotive disposition, and its news feed feature—in which a constantly tweaked, Facebook-controlled ranking algorithm regularly filters posts, stories and activities enjoyed by friends—proved an ideal place to start." It went on to clarify that none of the researchers working on this experiment ever saw the actual content on any of the posts—they monitored positive and negative keywords and their occurrence. Some numbers have been given for this—more than three million posts were analysed, with a total of 122 million words. About four million of those words were “positive" and 1.8 million were “negative".
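The counting described in the Cornell statement, tallying positive and negative keywords without ever retaining the post text, can be illustrated with a similar sketch. Again, the word lists and function names are placeholders invented for this example, not the study’s actual dictionaries; the final lines convert the tallies to words per thousand, the unit Kramer quotes.

# Illustrative sketch of keyword tallying as the article describes it: only
# aggregate counts are kept, never the post text. Word lists are placeholders.
import string
from collections import Counter

POSITIVE_WORDS = {"happy", "great", "love", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible"}

def tally_emotion_words(posts):
    counts = Counter()
    for post in posts:
        tokens = [t.strip(string.punctuation) for t in post.lower().split()]
        counts["total"] += len(tokens)
        counts["positive"] += sum(t in POSITIVE_WORDS for t in tokens)
        counts["negative"] += sum(t in NEGATIVE_WORDS for t in tokens)
    return counts  # the posts themselves are discarded

counts = tally_emotion_words(["Feeling great about this!", "Traffic was awful today."])
# Rates per thousand words, the unit Kramer uses ("one fewer emotional word per thousand").
for label in ("positive", "negative"):
    print(label, round(1000 * counts[label] / counts["total"], 1))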

Is the US government spying on people?

Well, there seems to be some confusion. The same Chronicle article says, “An earlier version of this story reported that the study was funded in part by the James S McDonnell Foundation and the Army Research Office. In fact, the study received no external funding."

But government involvement may not be as far-fetched as it seems to most readers. It has been documented that the US government ran a “Cuban Twitter” service between 2009 and 2012, with the aim of spreading political messages in a country where Web access for the general public was limited. Classified documents on US spying programmes leaked by Edward Snowden have shown how governments are “using online techniques to make something happen in the real or cyber world”.

The best response

While people were voicing their displeasure in different ways, Erin Kissane (@akissane), the director of content at OpenNews, tweeted: “Get off Facebook. Get your family off Facebook. If you work there, quit. They’re f****ng awful.”

Published: 04 Jul 2014, 12:57 PM IST