Fake news, Donald Trump and the pressure on Facebook
As content of dubious authenticity swirls on platforms like Facebook, Twitter and Google, many in the media worry consumers may lose trust in stories that are actually true. Maybe most uncomfortable are the social media companies, especially Facebook. They make millions in ad revenue by distributing information, but the last thing they want is the responsibility that comes with being a publisher, like ensuring stories are accurate.
1. Why is fake news in the news?
Some Hillary Clinton supporters say a flood of fake items may have swayed the results of the US election in favour of president-elect Donald Trump. They’re not alone. The “impresario of a Facebook fake-news empire”, Paul Horner, told the Washington Post, “I think Trump is in the White House because of me”. BuzzFeed found that of the 20 fake election stories that were most shared, commented-on and reacted-to on Facebook, 17 were pro-Trump or anti-Clinton.
2. What were some of the biggest fake election stories?
That Pope Francis endorsed Trump. That an FBI agent suspected of leaking Clinton’s e-mails was found dead. That a protester admitted being paid $3,500 to disrupt a Trump rally. That Trump once called Republicans “the dumbest group of voters in the country”.
3. Did it really influence the election’s outcome?
It’s hard to say. What we do know is that this is the first election in which the majority of US adults got their news from social media. And for those getting it on Facebook, that news came in a highly personalized, filtered fashion, serving up what they would want to see.
4. Who’s producing fake news?
Fake news can come from many sources. Some are in it for the money, like teenagers in Macedonia pumping out pro-Trump articles or a pair of 20-something friends in California who call themselves “the new yellow journalists”. Other sources of misleading information are trying to push an agenda. And sometimes the machine is fed by plain old mistakes.
5. How does it disseminate so quickly?
What do you click on? For most of us, it’s stuff that sparks some emotion: surprise, sadness, anger or confusion. That’s what people share on social media, too. Facebook’s algorithm amplifies this content by promoting posts that trigger that kind of attention. Next, advertisers pay for slots next to these stories because they want to be where the eyeballs are. Finally, the flat landscape of social media wipes out many of the filters we once used to judge content. At a newsstand, there’s a clear difference between the Washington Post and the National Enquirer. But their Facebook posts can look similar in your timeline.
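The amplification mechanism described above can be sketched in miniature. This is a hypothetical illustration, not Facebook's actual ranking system: the `Post` fields, the weights, and the `engagement_score` formula are all invented for the example. The point it shows is that a feed ranked purely on engagement will surface the most provocative item first, regardless of accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    shares: int
    comments: int
    reactions: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and comments are treated as
    # stronger engagement signals than passive reactions.
    return 3 * post.shares + 2 * post.comments + 1 * post.reactions

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most-engaging posts first; nothing here checks accuracy.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Sober policy analysis", shares=10, comments=5, reactions=40),
    Post("Outrage-bait hoax", shares=90, comments=60, reactions=200),
])
print([p.title for p in feed])
# → ['Outrage-bait hoax', 'Sober policy analysis']
```

In this toy model the hoax wins the top slot simply because it generates more clicks, shares and comments, which is the dynamic the passage describes.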
6. What can be done?
Facebook chief executive officer Mark Zuckerberg initially played down the problem. But he announced on 18 November that Facebook will get input from journalists and fact-checkers on how to improve its algorithm. There are other ideas, too, like finding a way to warn users if a story might be false, or disrupting the economics of fake news.
7. What about Twitter?
It’s not getting hammered as hard on this issue. It’s not that fake news doesn’t get shared there, but Twitter shows users everything they sign up to follow in reverse-chronological order, while Facebook decides what goes into people’s news feeds based on its algorithm.
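The contrast with an engagement-ranked feed can be sketched as follows; again this is an illustrative toy, not Twitter's real implementation. A reverse-chronological feed simply sorts by timestamp, so no post gets an extra boost for being provocative.

```python
from datetime import datetime

def chronological_feed(posts: list[dict]) -> list[dict]:
    # Twitter-style ordering: newest first, no engagement weighting.
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

posts = [
    {"title": "Older viral hoax", "posted_at": datetime(2016, 11, 1)},
    {"title": "Newer routine update", "posted_at": datetime(2016, 11, 8)},
]
print([p["title"] for p in chronological_feed(posts)])
# → ['Newer routine update', 'Older viral hoax']
```

A hoax can still spread here through retweets, but the platform itself is not deciding which posts deserve more attention, which is why the criticism lands differently.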
8. Can an algorithm really tell what’s true and what’s false?
The Internet presents a spectrum of information, with hyper-partisan opinion stories masquerading as news, plus lots of satire and funny memes. What’s an algorithm to do? Facebook’s engineers have trained their algorithm to know that if something is really popular, it must be relevant. It might be easy for the company to suppress an outright hoax, but it’s harder to automate the decision on what to do about propaganda-like content meant to rile people up. Bloomberg