Is WhatsApp’s restriction on forwards enough to curb fake news?
WhatsApp’s new feature will restrict rampant resharing but more needs to be done to curb fake news
After the government directive to come up with more effective tools to curb fake news, Facebook-owned WhatsApp is introducing a new feature to prevent rampant forwarding of media messages, particularly in India, where users forward more messages, photos and videos than in any other country.
With this new feature in place, users won’t be able to forward a single message more than five times, whether to groups or to individual users. After five forwards, the quick forward button (highlighted with a curved arrow) next to the message will no longer appear, preventing users from sharing it further. WhatsApp suggested the limit on forwards could be lower in other countries, where people forward fewer messages.
The new feature is rolling out immediately to all Android and iOS devices that have received the latest WhatsApp update. WhatsApp has confirmed that the restriction will apply to all message types, including regular chats, videos, documents, photos and GIFs.
In an official blog post, WhatsApp states, “We believe that these changes – which we’ll continue to evaluate – will help keep WhatsApp the way it was designed to be: a private messaging app.”
More than 50 people have reportedly been killed in India in mob violence incited by fake messages forwarded on the platform, which boasts more than 200 million active users in the country alone; the victims include an engineer from Karnataka killed last week. WhatsApp has also been misused to influence public opinion during elections, and was reportedly used to inflame tensions by spreading fake messages during the Karnataka elections.
A restriction on forwards can be a more effective tool than some of the recently added features, such as labelling forwarded messages or increasing the control of admins over groups, but it is not enough. According to Prasanto K Roy, a technology policy and media professional, “Increasing friction for forwards is a good start, and will have some positive effect in slowing viral spread,” but it is not sufficient. “Fake news will still spread. When someone gets a fake video of what he believes to be an attack on his community, he will forward it to the 10 or 15 groups he’s part of, even if he has to do it in two batches now instead of one. So a video that inspires mob violence, and which would have spread to 100,000 people in a few hours, will take two or three times as long to reach that audience. The increased friction might reduce the velocity of some forwards, for instance an interesting piece someone has read. But the really viral stuff, the lynch-mob inspirations, the urgent warnings: those, people will take the effort to spread. No, this little bit of friction is not enough,” adds Roy.
He believes that WhatsApp should be able to filter messages, so that those spreading virally and wildly raise red flags and are deleted across its platform if they are found to be fake or dubious.