WhatsApp and its parent company Facebook have the widest reach of any social media platforms, and both have been criticized extensively for their part in disseminating misinformation and fake news. While WhatsApp has been rolling out various features to curb misinformation, Facebook has faced flak for not doing enough. Facebook Messenger is now borrowing a page from its sister app.
Facebook Messenger will restrict message forwarding on the platform, the company revealed in a blog post. Messages can now be forwarded to only five people or groups at a time. The company claims this move will slow the spread of viral misinformation and harmful content that has the potential to cause real-world harm.
The official blog post read, “We believe controlling the spread of misinformation is critical as the global COVID-19 pandemic continues and we head toward major elections in the US, New Zealand and other countries. We’ve taken steps to provide people with greater transparency and accurate information.”
The blog post further outlined the steps being put in place both for information related to COVID-19 and to encourage good voting practices in countries entering their election phases.
The statement continues, “Coronavirus (COVID-19) Community Hub gives people access to up-to-date information and resources designed to help them stay safe. And our Voting Information Center and voter registration notifications ensure people know how to register to vote and encourage them to make their voices heard. We are introducing a forwarding limit on Messenger to help curb the efforts of those looking to cause chaos, sow uncertainty or inadvertently undermine accurate information.”