While WhatsApp encrypts all communication, making it harder for authorities to monitor illegal activities, there is a lot of unencrypted content, and that is what allowed researchers at the Foundation to detect CSAM. (Reuters)

Circulation of child sexual abuse material still rampant on WhatsApp: report

  • In the last three months, WhatsApp has banned approximately 250,000 accounts each month suspected of sharing child sexual abuse material (CSAM)
  • WhatsApp is turning to all available unencrypted information including user reports to detect and prevent circulation of CSAM

New Delhi: Misuse of messaging platforms like WhatsApp and Telegram for illegal activities is well known. While these companies have been trying hard to curb such misuse, criminals seem to find new ways to exploit the platforms again and again. According to the findings of the Delhi-based Cyber Peace Foundation, chat groups on WhatsApp are still being used to share child sexual abuse material (CSAM).

“The members are first solicited publicly using these invite links and then the trusted ones (in this case, those who would actually engage in sharing CSAM) are called on to join a more private group. Numbers used to create groups are often virtual numbers used by Indian users. There are several such tactics being used by group admins and members," said Nitish Chandan, a cybersecurity expert and Manager, Training and Policy, Cyber Peace Foundation, in an email reply to Mint. The findings were first reported by The News Minute.

According to the Foundation’s report, many groups are created by people with Indian phone numbers, with their group descriptions in Hindi. Many of these groups also insist on the sharing of child pornography videos and no other links, while some require members to share a specified number of videos every day or face removal from the group. Their activities are not limited to the circulation of CSAM: many even solicit physical contact with children, with details like price, location and time.

When these groups were first discovered by the Foundation in an earlier study from last year, they were found to be using third-party applications to host invite links to these groups. When the matter was reported to Google, the third-party apps used for the purpose were taken down from the Play Store in December.

For its part, WhatsApp is aware of the misuse of its platform for distribution of CSAM and is constantly stepping up its capabilities to keep the platform safe. In an official statement to Mint, a company spokesperson said, “WhatsApp cares deeply about the safety of our users and we have no tolerance for child sexual abuse. We rely on the signals available to us, such as group information, to proactively detect and ban accounts suspected of sending or receiving child abuse imagery. We have carefully reviewed this report to ensure such accounts have been banned from our platform."

To be sure, WhatsApp is turning to all available unencrypted information, including user reports, to detect and prevent the circulation of CSAM. In the last three months, WhatsApp has banned approximately 250,000 accounts each month suspected of sharing such content. The company uses a photo-matching technology called PhotoDNA that proactively scans profile photos for CSAM. If its systems detect any such image, it bans the user and the accounts within that group. The company also encourages users to report such content, or any other problematic content, to the company.
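In broad strokes, this kind of detection works by fingerprinting an image and checking the fingerprint against a list of known CSAM hashes. The sketch below is purely illustrative and is not WhatsApp's actual system: real PhotoDNA uses a proprietary perceptual hash that tolerates resizing and re-encoding, whereas here a plain SHA-256 digest stands in, and the `KNOWN_BAD_HASHES` entry is a made-up sample (the hash of empty bytes).

```python
import hashlib

# Hypothetical hash list for illustration only; the sole entry is
# SHA-256 of the empty byte string, used as a stand-in sample.
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image. A real system would use a perceptual
    hash (e.g. PhotoDNA); SHA-256 only matches byte-identical files."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_ban(profile_photo: bytes) -> bool:
    """Return True if the photo's fingerprint is on the known list,
    which in the described flow triggers a ban of the account."""
    return image_fingerprint(profile_photo) in KNOWN_BAD_HASHES

print(should_ban(b""))    # matches the sample entry -> True
print(should_ban(b"ok"))  # not on the list -> False
```

The key design point is that only hashes are compared, so the service never needs to store or redistribute the abusive images themselves.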

While WhatsApp encrypts all communication, which makes it harder for authorities to monitor illegal activities, there is a lot of unencrypted content, and that is what allowed researchers at the Foundation to detect CSAM. They are also helping WhatsApp identify these groups, as mentioned earlier. Chandan points out that encryption makes it harder to proactively filter content, but such technology exists in the interest of privacy and security. Most of the information shared in the report is, however, not from personal chats or group chats; it is public information, like group icons and descriptions, that has been called out.

WhatsApp isn’t the only platform in India being used for such activities. Chandan claims the Foundation has done similar investigations into other platforms, such as the popular live-streaming app Bigo Live, and found that the icon of one group circulating CSAM on WhatsApp was actually the picture of a broadcaster shown in one of the screenshots on the first page of Bigo Live.

Chandan adds that this is why the Foundation is calling for a software and content regulator in India, to at least set out basic norms and compliance standards, citing the example that Bigo Live is rated safe for children aged 12 and above in India, whereas in the US it is rated for ages 17 and above.
