Apple responds to criticism about new child safety features

Communications Safety uses “on-device” machine learning to identify and blur sexually explicit images, which means it does not connect to any cloud server. It will notify a parent of a child who views or sends such an image

Apple on 6 August announced new software that will scan images on its devices, its messaging platform iMessage and its cloud platform iCloud to look for child sexual abuse material. (Photo: Bloomberg)

NEW DELHI: Tech giant Apple has responded to some of the concerns privacy advocates have raised about its new child safety tools. The company announced new software on 6 August that will scan images on its devices, its messaging platform iMessage and its cloud platform iCloud to look for child sexual abuse material (CSAM). The system has been widely criticized by privacy advocates such as the Electronic Frontier Foundation and Edward Snowden, and by WhatsApp head Will Cathcart, among others.

In an FAQ document, Apple claimed that the tools do not break the privacy assurances the company has given its users so far, that they do not break end-to-end encryption, and that the company will not gain access to any user communications because of the new software. The child safety feature on iMessage is called Communications Safety, and it is distinct from CSAM detection on iCloud.

Notably, Communications Safety uses “on-device” machine learning to identify and blur sexually explicit images, which means it does not connect to any cloud server. It will notify the parent of a child who views or sends such an image. CSAM detection on iCloud, on the other hand, scans images when they are uploaded to Apple’s cloud service. It then notifies Apple, which verifies the alert before contacting the authorities.
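
To picture the matching step, the sketch below shows a toy version of an on-device lookup against a fixed database of known-image fingerprints, checked only when a photo is queued for iCloud upload. It is an assumption-laden illustration, not Apple's implementation: Apple uses a proprietary perceptual hash and a cryptographic matching protocol, whereas this stand-in uses an ordinary SHA-256 digest, and every name in it is hypothetical.

```python
# Illustrative sketch only: the hash function, digest set and names are
# hypothetical stand-ins, not Apple's proprietary perceptual-hash system.
import hashlib

# Fingerprints of known CSAM images, as supplied by child safety
# organizations. A plain SHA-256 digest stands in for a perceptual hash.
KNOWN_CSAM_HASHES: set[str] = set()  # populated from the vetted database

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image on the device, before it is uploaded."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_csam(image_bytes: bytes) -> bool:
    """On-device check, run only for photos queued for iCloud upload."""
    return fingerprint(image_bytes) in KNOWN_CSAM_HASHES
```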

Further, the company clarified that CSAM detection will not scan every photo stored on a user’s device, only the ones uploaded to iCloud. Apple devices can, however, upload photos to the cloud automatically, a feature users can turn off if they want to. Turning off the “iCloud Photos” feature disables CSAM detection too.
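
In effect, the toggle is a precondition on scanning: a photo that never leaves the device is never checked. Here is a minimal sketch of that gating, using a hypothetical settings object; the flag and function names are illustrative, not real Apple APIs.

```python
# Minimal sketch of the gating described above; the settings flag and
# function names are hypothetical, not real Apple APIs.
from dataclasses import dataclass

@dataclass
class PhotoSettings:
    icloud_photos_enabled: bool  # the "iCloud Photos" toggle

def should_scan_before_upload(settings: PhotoSettings) -> bool:
    # Per Apple's FAQ, disabling iCloud Photos disables CSAM detection:
    # nothing is fingerprinted or matched for photos that stay local.
    return settings.icloud_photos_enabled

# With the toggle off, no scan happens at all.
assert should_scan_before_upload(PhotoSettings(icloud_photos_enabled=False)) is False
```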

“Existing techniques as implemented by other companies scan all user photos stored in the cloud. This creates privacy risk for all users. CSAM detection in iCloud Photos provides significant privacy benefits over those techniques by preventing Apple from learning about photos unless they both match to known CSAM images and are included in an iCloud Photos account that includes a collection of known CSAM,” the company claimed in the FAQ.
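
The “collection” condition in that quote implies a threshold: a single match on its own is not supposed to reveal anything to Apple, and an account is surfaced only once enough photos have matched. The toy counter below stands in for Apple's actual cryptographic threshold mechanism; the threshold value is a made-up placeholder, not a figure from the article.

```python
# Toy illustration of the "collection of known CSAM" condition quoted above.
# A plain counter stands in for Apple's cryptographic threshold mechanism,
# and the threshold value is a hypothetical placeholder.
MATCH_THRESHOLD = 30  # illustrative only

def account_exceeds_threshold(match_flags: list[bool]) -> bool:
    """True only once enough uploaded photos matched known CSAM hashes."""
    return sum(match_flags) >= MATCH_THRESHOLD

# One or two matches keep the account below the bar; Apple learns nothing.
assert account_exceeds_threshold([True, False, True]) is False
```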

WhatsApp head Will Cathcart had said Apple’s new system could be misused by governments, or by the company itself, by adding different images to the CSAM database that iCloud photos are matched against. Apple responded that its systems are designed to prevent this. “CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations,” the company said.

“This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities,” the company added.

Perhaps more importantly, Apple claims that it will refuse any demands from governments to add images to the CSAM database. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before and have steadfastly refused those demands. We will continue to refuse them in the future,” the company said.

ABOUT THE AUTHOR
Prasid Banerjee
An engineering dropout, Prasid Banerjee has reported on technology in India for various publications. He covers technology in text and audio, focusing on core aspects such as consumer impact, policy and the future.
Published: 10 Aug 2021, 03:17 PM IST