Experts share concerns over Apple’s new child-safety tool

Privacy advocates, tech executives and experts have raised concerns over Apple’s new child-safety measures for its devices. The company said on 6 August that it will use built-in software to scan images stored on users’ devices, as well as on its messaging and cloud platforms, to flag child sexual abuse material (CSAM).

Experts cautioned that the system amounts to building a back door into Apple’s software, which cybercriminals could exploit in the long run. A back door is a piece of code that allows unauthorized access to user devices and can be used for illegal surveillance.
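Reports describe the system as comparing fingerprints ("hashes") of a user’s images against a database of known CSAM hashes before photos are uploaded. As a rough illustration only, the sketch below shows the general idea of blocklist matching using a cryptographic hash; Apple’s actual system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding, which a cryptographic hash like SHA-256 does not. The `BLOCKLIST` contents here are hypothetical placeholders.

```python
import hashlib

# Hypothetical blocklist of known-bad image fingerprints.
# (Illustrative only: a cryptographic hash such as SHA-256 matches
# only byte-identical files, unlike Apple's perceptual NeuralHash.)
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def flag_image(image_bytes: bytes) -> bool:
    """Return True if this image's fingerprint appears on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

print(flag_image(b"known-bad-image-bytes"))    # matches the blocklist
print(flag_image(b"harmless-vacation-photo"))  # no match
```

Critics’ worry, as the quotes below make clear, is less about the matching step itself than about who controls the blocklist: the same mechanism flags whatever hashes are loaded into it.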

“To say that we are disappointed by Apple’s plans is an understatement. Apple has historically been a champion of end-to-end encryption, for all of the same reasons that EFF has articulated time and time again," said digital rights group Electronic Frontier Foundation (EFF) in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the firm’s leadership in privacy and security," it added.

Whistleblower Edward Snowden also criticized Apple for the change. Snowden was among the 3,000 signatories of an open letter asking Apple to withdraw the update. Apple later circulated an internal memo telling employees that the company expects pushback against the update. “We know that the days to come will be filled with the screeching voices of the minority," the memo said. Snowden called the memo “unbelievable" in a tweet. “Apple is now circulating a propaganda letter describing the internet-wide opposition to their decision to start checking the private files on every iPhone against a secret government blacklist as ‘the screeching voices of the minority’. This has become a scandal," he added.

The new child-safety tools will be governed only by US law for now, but this is a platform-level update, meaning it could easily be extended to other countries where Apple operates.

“If you look at it purely from the letter of the law, without the consumer’s consent, this algorithm is akin to surveillance or a malicious code being embedded in the phone because it engages in targeted scanning, albeit for ethical goals. Looking at Indian cybersecurity law, if Apple runs this algorithm on data stored in a cloud or phone as an update and without express consent, then that qualifies as a cybersecurity incident," said Akash Karmakar, a partner at the Law Offices of Panag & Babu.

“The mitigating factor is that you agree to Apple’s terms and conditions while setting up the phone, which is how they acquire consent. Although I do not think that the consumer has a real choice to decline to accept the terms and return the phone or review the terms at an Apple Store prior to purchasing the phone," he added.

Apple may have a small market share in India, but it’s a very popular brand among youngsters and the C-Suite community. While analysts peg the company’s market share in the overall smartphone market here at 2-3%, it accounts for almost half of the total premium smartphones sold in India. During its June quarter earnings call, Apple reported “double-digit" growth in India.

Meanwhile, messaging giant WhatsApp’s chief executive Will Cathcart also opposed Apple’s move. In a series of tweets, Cathcart said WhatsApp won’t adopt a system like Apple’s, though it also intends to tackle CSAM. “Instead of focusing on making it easy for people to report content that’s shared with them, Apple has built software that can scan all the private photos on your phone—even photos you haven’t shared with anyone. That’s not privacy," Cathcart said in a tweet.

Cathcart also said the system Apple has built could “very easily" allow the company to “scan private content for anything they or a government decides it wants to control". In addition, he pointed out that different countries will have different definitions of what’s acceptable, suggesting that the system could be used to enhance surveillance on users and break privacy norms.
