
Apple Confirms It Has Already Been Scanning iCloud Mail to Combat Child Exploitation

© REUTERS / Mike Segar | The Apple Inc logo is seen hanging at the entrance to the Apple store on 5th Avenue in Manhattan, New York, U.S., October 16, 2019.
Eric Friedman, head of Apple's Fraud Engineering Algorithms and Risk unit, puzzled many earlier this month when he claimed that the tech giant has "the greatest platform for distributing child porn." His curious statement prompted questions about user privacy, the scanning of users' devices and the details of the company's new anti-child abuse effort.
As Apple seeks to detect and report potential child abuse imagery, the tech giant also appears to be coming clean about the scope and duration of its surveillance of users. 
The company confirmed to Apple-centered outlet 9To5Mac that its plan to periodically scan users' iCloud photos and iCloud backups for Child Sexual Abuse Material (CSAM) is only partially new, as Apple has already been routinely scanning both incoming and outgoing iCloud mail for such content. 
Apple's email surveillance has been active since 2019 and applies to anyone using the Mail app on an iOS device. 
The company's admission came in response to 9To5Mac probing the anti-fraud chief's claim that Apple was "the greatest platform for distributing child porn." 
The outlet, like many privacy advocates, questioned how Apple would know about the distribution of such content without conducting some form of surveillance. 
Apple did not address Friedman's comments directly, but it did point to the scanning of users' iCloud mail, as well as the scanning of some "other data," which it said does not include iCloud backups. 
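For readers unfamiliar with how such scanning typically works: systems of this kind generally compare a "fingerprint" (hash) of each image or attachment against a database of hashes of previously identified abuse material supplied by child-safety organisations. The short Python sketch below illustrates only that general idea; it is an illustrative assumption, not Apple's actual implementation, which is understood to rely on perceptual image-matching rather than the plain cryptographic hashes used here. All names and hash values in the sketch are hypothetical.

import hashlib

# Hypothetical database of hex digests of known abusive images.
# Real systems use hash lists provided by child-safety organisations;
# the value below is a placeholder, not a real entry.
KNOWN_MATERIAL_HASHES = {
    "0" * 64,
}

def fingerprint(data: bytes) -> str:
    # A plain SHA-256 digest stands in for the perceptual hash a real scanner would use.
    return hashlib.sha256(data).hexdigest()

def attachment_is_flagged(data: bytes) -> bool:
    # An attachment is flagged only if its fingerprint matches a known entry.
    return fingerprint(data) in KNOWN_MATERIAL_HASHES

def scan_message(attachments: list[tuple[str, bytes]]) -> list[str]:
    # Return the names of attachments whose fingerprints match the database.
    return [name for name, data in attachments if attachment_is_flagged(data)]

if __name__ == "__main__":
    # A benign example message with two attachments; neither matches, so nothing is flagged.
    print(scan_message([("photo.jpg", b"holiday photo"), ("note.txt", b"hello")]))

Because only exact or near matches to an existing hash list can be detected, such a system can, in principle, say nothing about material that has never been catalogued, which is one reason critics question how Friedman could assess the overall scale of the problem on Apple's platform.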
Several hundred CSAM reports are reportedly submitted by the company each year. 
Based on Apple's response, it is possible the anti-fraud chief was using known data to make inferences about other exploitative content that exists on its platform. 
Apple's planned rollout of new surveillance capabilities has also been slammed by an international coalition of civil and policy rights groups, including the American Civil Liberties Union, Privacy International, the Electronic Frontier Foundation, and Access Now. 
The group asserted that the expanded capabilities could be "used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children." 
Concerns about the security of iMessage's end-to-end encryption were also raised. 
"Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit," the coalition argued.  