San Francisco, United States:
Apple said on Thursday that iPhones and iPads will soon begin detecting images of child sexual abuse and reporting them when they are uploaded to its online storage in the United States, a move that is raising concern among privacy advocates.
“We want to help protect children from predators that use communication tools to recruit and exploit them, and limit the distribution of child sexual abuse material (CSAM),” Apple said in an online post.
With the new technology, the software that powers Apple's mobile devices will match offending photos on a user's phone against a database of known CSAM images provided by child safety organizations, then flag the images when they are uploaded to Apple's iCloud online storage, the company said.
However, several digital rights organizations say the changes to Apple’s operating systems create a potential “back door” to gadgets that could be exploited by governments or other groups.
Apple countered that it will not have direct access to the images, and emphasized steps it has taken to protect privacy and security.
The Silicon Valley-based tech giant said the photo matching would be “enabled by a cryptographic technology” that determines “if there is a match without revealing the result” unless the image in question contains child sexual abuse material.
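Apple has not published the full protocol in this statement, but the description maps onto a simple pattern: fingerprint each photo and compare that fingerprint against a database of known-bad values at upload time. The Python sketch below illustrates only that pattern; the hash function, database contents, and function names are all hypothetical, and it omits the cryptographic layer Apple describes, which additionally keeps non-matching results hidden from the company.

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM images, as would be
# supplied by child safety organizations. Apple's described system uses
# perceptual image fingerprints, not plain cryptographic hashes like these.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image. A production system would use a
    perceptual hash so that resized or re-encoded copies still match;
    SHA-256 is used here only to keep the sketch self-contained."""
    return hashlib.sha256(image_bytes).hexdigest()

def flag_on_upload(image_bytes: bytes) -> bool:
    """Check an image against the known-bad database at upload time.
    Returns True if the image should be flagged for review."""
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES

if __name__ == "__main__":
    sample = b"placeholder bytes standing in for an image file"
    print(flag_on_upload(sample))  # False: fingerprint not in the database
```

In this simplified form, only uploads whose fingerprints appear in the database are flagged; everything else passes through, which is the property Apple's cryptographic matching is meant to enforce without revealing intermediate results.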
Apple said it will report such images to the National Center for Missing and Exploited Children, which works with law enforcement.
India McKinney and Erica Portnoy of the digital rights group Electronic Frontier Foundation said in a statement that “Apple’s compromise on end-to-end encryption may appease government agencies in the United States and abroad, but it’s a shocking turn of events for users who have relied on the company’s leadership in privacy and security.”
Watch out for messages
The new image-detection feature is part of a suite of tools coming to Apple’s mobile devices, according to the company.
Apple’s texting app, Messages, will use machine learning to recognize sexually explicit photos and warn children and their parents when such images are received or sent, the company said in the statement.
“When receiving this type of content, the photo will be blurred and the child will be warned,” Apple said.
“As an extra precaution, the child may also be told that, to make sure they are safe, their parents will be notified if they do see it.”
Similar precautions will be taken if a child tries to send a sexually explicit photo, Apple said.
Messages will use on-device machine learning to analyze image attachments and determine whether they are sexually explicit, Apple said.
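To make that on-device flow concrete, here is a minimal Python sketch of the pattern the statement describes: a classifier runs locally, and its score drives the blur-and-warn behavior. The classifier stub, threshold, and field names are all assumptions made for illustration; Apple has not published its model or thresholds.

```python
# Assumed cutoff for treating an image as explicit; purely illustrative.
EXPLICIT_THRESHOLD = 0.9

def classify_explicitness(image_bytes: bytes) -> float:
    """Stand-in for the on-device machine learning model. A real
    implementation would run a trained image classifier locally and
    return a probability that the image is sexually explicit."""
    return 0.0  # placeholder score

def handle_incoming_photo(image_bytes: bytes, is_child_account: bool) -> dict:
    """Decide, entirely on-device, how to present an incoming photo."""
    score = classify_explicitness(image_bytes)
    flagged = is_child_account and score >= EXPLICIT_THRESHOLD
    return {
        "blur_photo": flagged,           # photo is shown blurred
        "warn_child": flagged,           # child sees a warning first
        "offer_parent_notice": flagged,  # parents may be told if it is viewed
    }
```

Because the analysis runs on the device itself, no image in this sketch ever leaves the phone for classification, which matches the on-device framing in Apple's statement.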
The feature is on its way to the latest Mac operating system, as well as iOS.
Meanwhile, personal assistant Siri will be taught to “intervene” when users try to seek out topics related to child sexual abuse, Apple said.
Greg Nojeim of the Center for Democracy and Technology in Washington, DC said that “Apple is replacing its industry-standard end-to-end encrypted messaging system with a surveillance and censorship infrastructure.”
This, he said, would leave users “vulnerable to abuse and scope creep, not just in the United States, but around the world.”
“Apple must refrain from these changes and restore its users’ confidence in the security and integrity of their data on Apple devices and services.”
Apple has built its reputation for defending privacy on its devices and services, despite pressure from politicians and police to access people’s data in the name of fighting crime or terrorism.
“Child exploitation is a serious problem, and Apple isn’t the first tech company to change its privacy-protection stance in an effort to fight it,” said EFF’s McKinney and Portnoy.
“In the end, even a thoroughly documented, carefully thought out and narrowly defined back door is still a back door,” they added.