Apple on Tuesday appealed its loss in a copyright case against security startup Corellium, whose tools help researchers examine software such as Apple’s planned new method of detecting child sexual abuse images.
A federal judge last year rejected Apple’s copyright claims against Corellium, which makes virtual iPhones that researchers use to probe how the tightly locked-down devices work.
Security experts are among Corellium’s core customers, and the flaws they have discovered have been reported to Apple for cash bounties and used elsewhere, including by the FBI in cracking the phone of a mass shooter who killed several people in San Bernardino, California.
Apple makes its software difficult to research, and the specialized research phones it offers to pre-selected experts have many limitations. The company declined to comment.
The appeal came as a surprise because Apple had just settled other claims against Corellium under the Digital Millennium Copyright Act, avoiding a trial.
Experts said they were also surprised that Apple revived a fight against a major supplier of research tools just after arguing that researchers would serve as a check on its controversial plan to scan customer devices.
“Enough is enough,” said Corellium chief executive Amanda Gorton. “Apple cannot pretend to hold itself accountable to the security research community and at the same time try to make that research illegal.”
Under Apple’s plan announced earlier this month, software will automatically check photos scheduled to be uploaded from phones or computers to iCloud online storage to see if they match digital identifiers of known child abuse images. If enough matches are found, Apple employees will check if the images are illegal, then cancel the account and refer the user to law enforcement.
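In outline, this is a threshold scheme over image identifiers: each photo queued for upload is matched against a database of known-image hashes, and only accounts that accumulate enough matches are escalated to human review. The sketch below is a minimal illustration of that idea, not Apple’s actual system; the real design uses a perceptual hash (NeuralHash) and cryptographic private-set-intersection rather than plain SHA-256 lookups, and the database contents and threshold value here are placeholders.

```python
import hashlib

# Placeholder database of digital identifiers of known abuse images.
# In the real system these identifiers come from child-safety organizations
# and are matched via a cryptographic protocol, not a plain set lookup.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 30  # hypothetical; the actual threshold is not public


def image_hash(image_bytes: bytes) -> str:
    """Stand-in identifier. A perceptual hash would tolerate resizing and
    re-encoding of the image; SHA-256, used here for simplicity, would not."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photos: list[bytes]) -> int:
    """Count photos queued for upload whose identifier is in the database."""
    return sum(1 for p in photos if image_hash(p) in KNOWN_IMAGE_HASHES)


def should_flag_for_review(photos: list[bytes]) -> bool:
    """Escalate to human review only once the match count crosses the threshold,
    so isolated false matches on a single account are never surfaced."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

The threshold is the privacy-relevant design choice: no single match reveals anything about an account, and human reviewers only see accounts with repeated matches against the database.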
“‘We will prevent abuse of these child protection mechanisms by relying on people bypassing our copy protection mechanisms’ is a rather internally incoherent argument,” tweeted David Thiel of the Stanford Internet Observatory.
Digital rights groups have objected to the plan, both because Apple has marketed itself as committed to user privacy and because other companies scan content only after it has been stored or shared online.
One of their main arguments was that governments could theoretically force Apple to scan for banned political material as well, or to target a single user.
Defending the program, Apple executives said researchers could verify the list of banned images and examine what data was sent to the company, keeping it honest about what it was looking for and from whom.
One executive said such reviews made the approach better for privacy overall than would have been possible had the scanning occurred in Apple’s own storage, where its code is kept secret.
© Thomson Reuters 2021