Corellium Is Double-Checking Apple’s New CSAM Plan
- By Dawna M. Roberts
- Published: Sep 14, 2021
- Last Updated: Mar 18, 2022
Apple has recently been in the news over its announcement that it will soon scan iCloud content for images indicating child abuse. Security company Corellium is offering to pay security researchers to verify the validity of this new feature.
What Does the New CSAM Plan Change Mean?
According to Apple, “protecting children is an important responsibility.” To that end, the company plans to implement changes to its messaging service. For example, when an underage user tries to send a sexually explicit photo, they will see a warning. If an underage user receives a similar picture, it will be blurred and accompanied by a warning that the image “may be sensitive.” The user can still choose to view it, but another message will pop up.
The feature described above uses on-device machine learning to differentiate between images that are sexually explicit and those that are not. If the system detects abusive images, Apple will then alert the National Center for Missing and Exploited Children.
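As a rough illustration of the client-side flow described above, here is a minimal sketch in Python. The classifier call, its score range, and the threshold are all hypothetical stand-ins; Apple has not published how its on-device model works.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GateDecision:
    blur: bool
    warning: Optional[str]

# Hypothetical stand-in for Apple's on-device model; the real model,
# its score range, and its decision threshold are not public.
def classify_sensitivity(image_bytes: bytes) -> float:
    return 0.0  # placeholder score in [0.0, 1.0]

SENSITIVITY_THRESHOLD = 0.8  # assumed value, for illustration only

def gate_incoming_image(image_bytes: bytes, user_is_minor: bool) -> GateDecision:
    """Blur a likely explicit image sent to an underage user and attach a warning."""
    score = classify_sensitivity(image_bytes)
    if user_is_minor and score >= SENSITIVITY_THRESHOLD:
        return GateDecision(blur=True, warning="This image may be sensitive.")
    return GateDecision(blur=False, warning=None)
```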
Another feature getting a lot of press includes Apple scanning all devices and iCloud storage for Child Sexual Abuse Material, or CSAM. Despite Apple’s claims that this will in no way infringe upon users’ privacy, critics say that there is no way that can be true.
Apple explains,
“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the unreadable set of known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. Private set intersection (PSI) allows Apple to learn if an image hash matches the known CSAM image hashes without learning anything about image hashes that do not match. PSI also prevents the user from learning whether there was a match.”
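Apple has not published its PSI construction, but the core idea can be sketched with a classic Diffie-Hellman-style commutative blinding: both sides raise hashed items to secret exponents, so blinded values can be compared without exposing the underlying hashes. Everything below (the prime, the stand-in hash values) is illustrative only, and unlike Apple’s protocol, this toy version lets the querying side see how many items matched.

```python
import hashlib
import secrets

# Illustrative prime field; a production PSI would use an elliptic-curve group.
P = 2**127 - 1

def hash_to_group(item: bytes) -> int:
    """Map an item (e.g., a perceptual image hash) into the group mod P."""
    return int.from_bytes(hashlib.sha256(item).digest(), "big") % P

a = secrets.randbelow(P - 3) + 2  # device's secret exponent
b = secrets.randbelow(P - 3) + 2  # server's secret exponent

device_hashes = [b"photo-1", b"photo-2"]    # stand-in values
known_database = [b"photo-2", b"photo-9"]   # stand-in values

# Device blinds its hashes: H(x)^a mod P. The server cannot read these.
blinded_device = [pow(hash_to_group(x), a, P) for x in device_hashes]

# Server re-blinds the device's values to H(x)^(a*b), and blinds its own
# set to H(y)^b.
reblinded_device = [pow(v, b, P) for v in blinded_device]
blinded_database = [pow(hash_to_group(y), b, P) for y in known_database]

# Raising the server's set to a puts both sides at H(.)^(a*b): equal items
# collide, yet neither party ever saw the other's raw hashes.
comparable_database = {pow(v, a, P) for v in blinded_database}
matches = sum(1 for v in reblinded_device if v in comparable_database)
print(f"matching items: {matches}")  # 1 (the shared stand-in "photo-2")
```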
To make things even more confusing, Apple continued with, “For example, if a secret is split into 1,000 shares, and the threshold is 10, then the secret can be reconstructed from any 10 of the 1,000 shares. However, if only nine shares are available, then nothing is revealed about the secret.”
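What the quote describes is threshold secret sharing. The standard construction is Shamir’s scheme: the secret becomes the constant term of a random degree-(threshold − 1) polynomial over a finite field, each share is a point on that polynomial, any `threshold` points pin the polynomial down, and fewer points reveal nothing. Below is a minimal sketch using the numbers from Apple’s example; this is the textbook scheme, not Apple’s actual code.

```python
import secrets

PRIME = 2**127 - 1  # illustrative field for the polynomial arithmetic

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def eval_at(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):  # Horner's rule
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, eval_at(x)) for x in range(1, n + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

secret = 123456789
shares = make_shares(secret, threshold=10, n=1000)
print(reconstruct(shares[:10]) == secret)      # True: any 10 of 1,000 shares work
print(reconstruct(shares[500:510]) == secret)  # True: which 10 doesn't matter
print(reconstruct(shares[:9]) == secret)       # False (overwhelmingly likely):
                                               # 9 shares reveal nothing
```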
What is Corellium Doing?
According to 9to5Mac, “Security company Corellium is offering to pay security researchers to check Apple CSAM claims, after concerns were raised about both privacy and the potential of the system for misuse by repressive governments.”
There has been much talk about the system’s potential for abuse and about weaknesses that could open a backdoor. Corellium wants to ensure that before that door is opened, someone secures the lock.
Therefore, Corellium is offering $5,000 grants, plus a year of free access to its iOS virtualization platform, so that security researchers can thoroughly vet the new system before it rolls out across the country.
9to5Mac adds, “Since that initial announcement, Apple has encouraged the independent security research community to validate and verify its security claims. As Apple’s SVP of Software Engineering Craig Federighi stated in an interview with the Wall Street Journal, ‘Security researchers are constantly able to introspect what’s happening in Apple’s [phone] software, so if any changes were made that were to expand the scope of this in some way—in a way that we had committed to not doing—there’s verifiability, they can spot that that’s happening.’” Supporters commend Apple for opening its operating system to scrutiny by third-party evaluators. Corellium believes its platform is uniquely capable of thoroughly testing and vetting the new features before they go online.
The company is accepting grant applications from security researchers; approved applicants must agree to specific terms.