The anti-CSAM features will first appear in three areas: Messages, iCloud, and Siri and Search.
Messages
The Messages app will now include tools that warn children and their parents when sexually explicit images are sent or received. If a child receives such a message, the image will be blurred and the child will be warned. The child will be reassured that it is okay not to view the photo, and their parents will be notified if the child chooses to view it anyway. This is done using on-device intelligence, so the user's privacy is not violated. Similar protections apply if a child attempts to send sexually explicit photos: the child will be warned before the photo is sent, and the parents can receive a notification if the child chooses to send it.
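To make the flow concrete, here is a minimal sketch of the kind of on-device decision logic described above. The types and names (IncomingImage, MessageSafetyPolicy, and so on) are hypothetical illustrations for this article, not Apple's actual API.

```swift
import Foundation

struct IncomingImage {
    let data: Data
    let flaggedAsExplicit: Bool   // assume an on-device classifier has set this
}

enum ChildChoice {
    case declineToView
    case chooseToView
}

struct MessageSafetyPolicy {
    let notifyParentsOnView: Bool

    /// Decides whether the image stays blurred and whether the parents are notified.
    func handle(_ image: IncomingImage, childChoice: ChildChoice) -> (blurred: Bool, parentNotified: Bool) {
        guard image.flaggedAsExplicit else {
            return (blurred: false, parentNotified: false)
        }
        switch childChoice {
        case .declineToView:
            // The child is reassured it is fine not to view the photo; nothing else happens.
            return (blurred: true, parentNotified: false)
        case .chooseToView:
            // Parents are informed only if the child chooses to see the image.
            return (blurred: false, parentNotified: notifyParentsOnView)
        }
    }
}

let policy = MessageSafetyPolicy(notifyParentsOnView: true)
let image = IncomingImage(data: Data(), flaggedAsExplicit: true)
print(policy.handle(image, childChoice: .declineToView))  // (blurred: true, parentNotified: false)
print(policy.handle(image, childChoice: .chooseToView))   // (blurred: false, parentNotified: true)
```

The key point, as Apple describes it, is that both the classification and the decision happen on the device itself.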
Siri and Search
Siri and Search will now help guide children and parents to stay safe online and will include more resources on the topic. Users can also ask Siri for help reporting CSAM and will be guided through filing a report.
Siri will also intervene when users attempt searches related to CSAM: they will be informed that interest in the topic can be harmful and problematic, and will be pointed to resources that can help with the issue.
iCloud and CSAM detection
The most prominent part of Apple's effort is its CSAM detection system, which will try to match your images against a list of known CSAM image hashes provided by the US National Center for Missing and Exploited Children (NCMEC) and other child safety organizations before an image is stored in iCloud Photos.
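The sketch below illustrates the general shape of on-device matching against a known-hash list before upload. Apple's real system uses a perceptual hash (NeuralHash) and private set intersection, as described in the quotes that follow; this simplified illustration just uses SHA-256 over the raw bytes, and the type names are invented for this article.

```swift
import Foundation
import CryptoKit

struct KnownHashDatabase {
    // In practice the hash list is provided by NCMEC and other child safety organizations.
    let knownHashes: Set<String>

    func matches(_ imageData: Data) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

func prepareForUpload(_ imageData: Data, database: KnownHashDatabase) {
    // In Apple's design the match result is not acted on directly; it is encoded
    // into an encrypted safety voucher that is uploaded alongside the image.
    let matched = database.matches(imageData)
    print("Match result recorded in safety voucher: \(matched)")
}
```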

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image,” Apple said.
“Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”
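Threshold secret sharing is a general cryptographic technique, not something unique to Apple. The sketch below shows a Shamir-style scheme in which no individual share reveals anything, but any `threshold` shares together recover the secret; it is a minimal illustration of the concept, not Apple's implementation, and the field size and values are chosen only for readability.

```swift
import Foundation

let prime = 2_147_483_647  // 2^31 - 1; all arithmetic is done modulo this prime

func modPow(_ base: Int, _ exponent: Int, _ modulus: Int) -> Int {
    var result = 1, b = base % modulus, e = exponent
    while e > 0 {
        if e & 1 == 1 { result = result * b % modulus }
        b = b * b % modulus
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (valid because the modulus is prime).
func modInverse(_ a: Int, _ modulus: Int) -> Int {
    return modPow(a, modulus - 2, modulus)
}

/// Split `secret` into `count` shares, any `threshold` of which reconstruct it.
func split(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    // Random polynomial of degree threshold-1 with the secret as the constant term.
    let coefficients = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<prime) }
    return (1...count).map { x in
        var y = 0
        for (power, c) in coefficients.enumerated() {
            y = (y + c * modPow(x, power, prime)) % prime
        }
        return (x: x, y: y)
    }
}

/// Recover the secret from at least `threshold` shares via Lagrange interpolation at x = 0.
func reconstruct(shares: [(x: Int, y: Int)]) -> Int {
    var secret = 0
    for (i, share) in shares.enumerated() {
        var numerator = 1, denominator = 1
        for (j, other) in shares.enumerated() where i != j {
            numerator = numerator * (prime - other.x) % prime                    // (0 - x_j) mod p
            denominator = denominator * ((share.x - other.x + prime) % prime) % prime
        }
        let term = share.y * numerator % prime * modInverse(denominator, prime) % prime
        secret = (secret + term) % prime
    }
    return secret
}

let shares = split(secret: 123_456_789, threshold: 3, count: 5)
print(reconstruct(shares: Array(shares.prefix(3))))  // 123456789: threshold met
print(reconstruct(shares: Array(shares.prefix(2))))  // wrong value: below threshold, the secret stays hidden
```

The intuition carried over to Apple's description: the material needed to interpret the safety vouchers is only recoverable once enough matching vouchers exist.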
Apple cannot normally view the images until an account crosses the set threshold of matches. When that happens, Apple will manually review the hashes, vouchers, and associated metadata. If the content is confirmed to be CSAM, Apple will disable the account and send a report to NCMEC.
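Put together, the review workflow looks roughly like the sketch below. The names (SafetyVoucher, ReviewOutcome) and the threshold value are hypothetical placeholders, not Apple's actual parameters.

```swift
import Foundation

struct SafetyVoucher {
    let encryptedMatchResult: Data   // only interpretable once the threshold is crossed
}

enum ReviewOutcome {
    case belowThreshold              // Apple cannot view anything yet
    case reviewedNotCSAM             // manual review did not confirm CSAM
    case accountDisabledAndReported  // confirmed CSAM: account disabled, report sent to NCMEC
}

func evaluate(vouchers: [SafetyVoucher],
              threshold: Int,
              manualReviewConfirmsCSAM: ([SafetyVoucher]) -> Bool) -> ReviewOutcome {
    guard vouchers.count >= threshold else {
        return .belowThreshold
    }
    // Only past the threshold can a human reviewer examine the vouchers and metadata.
    return manualReviewConfirmsCSAM(vouchers) ? .accountDisabledAndReported : .reviewedNotCSAM
}

let vouchers = (0..<5).map { _ in SafetyVoucher(encryptedMatchResult: Data()) }
let outcome = evaluate(vouchers: vouchers, threshold: 30) { _ in true }
print(outcome)   // belowThreshold: with only 5 vouchers nothing can be reviewed
```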
Backdoor?
Apple’s approach has drawn criticism that the Cupertino giant is preparing to add a backdoor to its services. The fact that Apple can manually review the flagged images makes the feature controversial, given that the company once refused the FBI’s request to create a backdoor to access a terrorist’s iPhone.
Apple has some of the best privacy practices in the industry, and we believe it will not use this technology to collect user data. For those who do not trust it, you can stop Apple from scanning your photos by not using iCloud Photos.
When will these features be implemented?
These new features are expected to go live with the release of Apple’s new iOS 15, iPadOS 15, macOS Monterey, and watchOS later this year, alongside the upcoming iPhone 13 and MacBook Pros.