Early last month, Apple announced a set of new child safety features for all its software platforms. One of them was designed to scan iPhone and iPad photos for Child Sexual Abuse Material (CSAM). Some security experts raised concerns about this feature, arguing that governments could potentially use it to access citizens' data. Although Apple initially claimed that its method of detecting known CSAM was not a threat to user privacy, the company has now postponed the feature's launch.

In a recent statement (via 9to5Mac), Apple said:

"Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple previously planned to release the new child safety features as part of the iOS 15, iPadOS 15, and macOS Monterey updates. However, those updates won't include the features when they roll out later this year. At the moment, Apple hasn't shared a tentative timeline for the rollout, nor has it provided any details about the changes it plans to make to address the privacy concerns.

In simple terms, Apple's CSAM scanning feature, in its current state, matches hashes of photos on your iPhone or iPad against a database of hashes of known CSAM images provided by child safety organizations. If it detects a sufficient number of matches on a device, it can alert a team of human reviewers, who can contact law enforcement after verifying the material. Security researchers argue that while the system is currently limited to detecting known CSAM, it could be adapted to scan for other imagery or text, making it a valuable tool for authoritarian governments.
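To give a rough sense of what "matching against a hash database" means, here is a minimal, hypothetical sketch in Swift. It is not Apple's implementation: Apple's published design uses a perceptual "NeuralHash" (which tolerates resizing and re-encoding) plus cryptographic protections such as private set intersection and a match threshold, whereas this sketch substitutes a plain SHA-256 digest and a simple set lookup just to keep the example self-contained. The hash value shown in the database is a placeholder.

```swift
import Foundation
import CryptoKit

// Hypothetical database of known image fingerprints (hex-encoded).
// Apple's real system distributes perceptual NeuralHash values supplied
// by child safety organizations; a SHA-256 digest is only a stand-in here.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b" // placeholder
]

// Compute a fingerprint for a photo's raw bytes.
func fingerprint(of imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// On-device check: does this photo match any known fingerprint?
// In Apple's design a match is never revealed directly on the device;
// it is wrapped in an encrypted "safety voucher" that only becomes
// readable to human reviewers after a threshold of matches is exceeded.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    knownHashes.contains(fingerprint(of: imageData))
}

let photo = Data("example image bytes".utf8)
print(matchesKnownDatabase(photo)) // prints "false" for this placeholder data
```

The researchers' concern follows directly from this structure: the matching logic doesn't care what the database contains, so whoever controls the list of hashes controls what the system flags.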