
Apple scanning iPhones for child sex abuse images

Tech giant Apple has announced details of a scanning system to detect child sexual abuse material (CSAM) on US customers’ iPhones and other devices.

Apple scanning iPhones for images of abuse

Before images are stored in iCloud Photos, the technology checks them against known CSAM. If the system detects a match, a human reviewer assesses it before the user is reported to law enforcement.

However, there are concerns over the privacy implications of doing this. Experts fear authoritarian governments could use the technology to spy on their citizens, and that the system could be extended to scan phones for other types of prohibited content or political material.

Apple are to release new versions of iOS for their devices later this year, which will include “new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy”.

The US National Center for Missing and Exploited Children (NCMEC), along with other child safety organisations, maintains a database of known child sexual abuse images. Each image is translated into a numerical code, or “hash”. Apple’s system works by comparing the hashes of images on Apple devices against those in this database, looking for matches.

According to Apple, the technology can also identify edited but similar versions of original pictures.
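The matching idea can be sketched in a few lines of Python. The example below is purely illustrative: it uses a toy “average hash” rather than the perceptual hashing technology Apple actually deploys, and the thumbnail data and distance threshold are invented for demonstration. The contrast with an exact cryptographic hash shows why a perceptual approach is needed to catch “edited but similar” copies.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming_distance(a, b):
    """Count the bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical 4x4 grayscale thumbnails: 'edited' is a lightly
# modified copy of 'original' (one corner pixel brightened).
original = [[ 10,  20, 200, 210],
            [ 15,  25, 205, 215],
            [190, 195,  30,  35],
            [200, 210,  20,  25]]
edited = [row[:] for row in original]
edited[0][0] = 40

# A stand-in for a database of known image hashes:
known_hashes = {average_hash(original)}

# An exact cryptographic hash misses the edited copy entirely...
print(hashlib.sha256(bytes(sum(edited, []))).hexdigest()
      == hashlib.sha256(bytes(sum(original, []))).hexdigest())  # False

# ...but the perceptual hash stays close, so a small Hamming
# distance flags it as a likely match (threshold of 5 is invented).
distance = min(hamming_distance(average_hash(edited), h) for h in known_hashes)
print("match" if distance <= 5 else "no match")  # match
```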

Privacy issues

Apple claim the system has an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”. Every report of CSAM is manually reviewed to confirm the match. Once a match is confirmed, Apple can take further steps to disable the user’s account and involve law enforcement.

The firm insist that, compared to existing techniques, the new technology offers “significant” benefits with regard to privacy, because Apple only learns about users’ pictures if their iCloud Photos account contains known CSAM.

However, privacy experts still have concerns about the system. They say Apple are encouraging the view that it is acceptable to scan the content on people’s phones.

Security researcher Matthew Green says: “Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content”. He believes it doesn’t matter whether Apple “turn out to be right or wrong” in what they are doing; by creating the system at all, they open the door for governments to “demand it from everyone”.


