
Apple To Scan iPhones For Child Abuse Sex Images


Last Updated on August 7, 2021 by Calvin C.

It’s bad news for perverts as Apple has decided to scan iPhones for child abuse sex images. This means if you have such content on your device, you risk facing the wrath of the law, as any positive hits are relayed straight to the National Center for Missing & Exploited Children (NCMEC).

This feature is also available on other Apple devices, namely Mac, iPad and Apple Watch, to curb the spread of Child Sexual Abuse Material (CSAM) in the U.S.

All the photos you upload to iCloud Photos are scanned before they leave your device. In addition, a related feature adds safeguards to devices used by children under 13.

This means even photos sent or received in iMessage on a child’s device are scanned, although that check relies on on-device analysis rather than the known CSAM hash database.

What about Siri? If you try to search for CSAM-related material, Siri will warn you that such content is harmful and point you toward resources for getting help.


How CSAM-related material is detected

The technology is called NeuralHash and involves matching images on your device against a database of known CSAM image hashes that is stored on the device itself.

This database is provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.

In an announcement, Apple said, “Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations.

Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”

On-device matching takes place automatically, and the system uses a cryptographic technique called private set intersection, which determines whether there is a match without revealing the result to the device.

In its technical summary, Apple highlights that the technology is extremely accurate, claiming less than a one-in-one-trillion chance per year of incorrectly flagging a given account, and that every report is manually reviewed to weed out false positives.
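To make the mechanics more concrete, here is a minimal Python sketch of the general idea: hash each photo on the device, count matches against a known-hash database, and only flag an account once a threshold is crossed. This is an illustration only, not Apple’s implementation: the SHA-256 stand-in for NeuralHash, the placeholder database, and the MATCH_THRESHOLD value are all assumptions, and the sketch leaves out the blinded database and the private set intersection protocol that Apple describes.

```python
# Simplified sketch of on-device hash matching with a reporting threshold.
# NOT Apple's NeuralHash or private set intersection protocol; the hash
# function, database contents and threshold below are placeholders.

import hashlib
from typing import Iterable

# Placeholder for the known CSAM hash database. In Apple's design this is
# provided by NCMEC and stored on the device as an unreadable set of hashes.
KNOWN_HASHES = {
    hashlib.sha256(b"placeholder-known-image").hexdigest(),
}

# Hypothetical threshold: nothing is reported until this many photos match.
MATCH_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash such as NeuralHash.

    A real perceptual hash maps visually similar images to the same value;
    SHA-256 is used here only to keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photos: Iterable[bytes]) -> int:
    """Count how many of the user's photos match the known-hash database."""
    return sum(1 for photo in photos if image_hash(photo) in KNOWN_HASHES)


def should_flag_for_review(photos: Iterable[bytes]) -> bool:
    """Only accounts that exceed the threshold would ever reach human review."""
    return count_matches(photos) >= MATCH_THRESHOLD
```

In Apple’s actual design the device never learns whether any individual photo matched; encrypted “safety vouchers” are uploaded alongside photos, and the threshold check happens on Apple’s servers, which can only read the vouchers once the threshold is exceeded.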

Privacy concerns

While the intent behind this technology is noble, researchers have raised concerns because a lot can go wrong.

The feature could be abused for malicious purposes, from framing high-ranking individuals for possession of child p0rn to creating a vulnerability that cybercriminals could exploit.

In a series of tweets, Johns Hopkins cryptographer Matthew Green said, “For the past decade, providers like Apple, WhatsApp/Facebook, Snapchat, and others have been adding end-to-end encryption to their text messaging and video services. This has been a huge boon for privacy. But governments have been opposed to it.”

He goes on to write, “These are bad things. I don’t particularly want to be on the side of child p0rn and I’m not a terrorist. But the problem is that encryption is a powerful tool that provides privacy, and you can’t really have strong privacy while also surveilling every image anyone sends.”

This means the tug-of-war between privacy experts and authorities who want to police the digital platforms is unlikely to go away any time soon.

Conclusion

Apple has finally buckled under pressure from governments to create a backdoor into its encryption systems.

Previously, the tech giant was hailed for standing firm on protecting the privacy and security of its users.

While I’m not against this new feature, which will roll out with iOS 15, on-device surveillance always raises eyebrows.

The Center for Democracy and Technology also issued a statement on the matter, which reads as follows:

“The mechanism that will enable Apple to scan images in Messages is not an alternative to a backdoor—it is a backdoor. Client-side scanning on one “end” of the communication breaks the security of the transmission, and informing a third party (the parent) about the content of the communication undermines its privacy.

Organizations around the world have cautioned against client-side scanning because it could be used as a way for governments and companies to police the content of private communications.”

What are your thoughts on the subject? Leave your comments below and share the article with your friends on social media.

For added privacy, make sure you use a VPN. You can check out our recommendations for iOS and for macOS. Got no time? Get NordVPN.


