Last Updated on September 6, 2021 by Calvin C.
Apple yielded to pressure from groups calling for a review of a scheduled, controversial feature that scans users’ devices for Child Sexual Abuse Material (CSAM). Critics warned that the tool would be dangerous in the wrong hands and would compromise user privacy. In response, Apple has delayed its plans to scan devices, and new dates for rolling out the feature are yet to be determined.
The company posted the following statement on its website:
“Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material.
Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
There are no details of what has been reviewed, but the tech giant clearly wants to create a win-win situation.
Instead of only addressing the issue of CSAM, the goal is to ensure users aren’t left vulnerable to cybercriminals who could exploit the tool for malicious activities.
Initially, Apple announced that the feature would launch with iOS 15 and macOS Monterey in late 2021, but that is no longer the case.
In case you missed the first announcement of the feature, you can find the full details in this article, but a summary follows.
Back in August 2021, Apple announced new features that would help limit the spread of CSAM through its devices.
These measures include scanning images stored on iCloud, scanning the Messages app, and enabling Siri and Search to flag any CSAM-related requests.
NeuralHash technology is used to scan images on users’ iPhones, iPads, or Macs and match them against a vast database of CSAM images.
The database is updated and maintained by the National Center for Missing & Exploited Children (NCMEC).
If the number of matches exceeds a set threshold, the user’s iCloud account is disabled, a manual review is triggered, and the user is reported to the authorities.
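To make the threshold mechanism described above concrete, here is a minimal, purely illustrative sketch in Python. NeuralHash is a proprietary perceptual hash, so a plain SHA-256 digest stands in for it here, and every name in this snippet (`fingerprint`, `scan_library`, `MATCH_THRESHOLD`) is invented for illustration; this is not Apple’s actual implementation.

```python
import hashlib

# Number of database matches required before an account is flagged.
# Apple has not published its real threshold; 3 is an arbitrary stand-in.
MATCH_THRESHOLD = 3

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image (real systems use
    hashes that tolerate resizing/re-encoding; SHA-256 does not)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: list, known_hashes: set) -> bool:
    """Count how many images match the known-hash database and return
    True once the count crosses the threshold, which in the scheme
    described above would trigger a manual review."""
    matches = sum(1 for img in images if fingerprint(img) in known_hashes)
    return matches >= MATCH_THRESHOLD
```

The point of the threshold is that a single accidental match never flags an account; only an accumulation of matches does:

```python
known = {fingerprint(b"known-image-%d" % i) for i in range(5)}
scan_library([b"known-image-0", b"holiday-photo"], known)  # below threshold: False
```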
Why monitor users’ Apple devices?
Government agencies have been pushing for access to user data on smart devices because threats of child pornography and terrorism continue to rise.
However, this request clashes with the need for online privacy and unfortunately, cybercriminals go undetected as security features like encryption hide their activities.
There is need to reach a compromise so that children can be protected while at the same time preserving personal privacy.
Some organizations, like the International Coalition, have taken a firm stance against the feature altogether, no matter how Apple tries to repackage it.
In an open letter, this coalition of 90+ civil society organizations highlighted the following:
“Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable. Those images may be of human rights abuses, political protests, images companies have tagged as “terrorist” or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them. And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance, and persecution on a global basis.”
Read the full letter here.
In my opinion, the International Coalition raised valid points, and Apple must find other ways to achieve its objectives without opening glaring backdoors.
What do you think about this new development? Leave comments below and share the article with your friends on social media.