Watchdogs Sound Alarm After Apple Reveals Plan To Upload Software To iPhones That Scans Users’ Photos

Privacy watchdog groups sounded the alarm late Thursday evening after tech giant Apple revealed that the company will be uploading software to users’ iPhones that scans for images of child sex abuse, warning that the move creates a backdoor into users’ private lives, that it essentially opens Pandora’s box, and that it will be used by governments.

“Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices,” The Financial Times reported. “The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.”

The report noted that the system will be called “neuralMatch” and will initially be rolled out only in the U.S., with Apple adding in a blog post that the software will “evolve and expand over time.” The software is expected to be included in iOS 15, which is set to be released next month.

The company claimed that the software provides “significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM [child sexual abuse material] in their iCloud Photos account.”

Despite Apple’s claims, however, academics and privacy watchdogs are deeply concerned about what the move signals for the long term.

Ross Anderson, professor of security engineering at the University of Cambridge, said: “It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops.”

The New York Times explained how the purported technology will work:

The iPhone operating system will soon store a database of hashes of known child sexual abuse material provided by organizations like the National Center for Missing & Exploited Children, and it will run those hashes against the hashes of each photo in a user’s iCloud to see if there is a match.

Once there are a certain number of matches, the photos will be shown to an Apple employee to ensure they are indeed images of child sexual abuse. If so, they will be forwarded to the National Center for Missing & Exploited Children, and the user’s iCloud account will be locked. Apple said this approach meant that people without child sexual abuse material on their phones would not have their photos seen by Apple or the authorities.
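At its core, the matching step described above amounts to checking each photo’s fingerprint against a database of known hashes and escalating an account for human review only once a certain number of matches is reached. The Swift sketch below illustrates that flow in simplified form; the hash values, the matchThreshold constant, and the function names are illustrative stand-ins, not Apple’s actual NeuralHash system.

```swift
import Foundation

// Illustrative stand-in for a perceptual hash of a photo.
typealias PhotoHash = String

// Hypothetical database of known hashes, provided by an organization
// such as the National Center for Missing & Exploited Children.
let knownHashes: Set<PhotoHash> = ["a1b2c3", "d4e5f6", "0789ab"]

// Hypothetical threshold: fewer matches than this triggers no action.
let matchThreshold = 2

/// Counts how many of a user's photo hashes appear in the known-hash database.
func countMatches(userPhotoHashes: [PhotoHash]) -> Int {
    userPhotoHashes.filter { knownHashes.contains($0) }.count
}

/// Decides whether an account should be escalated to human review,
/// mirroring the "certain number of matches" step described above.
func shouldEscalateForReview(userPhotoHashes: [PhotoHash]) -> Bool {
    countMatches(userPhotoHashes: userPhotoHashes) >= matchThreshold
}

// Example: a library with only one matching hash stays below the threshold.
let sampleLibrary: [PhotoHash] = ["zzzzzz", "a1b2c3", "ffffff"]
print(shouldEscalateForReview(userPhotoHashes: sampleLibrary)) // false
```

In this simplified model, the threshold is what underlies Apple’s claim that accounts without a collection of known material never reach a human reviewer; the real system reportedly performs the hash comparison on the device itself rather than on a server.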

“If you’re storing a collection of [child sexual abuse material], yes, this is bad for you,” said Erik Neuenschwander, Apple’s privacy chief. “But for the rest of you, this is no different.”

“No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this,” Edward Snowden tweeted. “Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.*”
