iOS Comes with a Backdoor, but It May Be for a Good Reason

The iOS 15.2 update could change the iPhone experience forever.


The upcoming iOS 15.2 update will be the most radical iPhone update in Apple's history: radical not only because of its new features and services, but also because of a new approach to the privacy of iMessage users.

Apple's child safety plan consists of two separate updates. The first scans photos on the iPhone before they are synced with iCloud: using artificial intelligence (AI), images are matched against databases of known child sexual abuse material (CSAM). The second lets parents enable Apple's AI algorithms on their kids' iPhones so that they are alerted if a child receives or sends explicit images via iMessage.
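To make the first mechanism more concrete, here is a minimal, hypothetical sketch of what client-side matching against a database of known-image fingerprints could look like. This is not Apple's NeuralHash or CSAM-detection code: the KnownHashDatabase type, the fingerprint(of:) helper, and the use of a plain SHA-256 digest as a stand-in for a perceptual hash are all illustrative assumptions.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a real system would use a perceptual hash
// (Apple calls its version NeuralHash) so that resized or re-encoded
// copies of an image still match. A SHA-256 digest is used here purely
// to keep the example self-contained and runnable.

struct KnownHashDatabase {
    /// Fingerprints of known abusive images (placeholder values).
    let knownFingerprints: Set<String>

    func matches(_ fingerprint: String) -> Bool {
        knownFingerprints.contains(fingerprint)
    }
}

/// Hypothetical stand-in for a perceptual hash of the image bytes.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Decide, on the device, whether a photo should be flagged before
/// it is uploaded to cloud storage.
func shouldFlagBeforeUpload(_ imageData: Data,
                            against database: KnownHashDatabase) -> Bool {
    database.matches(fingerprint(of: imageData))
}

// Usage with placeholder data:
let database = KnownHashDatabase(knownFingerprints: ["placeholder-fingerprint"])
let photoBytes = Data([0x01, 0x02, 0x03])
print(shouldFlagBeforeUpload(photoBytes, against: database)) // false
```

Apple's actual design layers cryptographic machinery such as private set intersection and threshold secret sharing on top of the matching step so that non-matching images reveal nothing; this sketch does not attempt to model any of that.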

The company's previously announced system for scanning on-device photos for CSAM raised concerns among human rights advocates, who warned that it would open the door to spying on users. In response, Apple decided to postpone the rollout of CSAM detection to users' devices for a while. While there is no sign of iPhone photo scanning at this point, the iOS 15.2 developer beta does include a lightweight update for iMessage. Lightweight because the company appears to have backtracked on its original plan to notify parents of children under 13 when a flagged photo is viewed.

Apple made no announcement of new iMessage features, yet the update is there in the iOS 15.2 beta. In itself, the scanning is little different from the photo categorization the iPhone already performs with AI algorithms. The critical issue, however, is that iMessage is end-to-end encrypted, and the update essentially adds Apple's AI monitoring inside the platform. Yes, this initial use case is very limited, but the technical barrier to wider monitoring has now been removed. This fundamentally changes iMessage, and there will be no turning back.
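To illustrate the concern, here is a hedged sketch of what scanning inside an end-to-end encrypted messenger amounts to: the classifier runs on the device after decryption, so the encryption protocol itself is untouched while an observer is added at the endpoint. The NudityClassifier protocol and decideDisplay function below are assumed, illustrative names, not Apple's APIs.

```swift
import Foundation

// Illustrative sketch only: `NudityClassifier` is an assumed protocol,
// not a public Apple API. The architectural point is that the image is
// decrypted exactly as before, and the scan happens locally afterwards,
// so the encryption is not broken but a new observer sits at the endpoint.

protocol NudityClassifier {
    /// Returns a score in [0, 1]: how likely the image is explicit.
    func explicitScore(for imageData: Data) -> Double
}

enum DisplayDecision {
    case showNormally
    case blurAndWarn   // the child sees a warning before viewing
}

/// Runs after iMessage has already decrypted the attachment.
func decideDisplay(decryptedImage: Data,
                   classifier: NudityClassifier,
                   threshold: Double = 0.9) -> DisplayDecision {
    classifier.explicitScore(for: decryptedImage) >= threshold
        ? .blurAndWarn
        : .showNormally
}
```

Once such a hook exists, widening its scope becomes a policy decision rather than an engineering one, which is precisely the critics' argument.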

As experts from the Electronic Frontier Foundation (EFF) warned earlier, "It is impossible to create a client-side scanning system that can only be used to detect explicit images sent or received by children [...] Even with the best of intentions, building such a system would break the messenger's key encryption guarantees and pave the way for further abuse."

"Even a fully documented, elaborate and highly targeted backdoor is still a backdoor," the EFF said.
