Apple takes a step toward opening the back door


Technology sector update

Child sexual abuse is abominable. But for many smartphone users, the idea that governments or police can read content on a device that is almost an extension of themselves is also offensive. The need to solve crime and the need to protect privacy collide in Apple's decision to scan US iPhones for images of child abuse. Activists fighting for better child protection will celebrate. But the move sets an important precedent.

Apple has long refused to insert a "back door", meaning code that would allow law enforcement to access its devices. It twice rejected FBI requests to help unlock phones, after the shootings in San Bernardino, California in 2015 and in Florida in 2019, although Apple says it did provide data including iCloud backups. As encryption has become key to many products and services, Facebook and other technology groups have also opposed allowing "special access".

Encrypted devices and messaging are a boon to organized criminals, terrorists, and child abusers. But large technology companies and privacy advocates argue that creating any kind of backdoor opens the way for hackers, cybercriminals, or unscrupulous governments to abuse it.

Apple's "neuralMatch" is not, strictly speaking, a backdoor in the sense of providing direct access to content through the operating system. Moreover, Apple has long decrypted photos on its iCloud servers when required by law enforcement. The precedent is that its technology will now actively screen images on the iPhone itself, breaking the fence around its devices, looking for matches against a US database of known child abuse imagery. When photos are uploaded to iCloud, matches will be flagged, reviewed by human auditors, and passed to law enforcement after verification.
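The flagging flow described above can be sketched in a few lines. This is a minimal illustration only: it uses an exact SHA-256 hash and a hypothetical in-memory database, whereas Apple's actual system uses NeuralHash, a perceptual hash robust to resizing and re-encoding, combined with cryptographic threshold techniques so no single match is visible to Apple.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: plain SHA-256 of the raw bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_for_review(image_bytes: bytes, known_hashes: set) -> bool:
    # On-device check before upload: if the image's hash matches the
    # database of known abuse imagery, queue it for human review.
    return image_hash(image_bytes) in known_hashes

# Hypothetical database built from known images.
known = {image_hash(b"known-bad-image")}

print(flag_for_review(b"known-bad-image", known))   # matching image is flagged
print(flag_for_review(b"holiday-photo", known))     # ordinary photo is not
```

The key design point the article raises is visible even in this toy version: the match happens on the device, against whatever database the operator supplies, which is why critics worry the same mechanism could screen for other content.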

Privacy activists have warned that by allowing this kind of pattern matching on encrypted photos on the iPhone, Apple is opening itself, and others, to government pressure to do the same for other types of content, such as images of political protests. The company can refuse, but may then face legislative mandates. The apparent deployment of Pegasus spyware, made by Israel's NSO Group, against thousands of targets shows how willing many governments are to exploit such mechanisms.

Apple may hope that by cooperating with US authorities to crack down on one of the most vile uses of digital encryption, it can head off legislation that would force it to go further. The US, UK, Australia, New Zealand and Canada have called on technology companies to include mechanisms that enable governments, with appropriate legal authority, to access data. The danger is that Apple will only whet their appetite. Some competitors are privately angry, believing the Cupertino-based company has broken ranks and conceded an important principle.

However, some in the security community speculate that Apple may be preparing to introduce encryption protections around data on iCloud that do not currently exist. By helping to find child abuse material, Apple may be able to justify encrypting other data stored on iCloud, even though doing so would reduce the access law enforcement currently enjoys. That could provide welcome additional protection, for example to dissidents in Hong Kong. In recent years, Apple and other foreign groups have been forced to store the data of Chinese users in domestic data centers.

Cooperation between tech giants and law enforcement is essential to legitimate efforts to fight crime and maintain security, but "back doors" are full of dangers. Not only Apple's users but billions of phone users around the world will hope this move does not prove to be the thin end of a larger wedge.
