WhatsApp attacks Apple’s child safety “surveillance” despite government praise


WhatsApp has condemned Apple’s new child safety tool as a “very worrying . . . surveillance system”, even as governments around the world cheered the company’s decision to actively scan for illegal images of child sexual abuse.

The stand-off sets up a battle between technology platforms and officials who are calling on them to adopt similar tools.

An Indian government official told the Financial Times on Friday that it “welcomed” Apple’s new technology, which sets a benchmark for “other technology companies”, while an EU official said the group had designed a “quite elegant solution”.

US Senator Richard Blumenthal called Apple’s new system an “innovative and bold step”.

“It is time for others, especially Facebook, to follow their example,” tweeted Sajid Javid, Britain’s health secretary and former home secretary.

However, Apple’s Silicon Valley rivals are said to be “incandescent” about the system, which scans photos on US users’ iPhones before they are uploaded to iCloud and will launch as part of the next version of iOS.

“This approach introduces something very concerning into the world,” said Will Cathcart, head of WhatsApp. “This is a surveillance system, built and operated by Apple, that could very easily be used to scan private content for anything they or a government decides it wants to control. It’s troubling to see them act without engaging experts.”

“We will not adopt it in WhatsApp,” he added.

The enthusiastic response from legislators will only exacerbate concerns raised by the security and privacy community that Apple has set a dangerous precedent that could be exploited by authoritarian regimes or overzealous law enforcement agencies.

Apps including Facebook’s WhatsApp, Telegram and Signal, as well as Google, which owns the Android operating system, are now being urged to copy Apple’s model.

“To say that we are disappointed by Apple’s plans is an understatement,” India McKinney and Erica Portnoy of the digital rights group the Electronic Frontier Foundation said in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it is a shocking about-face for users who have relied on the company’s leadership in privacy and security.”

In recent months, technology companies around the world have faced increasing political pressure from governments seeking access to encrypted content, including messages, photos and videos.

The government of Indian Prime Minister Narendra Modi recently introduced rules forcing technology platforms such as WhatsApp to trace the origin of illegal messages, effectively breaking end-to-end encryption. WhatsApp is currently suing the government in an attempt to block the new rules.

Last October, in an open letter signed by the “Five Eyes” countries plus Japan and India, officials including British Home Secretary Priti Patel and then US Attorney-General William Barr said they “urge industry to address our serious concerns where encryption is applied in a way that wholly precludes any legal access to content”.

They cited child abuse as one of the reasons they believe technology companies should develop alternative ways to give authorities access to device content, saying “there is increasing consensus across governments and international institutions that action must be taken”.

Critics questioned Apple’s promise to limit itself to scanning for images of child abuse. “I hate slippery-slope arguments, but I’m looking at the slope. Governments around the world are covering it in oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis, a Canadian cryptography researcher and executive director of the NGO Open Privacy.

Although no US legislation compels Apple to search for this material, its move comes as the UK and the EU prepare new legislation, the Online Safety Bill and the Digital Services Act, that will give technology companies greater responsibility for limiting the spread of child sexual abuse imagery and other forms of harmful content.

Apple’s decision to press ahead with its own independent system, rather than joining cross-industry negotiations with regulators around the world, has angered its Silicon Valley neighbours, particularly after they united in support of the company during its 2016 legal battle with the FBI over access to a suspected terrorist’s iPhone.

“Some of the reaction I’ve heard from other Apple competitors is that they are incandescent,” said Matthew Green, a security professor at Johns Hopkins University, in an online video discussion with Stanford University researchers on Thursday.

Alex Stamos, Facebook’s former security chief and now director of the Stanford Internet Observatory, said in the same discussion that Apple “doesn’t care that everyone is trying to strike this delicate international balance”. “Obviously, WhatsApp is going to face immediate pressure,” he added.

On Thursday, an Apple executive acknowledged in an internal memo that the company’s move had caused an uproar. “We’ve seen many positive responses today,” Sebastien Marineau wrote in a memo obtained by the Apple blog 9to5Mac. “We know some people have misunderstandings, and many are worried about the implications, but we will continue to explain and detail the features so people understand what we’ve built.”

Facebook and Google have yet to comment publicly on Apple’s statement.

Apple has previously been criticised in some quarters for not doing more to prevent the spread of abusive material, especially on iMessage. Because the iPhone’s messaging app is end-to-end encrypted, the company cannot see any photos or videos exchanged between users.

Messages exchanged between two senior Apple engineers, which emerged as evidence in the iPhone maker’s recent legal dispute with Epic Games, indicate that some people within the company believed it could do more.

In the exchanges from early last year, first exposed by the Tech Transparency Project, Eric Friedman, head of Apple’s fraud engineering algorithms and risk unit, said that, compared with Facebook, “we are the greatest platform for distributing child pornography”.

Friedman added that “we have chosen to not know in enough places where we really cannot say” how much child sexual abuse material might exist.

Additional reporting by Stephanie Findlay in Delhi, Valentina Pop in Brussels and Hannah Murphy in San Francisco
