“A very bad idea”: Apple outrages the public with a new tool that can search for more than child porn

According to the Financial Times, security experts have warned that Apple’s new tool, announced yesterday, could be used for surveillance, putting the personal information of millions of people at risk.

Their concerns are based on information Apple shared with some US academics earlier this week. Two security experts who attended Apple’s briefing confirmed that a proposed system, called neuralMatch, will alert Apple if it detects child sexual abuse material (CSAM) on an iPhone or iPad. Apple would then contact law enforcement to verify the information.

While security researchers support Apple’s efforts to curb the proliferation of CSAM, some have expressed concern about the potential for this tool to be misused by governments to gain access to their citizens’ data. Ross Anderson, professor of security engineering at the University of Cambridge, said: “This is an absolutely terrible idea because it will lead to massive distributed tracking of our phones and laptops.”


Matthew Green, professor of computer science at the Johns Hopkins Information Security Institute, also expressed concern on Twitter, writing: “But even if you believe Apple will not allow these tools to be misused, there is still a lot to worry about. These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, cannot review. The hashes use a new proprietary neural hashing algorithm developed by Apple that has been approved by NCMEC for use. We don’t know much about this algorithm.”
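Green’s point about the opacity of the matching pipeline is easier to see with a toy model. The Swift sketch below is purely illustrative: it stands in an ordinary SHA-256 digest for Apple’s proprietary NeuralHash, and the `MediaScanner` class, the reporting threshold, and every hash value are made-up placeholders rather than Apple’s actual implementation.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: a device-side scanner that compares photo digests
// against an opaque blocklist the user cannot inspect.

struct MatchReport {
    let photoID: String
    let matchedHash: String
}

final class MediaScanner {
    // Opaque set of digests supplied by the operator; the device owner never
    // learns what images they correspond to.
    private let blockedHashes: Set<String>
    private let reportingThreshold: Int

    init(blockedHashes: Set<String>, reportingThreshold: Int = 1) {
        self.blockedHashes = blockedHashes
        self.reportingThreshold = reportingThreshold
    }

    // Stand-in for a perceptual hash such as NeuralHash. A real perceptual hash
    // maps visually similar images to the same value; SHA-256 is used here only
    // to keep the example self-contained.
    private func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
    }

    // Scan a batch of photos and return reports only once enough matches accumulate.
    func scan(photos: [(id: String, data: Data)]) -> [MatchReport] {
        let matches = photos.compactMap { photo -> MatchReport? in
            let digest = hash(of: photo.data)
            return blockedHashes.contains(digest)
                ? MatchReport(photoID: photo.id, matchedHash: digest)
                : nil
        }
        return matches.count >= reportingThreshold ? matches : []
    }
}
```

The essential property Green describes survives even in this simplification: all the device owner can ever see is a list of opaque digest strings, and nothing in the matching code reveals what content they represent.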

Although the algorithm is currently trained to detect CSAM, it could be adapted to scan for other targeted images or text, such as anti-government signs, which would make it an extremely useful tool for authoritarian governments. Apple’s precedent could also push other tech giants to offer similar features.
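That adaptability follows directly from the design: the matching code is indifferent to what its hash database represents, so retargeting it amounts to shipping a different set of opaque digests. Continuing the illustrative `MediaScanner` sketch above (every hash below is a fabricated placeholder):

```swift
// Same scanning code, different opaque database: nothing on the device can
// tell whether the digests describe abuse imagery or political posters.
let abuseImageHashes: Set<String> = ["placeholder-digest-1", "placeholder-digest-2"]
let protestPosterHashes: Set<String> = ["placeholder-digest-3", "placeholder-digest-4"]

let scannerA = MediaScanner(blockedHashes: abuseImageHashes)
let scannerB = MediaScanner(blockedHashes: protestPosterHashes) // identical code, new target
```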
