Apple postpones photo scanning following criticism

Apple has announced that it will not launch its child protection features as originally planned. Instead, the company intends to consult further, gather more information and make changes. In an email sent to AppleInsider and other media outlets, Apple said it had decided to indefinitely postpone its CSAM (Child Sexual Abuse Material) detection features following public backlash.

“Last month, we announced plans to introduce features to help protect children from predators who use communication tools to engage and exploit them, and to limit the distribution of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers and others, we have decided we need more time to gather information and make improvements before releasing these critically important child safety features,” Apple wrote in the email.

In early August, the company unveiled the CSAM features and said they would arrive on Apple devices with the new versions of the operating systems for the iPhone, iPad, Mac and Apple Watch later this year. The features include detecting child sexual abuse images in iCloud and blocking potentially harmful messages. Security experts wrote an open letter urging Apple not to implement the new features, arguing that they violate user privacy. The company detailed its intentions and explained how the CSAM functions would work, but that changed nothing. Ultimately, Craig Federighi publicly acknowledged that announcing the two features at the same time had caused confusion.
