The warning issued yesterday by a security expert has been confirmed: Apple has introduced new child safety features. Among other measures, these include a function that scans photo libraries on iOS and iPadOS.

According to Apple, iOS and iPadOS will use new cryptographic techniques to limit the spread of child sexual abuse material (CSAM) online. The company promises to preserve user privacy. If illegal content is detected, Apple can provide law enforcement with information about CSAM collections in iCloud Photos.
Apple explains that new technology in iOS and iPadOS will allow it to detect CSAM images stored in iCloud Photos that are already known from law enforcement databases, so that Apple can report these cases to the National Center for Missing and Exploited Children (NCMEC). To do this, the system performs photo matching on the device against a database of known CSAM image hashes provided by NCMEC and other child safety organizations.
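The on-device matching step can be pictured as a lookup of an image's hash against a set of known hashes. The sketch below is purely illustrative: Apple actually uses a perceptual hash ("NeuralHash") combined with cryptographic private set intersection so that neither side learns non-matches, not a plain SHA-256 lookup. The database entry here is just the SHA-256 of the sample bytes, chosen so the demo matches.

```python
import hashlib

# Illustrative stand-in database: in Apple's system the hashes are perceptual
# NeuralHash values supplied by NCMEC and partner organizations, not SHA-256.
KNOWN_HASHES = {
    # sha256(b"example-image-bytes"), precomputed below for the demo
    hashlib.sha256(b"example-image-bytes").hexdigest(),
}

def image_hash(data: bytes) -> str:
    """Digest of the raw image bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(data: bytes) -> bool:
    """True if the image's hash appears in the known-CSAM hash set."""
    return image_hash(data) in KNOWN_HASHES

print(matches_known_database(b"example-image-bytes"))  # → True
print(matches_known_database(b"some-other-photo"))     # → False
```

A real perceptual hash would also match slightly altered copies of a known image, which a cryptographic hash like SHA-256 deliberately does not.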
Second, the Messages app will use on-device machine learning to warn about sensitive content. New tools in Messages will alert children and their parents when they receive or send sexually explicit photos.
When such content is received, the photo will be blurred and the child will see a warning along with links to resources that can help in such a situation. The child will also be told that if they view the photo, their parents will be notified. Similar steps apply if a child attempts to send sexually explicit photos: the child will be warned before sending, and parents will be notified if the photo is sent.
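The Messages behavior described above amounts to a small decision flow. The following is a hypothetical sketch of that flow; the function name, action labels, and return values are invented for illustration, since Apple's actual on-device logic is not public:

```python
def message_photo_actions(direction: str, is_explicit: bool, child_account: bool) -> list[str]:
    """Sketch of the described Messages flow.

    direction: "receive" or "send".
    Returns the list of actions taken, per the article's description.
    """
    if not (child_account and is_explicit):
        # Adults, or non-explicit photos: nothing special happens.
        return ["deliver_normally"]
    if direction == "receive":
        # Incoming explicit photo: blur it, warn the child with resource
        # links, and note that parents are notified if the child views it.
        return ["blur_photo", "warn_child_with_resources", "notify_parents_if_viewed"]
    # Outgoing explicit photo: warn before sending, notify parents if sent.
    return ["warn_child_before_sending", "notify_parents_if_sent"]

print(message_photo_actions("receive", True, True))
```

The key design point the article describes is that this classification runs entirely on the device, so message content is not sent to Apple for this feature.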

Finally, Siri and Search will be updated to give parents and children expanded information and help in unsafe situations. Siri and Search will also intervene when a user tries to search for topics related to child sexual abuse material.
All of these features will arrive in updates following the release of iOS 15, iPadOS 15, watchOS 8, and macOS Monterey, before the end of the year. For now, the measures apply only to users in the United States, but over time they will be rolled out to other regions.
Security expert Matthew Green warned of Apple’s plans yesterday. The goal of protecting children is noble in itself, but Green is seriously concerned about where this could lead society, and called the new measures “a really very bad idea.”

Donald-43Westbrook is a contributor at worldstockmarket.