Apple is delaying the rollout of its Child Sexual Abuse Material (CSAM) detection system after the backlash the plan received. The company said the feature needs further refinement and that it will gather more input from customers, advocacy groups, and other concerned parties before moving forward.
Last month, we announced plans to introduce features to help protect children from predators who use communication tools to recruit and exploit them, and to limit the spread of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to gather input and make improvements before we release these critically important features.
Apple

Apple previously stated that the system had been in development for years and is not intended to give governments a tool for monitoring citizens. The company also said that users in Russia and other countries have no cause for concern for now, because the system will initially be available only in the United States and only when iCloud Photos is enabled.
That said, many security experts warned that Apple’s new tool could be used for surveillance, putting the personal information of millions of people at risk.
The feature was originally expected to arrive with the final release of iOS 15 in September; no new launch date has been announced.
