After a week of criticism over its new child sexual abuse material (CSAM) detection system, Apple said on Friday that it would hunt only for images that have been flagged by clearinghouses in multiple countries.
Apple initially declined to say how many matching images must be found on a phone or computer before the operating system alerts Apple for human review and possible reporting to the authorities. On Friday, it confirmed that the threshold is initially set at 30 images, a number it said may be lowered as the system improves.
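The threshold behavior described above can be sketched in a few lines. This is a hypothetical illustration only: Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic protocols such as private set intersection and threshold secret sharing, not the plain SHA-256 lookup shown here. The function names and the use of `hashlib` are assumptions for the sketch.

```python
import hashlib

REPORT_THRESHOLD = 30  # Apple's initially announced threshold

def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash; plain SHA-256 used purely for illustration."""
    return hashlib.sha256(data).hexdigest()

def count_matches(images, known_hashes):
    """Count how many on-device images match the known-CSAM hash list."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_flag(images, known_hashes, threshold=REPORT_THRESHOLD):
    """Only when the match count reaches the threshold would an account
    be surfaced for human review; fewer matches reveal nothing."""
    return count_matches(images, known_hashes) >= threshold
```

The key design point is that nothing is reported below the threshold, which is what makes a one-off false positive harmless.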
Apple also denied speculation that the new mechanism could be used to target specific individuals: the list of image identifiers is universal and identical on every device it applies to.

Apple also explained that the new system stores an encrypted database of CSAM hashes on the device, sourced from at least two organizations operating under separate national governments.
Apple admitted it had done a poor job of explaining the new strategy, which prompted backlash from influential technology policy groups and even its own employees, who worried the company was jeopardizing its reputation for protecting consumer privacy.
Apple declined to say whether the criticism had influenced any policies or software, but said the project was still in development and would undergo changes.

Donald Westbrook is a contributor at worldstockmarket, covering a range of financial topics.