
Apple has released a 6-page FAQ on child abuse photo scanning technology

Sarah Tew / CNET

Apple last week announced plans to add features that scan people's iPhones, iPads, Mac computers, and Apple Watches for child sexual abuse material in the Photos app with upcoming software updates. On Monday, the company released a new document in hopes of alleviating privacy concerns.

The six-page document, called "Expanded Protections for Children," is a guide to frequently asked questions about the upcoming features.

"At Apple, our goal is to create technology that empowers people and enriches people's lives," the company says in an opening overview, adding that it aims to "limit the spread of child sexual abuse material (CSAM)."

After acknowledging that some people have concerns about how this will be done, the company says it put the document together to "address these questions and provide more clarity and transparency in the process."

According to Apple, CSAM detection in iCloud Photos is "designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images." The document adds that even possessing these images is "illegal" in most countries, including the United States.
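The FAQ describes this matching only at a high level. As a rough illustration of the shape of threshold-based hash matching, here is a minimal Swift sketch; note that Apple's real system uses a perceptual hash ("NeuralHash") and a blinded on-device matching protocol, not a plain hash lookup, and the hash list, photo list, and threshold value below are all placeholders:

```swift
import Foundation
import CryptoKit

// Minimal sketch of threshold-based matching against a list of known
// image hashes. Apple's actual system uses a perceptual hash
// ("NeuralHash") and a blinded on-device matching protocol; the plain
// SHA-256 lookup below only matches byte-identical files and is shown
// purely to illustrate the shape of the design the FAQ describes.

// Placeholder database of known-image hashes (hex-encoded SHA-256).
let knownHashes: Set<String> = []

// Hash an image file's raw bytes and hex-encode the digest.
func imageHash(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Count how many of the given photos match the known-hash list.
func matchCount(in photos: [URL]) -> Int {
    photos.compactMap { try? imageHash(at: $0) }
          .filter { knownHashes.contains($0) }
          .count
}

// In Apple's design, matches accumulate past a threshold before an
// account is flagged, and flagged accounts then undergo human review
// before any report is made to NCMEC (per the FAQ).
let userPhotos: [URL] = []      // placeholder
let matchThreshold = 30         // placeholder value, not from the FAQ
let flaggedForReview = matchCount(in: userPhotos) >= matchThreshold
```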

The company adds that this feature only affects users who use iCloud Photos to store their photos, and “doesn’t affect users who haven’t chosen to use iCloud Photos.”

According to Apple, the feature does not affect other data stored on the device and does not apply to Messages. The company also emphasizes that it will refuse any government demands to extend the feature to images beyond CSAM.

"We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands," the company writes. "We will continue to refuse them in the future."

On the question of flagging the wrong person, Apple says "the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year," and that it conducts a "human review" before any report is sent to the National Center for Missing and Exploited Children (NCMEC). "As a result, system errors or attacks will not result in innocent people being reported to NCMEC," Apple concludes.
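The FAQ doesn't show the math behind the one-in-one-trillion figure, but a match threshold is enough to explain how numbers that small can arise: if individual false matches are rare and roughly independent, an account is flagged only when many of them occur together. A sketch with entirely made-up rates:

```swift
import Foundation

// Entirely hypothetical numbers, only to illustrate how requiring a
// threshold of independent matches can push the account-level
// false-flag probability far below "one in one trillion".
let perImageFalseMatchRate = 1e-6   // assumed, not an Apple figure
let requiredMatches = 10.0          // assumed, not an Apple figure

// If false matches were independent, flagging an account would
// require all `requiredMatches` of them to occur:
let accountFalseFlagRate = pow(perImageFalseMatchRate, requiredMatches)
print(accountFalseFlagRate)  // 1e-60, vastly below one in a trillion
```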

Source: https://www.cnet.com/tech/mobile/apple-puts-out-six-page-faq-on-child-abuse-photo-scanning-tech/
