CNBC reports:
After objections about privacy rights, Apple said Friday it will delay its plan to scan users’ photo libraries for images of child exploitation.
“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements.”
Apple’s CSAM detection system was supposed to go live for customers this year. It’s unclear how long Apple will delay its release following Friday’s announcement.
Read the full article.
Apple is taking “additional time over the coming months to collect input and make improvements” before releasing their previously announced Child Safety Features.
Full statement, full text in ALT: pic.twitter.com/1UNVgtwyeV
— Rene Ritchie (@reneritchie) September 3, 2021