Apple Pauses Plan To Scan iPhones For Child Porn

CNBC reports:

After objections over privacy rights, Apple said Friday it will delay its plan to scan users’ photo libraries for images of child exploitation.

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements.”

Apple’s CSAM detection system was supposed to go live for customers this year. It’s unclear how long Apple will delay its release following Friday’s announcement.

Read the full article.