Apple delays its criticized plan to scan iPhones for child exploitation images

Silhouette of a mobile user seen next to a screen projection of the Apple logo in this picture illustration taken March 28, 2018.
Dado Ruvic | Reuters

Apple on Friday said it would delay a controversial plan to scan users’ photo libraries for images of child exploitation.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” the company said in a statement. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple immediately stirred controversy after announcing its system for checking users’ devices for illegal child sex abuse material, or CSAM. Critics pointed out that the system, which can check images stored in an iCloud account against a database of known CSAM imagery, was at odds with Apple’s messaging around its customers’ privacy.

The system does not analyze the content of a user’s photos; instead, it computes digital “fingerprints” of images and matches them against the database of known CSAM. If enough matching images are detected on a user’s account, the account is flagged to a human reviewer, who can confirm the imagery and pass the information along to law enforcement if necessary.
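In simplified form, the matching step described above amounts to comparing image fingerprints against a known set and flagging an account only past a threshold. The sketch below is purely illustrative: it uses a cryptographic hash as a stand-in for a fingerprint, whereas Apple’s actual system uses a perceptual hash (NeuralHash), and the threshold value here is invented.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Illustrative stand-in for a perceptual fingerprint; Apple's real
    # system uses NeuralHash, not a cryptographic hash like SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of known fingerprints (dummy values for illustration).
KNOWN_FINGERPRINTS = {
    fingerprint(b"known-image-1"),
    fingerprint(b"known-image-2"),
}

MATCH_THRESHOLD = 2  # account is flagged only after this many matches

def should_flag_account(user_images: list[bytes]) -> bool:
    # Count how many of the user's images match the known set;
    # flag for human review only if the threshold is reached.
    matches = sum(1 for img in user_images if fingerprint(img) in KNOWN_FINGERPRINTS)
    return matches >= MATCH_THRESHOLD
```

A cryptographic hash matches only byte-identical files, while a perceptual hash like NeuralHash is designed to also match resized or slightly altered copies of the same image; the threshold-before-review structure is the same in both cases.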
