Apple postpones plan to scan photos for child sexual abuse material

After objections over privacy rights, Apple said on Friday it had decided to delay its plan to scan users’ photo libraries for images of child exploitation.

“Last month, we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material,” the company said in a statement.

“Based on feedback from customers, advocacy groups, researchers and others, we’ve decided to set aside additional time in the coming months to gather information and make improvements before launching these critically important child safety features.”

Apple shares fell slightly on Friday morning. The company sparked controversy immediately after announcing a system that would check users’ devices for illegal child sexual abuse material. Critics pointed out that the system, which compares images stored in an iCloud account against a database of known “CSAM” images, was at odds with Apple’s messaging about its customers’ privacy.

The system does not examine the content of a user’s photos directly; instead, it checks whether an image’s “fingerprint” matches an entry in the known CSAM database. If the system detects enough matching images in a user’s account, the account is flagged to a human reviewer, who can confirm the images and pass the information on to authorities if necessary.
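The sketch below illustrates the general idea of threshold-based fingerprint matching described above. It is a conceptual example only: the hash function (SHA-256 standing in for a perceptual hash), the sample database, and the threshold value are all illustrative assumptions, not Apple’s actual NeuralHash implementation.

```python
import hashlib

# Illustrative database of known CSAM fingerprints. In a real system these
# would be perceptual hashes supplied by child-safety organizations, not
# SHA-256 digests of exact files.
KNOWN_FINGERPRINTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Illustrative threshold: an account is flagged only after this many matches,
# so a single chance match does not trigger review.
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (SHA-256 used purely for illustration)."""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: list[bytes]) -> int:
    """Count how many images in an account match the known-fingerprint database."""
    return sum(1 for img in images if fingerprint(img) in KNOWN_FINGERPRINTS)


def should_flag_for_review(images: list[bytes]) -> bool:
    """Flag the account for human review only once enough matches accumulate."""
    return count_matches(images) >= MATCH_THRESHOLD
```

The key design point, reflected in the threshold, is that no single image match is reported; only an accumulation of matches triggers human review.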

Apple’s CSAM detection system was due to go live for customers this year. It’s not clear how long Apple will delay its launch after Friday’s announcement.

Despite the concerns about Apple’s plan, scanning for CSAM is standard practice among tech companies. Facebook, Dropbox, Google and many others operate systems that automatically detect CSAM uploaded to their services.

Source: CNBC
