Apple Postpones Child Abuse Images Scan Amid Privacy Outcry

Published September 5th, 2021 - 11:00 GMT
Highlights
Apple has backtracked on its decision, a move that was welcomed by many users

Apple is temporarily delaying its previously announced plan to scan users' devices for child sexual abuse material amid widespread criticism and privacy concerns.

As Apple explained in a statement on its website:

Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

Apple Faces Privacy Concerns

According to the iPhone maker, the tool scans phones as well as cloud storage for known child abuse imagery; flagged accounts are checked by an Apple employee before being reported to the police.
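To make the reported design concrete, here is a minimal Python sketch of threshold-based hash matching, the general technique behind this kind of scanning. Everything in it is simplified and hypothetical: Apple's announced system used a perceptual hash (NeuralHash) matched on-device via cryptographic private set intersection, not the plain exact-match hashing shown here, and the function names, placeholder hashes, and threshold are illustrative assumptions.

```python
import hashlib

# Illustrative database of hashes of known abuse imagery. In Apple's
# proposal this list would come from child-safety organizations; the
# byte strings hashed here are placeholders.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

# Illustrative threshold; Apple reportedly planned to require on the
# order of 30 matches before an account could be reviewed.
MATCH_THRESHOLD = 2


def image_hash(data: bytes) -> str:
    """Stand-in for a perceptual hash such as Apple's NeuralHash.

    A cryptographic hash like SHA-256 only matches byte-identical files;
    a real system uses perceptual hashing so that resized or re-encoded
    copies of a known image still match.
    """
    return hashlib.sha256(data).hexdigest()


def should_flag_for_review(user_images: list[bytes]) -> bool:
    """Queue an account for human review only once enough of its images
    match the known-hash database, mirroring the human-check step the
    article describes before any report is made."""
    matches = sum(1 for img in user_images if image_hash(img) in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD


# Example: two matching images cross the illustrative threshold.
library = [b"known-image-1", b"known-image-2", b"holiday-photo"]
print(should_flag_for_review(library))  # True
```

The design point the sketch preserves is that no single match triggers a report; an account is surfaced for human review only after the match threshold is crossed.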

This "Child Protections" feature, which was announced in August, was globally criticized by other tech giants and users think that this tool can be the start of 'infrastructure for surveillance and censorship.'

The head of WhatsApp, Will Cathcart, was among those who took a stand against the plan on Twitter.

Apple has now backtracked on its decision, a move that was welcomed by many users, including whistleblower Edward Snowden.

Apple and Child Protection

The delay announcement didn't specify what kind of input Apple would be gathering, nor how long the delay would be in effect. However, the iPhone maker had previously announced the introduction of new child safety features in three main areas:

1. New communication tools that enable parents to play a more informed role in their children's online communication.

2. New applications of cryptography in both iOS and iPadOS that will limit the spread of Child Sexual Abuse Material (CSAM) online.

3. Siri and Search will offer parents and children expanded guidance in unsafe situations, and will intervene when users try to search for CSAM-related topics.


Messages will warn children and their parents when receiving or sending sexually explicit photos. Source: Apple

 

