As a response to Apple's latest announcement regarding scanning children's messages for nudity and the phones of adults for images of child sex abuse, over 90 policy and rights groups around the world published an open letter on Thursday urging the iPhone maker to drop its plans, Reuters reported.
"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups stated in the letter.
Some overseas signatories in particular expressed concerns over the impact of the changes in nations with different legal systems regarding encryption and privacy.
An Apple spokesperson said the company had addressed privacy and security concerns in a document released Friday, which outlines why the complex architecture of the scanning software should resist attempts to subvert it.
The signatories included multiple groups in Brazil, where courts have repeatedly blocked Facebook's WhatsApp for failing to decrypt messages in criminal probes, and where the Senate has passed a bill requiring the traceability of messages, which would mean somehow marking their content. A similar law was passed in India this year.
Other signatories are based in India, Mexico, Germany, Argentina, Ghana and Tanzania. The groups that signed also include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.
Responding to the heated backlash, Apple has offered a series of explanations and documents arguing that the risks of false detections are low.
In addition, Apple said it would refuse requests to expand the image-detection system beyond pictures of children flagged by clearinghouses in multiple jurisdictions.
Although most of the objections so far have concerned device scanning, the coalition's letter also faults a change to iMessage for family accounts, which would attempt to identify and blur nudity in children's messages, allowing them to view the images only after their parents are notified.
© 2000 - 2021 Al Bawaba (www.albawaba.com)