
Apple cites privacy concerns to refuse detection of child sexual abuse material – EURACTIV.com


US tech giant Apple, which decided to halt the development of a photo-scanning tool to detect child sexual abuse material last December, has now offered data privacy concerns as the main reason behind the decision.

While Apple announced its decision to kill the photo-scanning tool at the end of last year, it only provided an explanation last Thursday (31 August), in an exchange between an Apple executive and the head of an organisation fighting child sexual abuse.

Erik Neuenschwander, director of user privacy and child safety at Apple, detailed the company’s choice in an email replying to Sarah Gardner, CEO of the Heat Initiative, which encourages tech companies to “detect and eradicate child sexual abuse images and videos on their platforms.” The exchange was published by Wired.

In the email, Neuenschwander said Apple had “considered scanning technology from virtually every angle” and “concluded it was not practically possible to implement without ultimately imperilling the security and privacy of our users”.
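The "scanning technology" at issue generally works by deriving a compact fingerprint from each image and comparing it against a database of fingerprints of known abuse material. The following is a minimal, purely illustrative Python sketch of that pattern; it is not Apple's NeuralHash or Thorn's Safer, and the hash function, threshold, and database entries are all hypothetical.

```python
# Illustrative sketch of perceptual-hash matching, NOT any vendor's actual system.
# General pattern: fingerprint an image, then check the fingerprint against a
# database of fingerprints of known material.

from typing import List, Set


def average_hash(pixels: List[List[int]]) -> int:
    """Toy average hash over a small grayscale pixel grid (values 0-255).
    A bit is set for each pixel brighter than the grid's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_known_database(image_hash: int, known_hashes: Set[int], threshold: int = 4) -> bool:
    """Flag an image whose fingerprint is within `threshold` bits of any known entry.
    Real systems tune this threshold to trade false positives against misses,
    which is the 'not foolproof' risk raised later in the article."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)


if __name__ == "__main__":
    # Hypothetical 4x4 "image" and a made-up fingerprint database.
    sample_image = [
        [12, 200, 34, 220],
        [45, 180, 60, 250],
        [30, 210, 20, 240],
        [50, 190, 70, 230],
    ]
    known = {0b0101010101010101}  # placeholder entry, not real data
    h = average_hash(sample_image)
    print("hash:", bin(h), "flagged:", matches_known_database(h, known))
```

The privacy dispute described in the article is about where and against what such comparisons run, and who controls the database, rather than about the fingerprinting step itself.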

Such concerns have been at the heart of the controversy ever since the introduction of the EU draft law aimed at detecting and removing online child sexual abuse material (CSAM).

Now, however, as Gardner put it, “the most valuable and prestigious tech company in the world” and “a global leader in user privacy” has publicly stated that, in its view, scanning for such material cannot go hand in hand with users’ privacy.

Child protection or violation of privacy?


“Scanning every user’s privately stored iCloud content” would pose “serious unintended consequences for our users,” Neuenschwander wrote.

He also warned that scanning for one type of content “opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types (such as images, videos, text, or audio) and content categories”.

In her email, Gardner asked: “How can users be assured that a tool for one type of surveillance has not been reconfigured to surveil for other content such as political activity or religious persecution?”

Finally, Neuenschwander added that scanning systems are “not foolproof” and there is “documented evidence from other platforms that innocent parties have been swept into dystopian dragnets” which have “made them victims when they have done nothing more than share perfectly normal and appropriate pictures of their babies”.

Ella Jakubowska, senior policy adviser at the European digital rights association EDRi, shared Apple’s concerns. “All of the goodwill in the world doesn’t change the technical reality that there is not a way to safely and securely scan encrypted messages or services,” she told EURACTIV.

“Despite years of trying, Apple have proven that the mass scanning measures currently being explored in the EU are not fit for purpose. Even the European Commission’s own impact assessment shows that you cannot have generalised access to private communications in a safe and privacy-respecting way,” she added.


Yet some organisations, including the NGO Thorn, which works to defend children online and has developed its own CSAM-detection software, Safer, seem to disagree.

“We need to stop pitting user privacy and child safety against each other because with tools like the ones created by Thorn and others alongside an adequate framework with robust safeguards, we can have both,” Emily Slifer, director of policy at Thorn, told EURACTIV.

These tools, she added, “have been reliably used for years” and are “constantly being improved, getting better day by day”.

According to Slifer, Apple has the knowledge and expertise “to create solutions that balance both privacy and child safety. What’s needed are more solutions geared towards finding the right balance”.

EURACTIV also reached out to Javier Zarzalejos, an EU lawmaker who is the European Parliament’s rapporteur for the CSAM file.

“One of the guiding principles of the Regulation is technological neutrality, thus the Regulation does not prohibit or prefer any specific tool or technology for the providers to fulfil their obligations under the Regulation, as long as these technologies and tools comply with certain safeguards,” Zarzalejos said.

He added that Apple’s upcoming on-device tools will be included among the possible mitigation measures.

[Edited by Zoran Radosavljevic]
