Info@NationalCyberSecurity

Apple backed out of a controversial child protection feature and now we know why


Apple introduced new child safety protections to help detect known child sexual abuse material (CSAM) in August 2021. Little more than a year later, it abandoned its CSAM-scanning plans, and now we know why.

Part of Apple’s child protection initiative was to identify known CSAM before it was uploaded to iCloud Photos, but the plan proved controversial among privacy advocates, who worried that it set a precedent and that the technology could be misused. Now, it appears that Apple ultimately agreed.

