Yahoo Finance’s Dan Howley breaks down the controversy and concerns over Apple’s new feature to protect children.
KRISTIN MYERS: Time now for “Tech Column” with our Tech Editor Dan Howley. He’s here to give us the details on Apple’s child abuse image scanning and how it’s great technology– or is it? Dan, give us all the details.
DAN HOWLEY: That’s right, Kristin. Apple is going to be rolling this new technology out in its upcoming versions of iOS, macOS, watchOS, as well as iPadOS. And really what this is going to do is essentially download these hashed images. Now, they’re not actual images. They’re basically just numbers that represent images.
You can’t reverse engineer them into images. There are no images going on your phone. But what’s going to happen is when you download this upcoming update, these numbers will appear on your phone. And what this is supposed to do is match up with similar numbers that correspond to child sexual abuse material that Apple has received from the National Center for Missing and Exploited Children.
And the basic idea is if you connect to your cloud account, your phone will automatically start scanning on device for any matches to those numbers that match up with those images. If it shows that there’s a match when you’re connected to iCloud, it will then create this little voucher on your phone. And it will upload that voucher along with that photo to your iCloud account.
You won’t know it’s happening, and the idea there is to not spook any potential criminals so they don’t delete all of the content before it can be dealt with. If it reaches a certain threshold of vouchers on your account, Apple will then go into a manual review to make sure that this isn’t some kind of misunderstanding or, you know, some kind of mistake.
Apple, though, says there’s a one-in-a-trillion chance of that happening. So chances are it’s not going to be a mistake. And then they will use that information to alert the National Center for Missing and Exploited Children. And this is all about basically taking care of that child sexual abuse material, which has exploded in recent years thanks to things like smartphones, encryption, and social media.
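The flow Dan describes, on-device matching against a database of known hashes with a manual-review threshold, can be sketched very roughly in code. Everything below is illustrative only: the hash values, the threshold, and the use of plain SHA-256 lookups are stand-ins. Apple's actual system uses a perceptual hash (NeuralHash) and cryptographic private set intersection, not simple exact-match lookups like this.

```python
import hashlib

# Hypothetical database of known-bad image hashes (made-up values).
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-1").hexdigest(),
    hashlib.sha256(b"example-flagged-image-2").hexdigest(),
}

# Illustrative threshold; Apple has not published its real number.
MATCH_THRESHOLD = 3

def scan_photos(photos: list[bytes]) -> list[str]:
    """Return one 'safety voucher' (here, just the hash) per matching photo."""
    vouchers = []
    for photo in photos:
        digest = hashlib.sha256(photo).hexdigest()
        if digest in KNOWN_HASHES:
            vouchers.append(digest)
    return vouchers

def needs_manual_review(vouchers: list[str]) -> bool:
    # Nothing is surfaced for review until the voucher count
    # crosses the threshold.
    return len(vouchers) >= MATCH_THRESHOLD

photos = [b"example-flagged-image-1", b"cat-photo", b"example-flagged-image-2"]
vouchers = scan_photos(photos)
print(len(vouchers), needs_manual_review(vouchers))  # 2 False
```

The threshold step is the key design choice in Apple's description: a single match reveals nothing and triggers nothing, which is meant to keep one-off false positives from ever reaching a human reviewer.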
And Apple is kind of taking this approach as a means to, as they put it, protect privacy, while also ensuring that this material isn’t available. But the security researchers that I’ve spoken to have said that this is essentially like a Pandora’s box, where if a totalitarian government or authoritarian regime comes in– for instance, China– and tells Apple to add other hashes– meaning other numbers that represent things like, for instance, pro-democracy movements or LGBTQ+ content– it would then be able to search for that kind of information on users’ phones and out them to the government.
Now, Apple says it will not do that. It won’t bow to any government that requests that. And for now, the company is only rolling this out in the US. So it’s not as though this is going to be an international thing. But the fear is that if it does go international, it could put people at risk who are trying to protest against totalitarian regimes, take part in free-government movements, and things of that nature.
So this really comes down to what the best approach is. And the researchers that I spoke to said there really isn’t a best approach to this. One researcher said that, look, if Apple had maybe launched this at a small scale and proved that it works, then that may be worth the risk. Another said the only way, really, is to not provide this information to law enforcement so that the government can’t get a handle on it– instead, provide a warning to users.
And that might not necessarily be the best way, but it’s possibly the only way to ensure that there’s no overreach here. So it really does come down to where the priorities lie. And free speech advocates will say, look, this could put people in danger. Others will say that it’s worth it if it comes to protecting children.
And again, there has been an explosion of this kind of content online because of things like smartphones, social media, the dark web, what have you. So it really is a complicated issue. But Apple says that they’re going to move forward with it and, again, that they would not bow to any pressure from any regime. The issue, though, is, of course, that one of their biggest regions is China, and they’ve had to make some concessions there, such as giving the government the keys to iCloud encryption for Chinese citizens, as well as not allowing virtual private network apps in the Chinese App Store, because the government doesn’t want people getting around the Great Firewall.
And that’s obviously not something that Apple would want to do. But as they say, they follow the laws of the land wherever they operate.
KRISTIN MYERS: All right, thanks so much to our Tech Editor Dan Howley for breaking all of that down for us.