Apple to Blur Explicit Content as Instagram Serves as ‘Hotbed of Exploitation’ for Minors – Faithwire


As social media becomes an increasingly dangerous place for children, Apple has announced plans to automatically obscure sexually explicit content on accounts held by minors 12 years old and younger.

The mammoth technology company recently announced the decision, a shift from a policy put in place about a year ago, when Apple made the blurring feature available in iMessage but required parents to turn the tool on manually.


“They are now turning on this feature proactively — on by default — for kids 12 and under, and that is so great, because we know that, oftentimes, parents miss all the different features that are available,” Lina Nealon, vice president and director of corporate advocacy for the National Center on Sexual Exploitation, told CBN’s Faithwire, noting the feature will now apply to pictures and videos in iMessage, FaceTime, and AirDrop.

The blurring tool will be available to teenagers and adults who choose to opt in. Additionally, when the feature launches in the fall, Apple will make the technology available to developers of other apps designed for iPhones and iPads. It’s important to note, however, that while the blur will automatically appear for those 12 and younger, they, like older users, can still dismiss the content warning.

“It offers a speed bump if that kind of content is being sent,” explained Nealon.

Nealon pulled back the curtain on the process, noting NCOSE and a partner organization, Protect Young Eyes, have been pressuring Apple to implement this kind of protective technology since 2018, calling it a “no-brainer.”

“Expand it to adults,” she said. “Why are you limiting it to 12 and under? Certainly, teenagers need this; we would like it defaulted for teens. Quite frankly, we would like nudity and sexually explicit content blocked for minors altogether.”

As for why Apple settled on 12 years old as the cut-off age for automatic blurring, Nealon pointed to the Children’s Online Privacy Protection Act, better known as COPPA. Among other things, the legislation, in effect since April 2000, requires tech companies and websites to provide notice and obtain parental consent before collecting information from kids under 13 years old, and it places a greater burden of protecting children on those websites and companies.

An FAQ page from the Federal Trade Commission states that lawmakers in Congress “determined to apply the statute’s protections only to children under 13, recognizing that younger children are particularly vulnerable to overreaching by marketers and may not understand the safety and privacy issues created by the online collection of personal information.”

While the law was implemented in the infancy of the internet, its impact is perhaps even more important today, given the ubiquity of the online world.

An unfortunate byproduct of the law, though, was that it effectively treated anyone over 12 years old on the internet as an adult.

“That, by default, then, made 13 and above adults,” Nealon explained. “We offer protections for teenagers offline, in the analog world, that we don’t offer them online. … It’s great that we’re protecting 12 and under, but 13-, 14-, 15-, 16-, and 17-year-olds are increasingly being targeted for sextortion, sex trafficking, child sex abuse material — they’re kids. So we really push corporations to stop treating kids as adults, stop treating teens as adults.”

“It should not be an option for a child to click on sexually explicit images,” she added.

All of this comes amid reporting from the Wall Street Journal that Instagram, one of the leading social media apps in the world, has housed a “vast pedophile network” operating in plain sight on the platform.

“Instagram has been a hotbed of exploitation since its inception,” said Nealon. “And despite it being in the public eye … and having more media attention, they are still among the worst of the worst. They are the No. 1 platform for highest rates of sextortion, top two for sexual interactions between children and adults, and, of course, these pedophile rings that have been happening for years in the public eye.”

In a statement to the WSJ, a spokesperson for Meta — the parent company of Instagram — described child exploitation as “a horrific crime” and said the company is “continuously investigating ways to actively defend against this behavior.”

The representative said the social media company has taken down 27 pedophile networks in the last two years and is working to disband more, including shutting down accounts involved in the buying and selling of child sexual abuse material (CSAM). In January alone, Meta removed 490,000 accounts for violating its child safety policies.

Following the bombshell report, the Meta spokesperson said the company has created an internal task force to further address the issue.



