


YouTube Kids is a colorful, stripped-down version of YouTube, full of animations, bright colors, and cartoon avatars meant to keep the youngest internet users engaged. When scrolling through the app, kids can see everything from Nickelodeon song mashups to prank series to baking videos — a cheerful-seeming microcosm of actual YouTube.

But child safety advocates and some members of Congress say there’s a problem with the app and corresponding website: an autoplay feature that keeps one video coming after another with no pause or interruption. When one video ends, another video selected by YouTube Kids’ recommendations algorithm automatically plays. Right now, autoplay is on by default and cannot be turned off, so YouTube Kids will keep feeding children algorithmically curated videos indefinitely unless someone happens to step in and intervene.

“If you’re a 6- or 7-year-old child, you’re basically watching what’s being recommended to you,” Rep. Lori Trahan (D-MA), a member of Congress whose kids use the app, told Recode. “If you combine that with the endless loop of autoplay, then if you’re not sitting next to your child while they’re watching YouTube, that can get away from you pretty quickly.”

Trahan says she’s worried the default autoplay setting is a manipulative design tactic meant to keep children online for as long as possible, a concern she raised with Google CEO Sundar Pichai during a March hearing about misinformation. She’s not alone. A recent letter to YouTube CEO Susan Wojcicki from the House Oversight Committee’s subcommittee on consumer and economic policy, which has launched an investigation into the platform, says the app is harmful because it “places the onus on the child to stop their viewing activity, rather than providing a natural break or endpoint.”

After Recode asked about the inability to turn off autoplay in the Kids app, YouTube said, “In the coming months, users will also be able to control the autoplay feature in YouTube Kids.” The company did not say why it made that decision or why it would take so long to change the feature.

But legislators are trying to go even further. A law proposed last year called the Kids Internet Design and Safety (KIDS) Act would ban the use of autoplay in kid-targeted apps entirely, and experts at a recent Federal Trade Commission panel on dark patterns targeting children highlighted autoplay in children’s video apps as a concern, with some pointing specifically to YouTube. (Dark patterns, as Recode’s Sara Morrison has explained, are web design features that trick or pressure people into making certain choices online.)

“Platforms like YouTube Kids need child-centric design by default, so while the addition of the autoplay control is a step in the right direction, it falls woefully short of what parents and their children should be able to expect,” Rep. Trahan told Recode in response to YouTube’s change.

Meanwhile, the House subcommittee is continuing its investigation into the YouTube Kids app and has requested documents related to “behavioral analytics” and pilots YouTube has run for features like autoplay. A senior Democratic aide confirmed to Recode that investigators had heard from parents and child safety groups about the feature being problematic. YouTube said it has provided an initial response to the subcommittee and will provide more information in the coming weeks.

But amid a pandemic that’s left parents working and kids learning from home, critics warn that small design choices make things harder for parents trying to set reasonable limits on what young children see online. Some say features like forced autoplay manipulatively glue children to their platforms with low-quality content. So even if YouTube does add the option, some are worried that leaving it on by default — or allowing it at all — could still do harm.

YouTube Kids’ autoplay takes control away from parents

YouTube Kids is largely modeled on the regular version of YouTube. The homepage of the app is a simplified version of the YouTube homepage, suggesting different videos in categories like “Reading” and “Shows.” Launched in 2015, YouTube Kids is designed to show only content targeted toward kids. That includes cartoons and home videos, as well as videos from production outfits like Disney and Vox that get pulled onto the YouTube Kids platform from YouTube proper. YouTube says there are now more than 35 million weekly viewers on YouTube Kids in more than 80 countries.

YouTube Kids is no stranger to controversy. In the past, the app has been criticized for allowing violent and sexualized content, and the company even warns parents that its systems may not exclude all inappropriate content. A 2017 report from the New York Times found that children were encountering violent videos featuring popular children’s cartoon characters.


Screenshot of YouTube Kids: YouTube Kids removed this video from the platform after Recode asked about it.

In 2019, the Federal Trade Commission found that YouTube had advertised its popularity among children to toy makers like Mattel and Hasbro and inappropriately collected information about children, allegedly violating the Children’s Online Privacy Protection Act (COPPA), which limits the data that platforms can collect from children they know to be under 13. The agency forced YouTube to make major changes to kids-focused content on its main site. YouTube Kids now requires parents’ consent to collect children’s data (though regular YouTube does not). YouTube was also fined a record $170 million.

But some parents who have used the YouTube Kids platform, including parents in Congress, say the kids’ version still has manipulative features, like autoplay, that make it harder for families to protect their children.

“It feels like this is engineered to keep kids watching an endless stream of videos one after another and have them never be over,” Amanda Kloer, a campaign director with the child safety group ParentsTogether who uses the app with her kids, told Recode. “When you combine that with the algorithm-driven recommendations, which show content that parents have not viewed and have not approved, you can very easily get into situations where kids are watching content that was meant for older kids, that an individual parent or family finds objectionable.”


