Meta to take action after investigation finds Instagram algorithm promoted child sex abuse material

Meta Platforms Inc.-owned Instagram today said it will take action after an investigation by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst revealed the company’s algorithm was promoting child sexual abuse material to pedophiles.

The investigation found that accounts owned and operated by minors were creating self-generated child sexual abuse material, or SG-CSAM, which the algorithm then promoted while buyers and sellers contacted each other via direct messaging. When the algorithm recommended such content, it often appeared with the warning: “These results may contain images of child sexual abuse.” Potential buyers could then choose: “See results anyway.”

Instagram has now apparently removed the feature after being contacted by The Journal. Nonetheless, the investigation found that minors were advertising their sexual content out in the open with hashtags such as #pedowhore and #preteensex. Some accounts had explicit handles, such as “Little slut for you.”

Perhaps even more shocking, the investigation found that these accounts were offering menus to potential buyers, including bespoke content of the minors having sex with animals or even hurting themselves. The investigation linked 405 accounts selling SG-CSAM on Instagram to a single network, while the researchers also found 128 accounts on Twitter selling similar material.

“An industry-wide initiative is needed to limit production, discovery, advertisement, and distribution of SG-CSAM; more resources should be devoted to proactively identifying and stopping abuse,” the researchers said. “These networks utilize not only social media platforms, but file sharing services, merchants, and payment providers.”

Meta said it has now set up a task force to deal with the problem, adding that in January alone it took down 490,000 accounts that had violated its child safety policies. Meta also said it has taken down 27 pedophile networks. “We’re continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them,” the company said in a statement.

Photo: Alexander Shatov/Unsplash
