Researchers at Stanford sound alarm on Mastodon’s significant child abuse material problem


Researchers at Stanford’s Internet Observatory released a paper today revealing worrying findings about the social media platform Mastodon, which they say is rife with child sexual abuse material, or CSAM.

The researchers said in the study that the issue is more acute on decentralized social networks than on centralized ones such as Twitter Inc., Instagram or YouTube. They found that moderation tools on the former are limited, adding that there is no “built-in mechanism to report CSAM to the relevant child safety organizations.”

In general, they said, the Fediverse – decentralized, autonomous networks that run on thousands of servers all over the world – lacks safety infrastructure. While there is an appeal to social media that isn’t run in a top-down manner, decentralization brings a certain amount of chaos, and one consequence is the proliferation of such images.

The researchers found 112 instances of CSAM across 325,000 posts on Mastodon in just two days, encountering the first image after only five minutes of searching. The images could easily be searched for. They also discovered 554 pieces of content matching hashtags or keywords commonly used by people selling such images, some of it posted by children charging a few dollars for photos or videos.

This problem is not unique to decentralized platforms, of course, but the researchers said that having fewer moderation tools than centralized platforms has created a cesspit of pedophile activity that is particularly bad in Japan, where CSAM content is sold in both Japanese and English. Still, they said the proliferation of CSAM across Mastodon as a whole is currently “disturbingly prevalent.”

“Federated and decentralized social media may help foster a more democratic environment where people’s online social interactions are not subject to an individual company’s market pressures or the whims of individual billionaires,” the researchers concluded. “For this environment to prosper, however, it will need to solve safety issues at scale, with more efficient tooling than simply reporting, manual moderation and defederation. The majority of current trust and safety practices were developed in an environment where a small number of companies shared context and technology, and this technology was designed to be efficient and effective in a largely centralized context.”

They believe some decentralized networks in the Fediverse can address the issue by adopting some of the same components used by centralized social media networks, and that as the Fediverse grows, investment will be needed to deal with it.

Photo: Rolf van Root/Unsplash
