Info@NationalCyberSecurity

The walls have been breached: Attorneys general from 54 US states and territories call on Congress to restrict AI from creating child sexual abuse material


Congress must act quickly to prevent the exploitation of children by Artificial Intelligence (AI) technology.

This, according to a letter from attorneys general of 54 U.S. states and territories sent to leaders of the House and Senate asking them to “establish an expert commission to study the means and methods of AI that can be used to exploit children specifically.”

The AGs also requested an extension of current restrictions on child sexual abuse materials (CSAM) specifically to cover AI-generated images.

“Congress should act to deter and address child exploitation, such as by expanding existing restrictions on CSAM to explicitly cover AI-generated CSAM,” the letter stated. “This will ensure prosecutors have the tools they need to protect our children.”

The letter also gives examples of how technology might be used to harm children.

“As a matter of physical safety, using AI tools, images of anyone, including children, can be scoured and tracked across the internet and used to approximate or even anticipate a victim’s location,” the AGs wrote. “Most disturbingly, AI is also being used to generate child sexual abuse material.

“For example, AI tools can rapidly and easily create ‘deepfakes’ by studying real photographs of abused children to generate new images showing those children in sexual positions.”

The technology allows users to create a convincing image with just a brief description of what they want to see.

Rebecca Portnoff, the director of data science for Thorn, a nonprofit child-safety group, said she has seen a growth in sexually exploitative images in just the last year.

“Children’s images, including the content of known victims, are being repurposed for this really evil output,” Portnoff said, according to The Washington Post. “Victim identification is already a needle-in-a-haystack problem, where law enforcement is trying to find a child in harm’s way. 

“The ease of using these tools is a significant shift, as well as the realism. It just makes everything more of a challenge.” 

Child-safety experts believe many offenders are using open-source tools, which can be run in an unrestricted and unpoliced way. One such tool can be downloaded and run on any consumer-grade PC with a GPU, and its filters are easily bypassed, according to The Post.

One user reportedly saw someone use the tool to try to generate fake swimsuit photos of a child actress, calling it “something ugly waiting to happen,” The Post reports.

“We are engaged in a race against time to protect the children of our country from the dangers of AI,” the AGs’ letter reads. “Indeed, the proverbial walls of the city have already been breached. Now is the time to act.”

