The FBI is warning that scammers are using AI technology to create sexually explicit deepfake photos and videos of people in a bid to extort money from them, also known as “sextortion.”
The threat is particularly disturbing because it exploits the benign photos people post on their social media accounts, which are often public. Thanks to advancements in image- and video-editing software, a bad actor can take those same photos and use them to create AI-generated porn featuring the victim’s face.
“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency said in the alert. “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
As a result, the FBI is warning the public about the danger of posting photos and videos of themselves online. “Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity.”
The FBI did not say how many complaints it has received. But the agency issued the alert amid thousands of reported sextortion schemes targeting minors. These can involve an online predator pretending to be an attractive girl and duping a teenage boy into sending nudes; the scammer then threatens to post the images online unless money is paid.
In today’s alert, the FBI noted recent sextortion schemes have also involved the use of deepfakes. “As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media sites or web postings, provided to the malicious actor upon request, or captured during video chats,” the agency said. In some cases, the predators will also use the deepfakes to pressure a victim into sending them “real sexually-themed images or videos.”
In the meantime, the rise of malicious deepfakes could cause more states to outlaw their use. Only a few states, such as Virginia and California, have banned deepfake porn. But last month, Rep. Joe Morelle (D-NY) introduced federal legislation to ban non-consensual deepfakes, turning them into a criminal offense.