(844) 627-8267 | Info@NationalCyberSecurity

Scammers are using AI-generated voice clones, the FTC warns : NPR


This Wednesday, April 14, 2016, file photo shows a push-button landline telephone in Whitefield, Maine. (Robert F. Bukaty/AP)

For years, a common scam has involved getting a call from someone purporting to be an authority figure, like a police officer, urgently asking you to pay money to help get a friend or family member out of trouble.

Now, federal regulators warn, such a call could come from someone who sounds just like that friend or family member — but is actually a scammer using a clone of their voice.

The Federal Trade Commission issued a consumer alert this week urging people to be vigilant for calls using voice clones generated by artificial intelligence, one of the latest techniques used by criminals hoping to swindle people out of money.

“All [the scammer] needs is a short audio clip of your family member’s voice — which he could get from content posted online — and a voice-cloning program,” the commission warned. “When the scammer calls you, he’ll sound just like your loved one.”

If you’re not sure it’s a friend or relative, hang up and call them

The FTC suggests that if someone who sounds like a friend or relative asks for money — particularly if they want to be paid via a wire transfer, cryptocurrency or a gift card — you should hang up and call the person directly to verify their story.
