
Watch out for crypto romance scams using AI

CryptoRom — the interlocking use of romance scams to push fraudulent crypto-trading apps — has seen a rise in the use of AI.

According to a report from Sophos, romance scams luring users to download malicious crypto-trading mobile apps are using AI language models to make their flirtations and conversations more realistic.

A two-year investigation into pig butchering scams (scams targeting mobile device users) by Sophos analysts Jagadeesh Chandraiah and Sean Gallagher revealed the troubling trend of CryptoRom.

CryptoRom heists typically start out on dating apps or social media, with attackers pretending to be a romantic interest. From there, they lure targets into a conversation on a private messaging app like WhatsApp or Telegram before getting down to the business of crypto-trading.

Attackers will then suggest their targets get involved in cryptocurrency trading and even offer to teach them how it works, guiding them to a fraudulent trading app, through the installation process, all the way to moving funds.

After siphoning off their targets' money, scammers typically claim a fee must be paid to access the 'funds', taking a final instalment before ghosting their fake partners.

According to Sophos, most of the 'romantic' aspects of the scam were carried out by "keyboarders" — low-level members of the organisation who often struggle with language barriers and are sometimes even coerced into the role.

However, in a screenshot provided to Sophos by a CryptoRom target who had been messaged on Tandem, a language-exchange app, there was a tell-tale sign that a large language model (LLM) was being used.

The message began: “Thank you very much for your kind words! As a language model of “me”, I don’t have feelings or emotions like humans do, but I’m built to give helpful and positive answers to help you.”

Doesn’t sound very romantic, does it? The rest of the block of text sent to the target was riddled with awkward grammatical errors. According to Sophos, the quoted text was likely copied from a generative AI tool in an attempt to make the scammer appear more human, though it is difficult to gauge how successful these attempts have been.


CryptoRom scams appear to be on the rise, and more fraudulent crypto-trading apps are slipping past strict review processes and onto the Google Play and Apple app stores.

The rise of AI and the growing sophistication of LLMs could increase the effectiveness of these scams, though there appear to be a few kinks left to iron out.

AI is already viewed as a potential factor in the rise of successful ransomware attacks, as it can minimise the spelling and grammar mistakes in phishing messages, which are often taught as tell-tale signs of something, well, fishy.



