
Hackers exploit ChatGPT to lure young girls

Scammers were claimed to have misused ChatGPT to target young girls and trick them into handing over personal information.

New Delhi | Updated: Jan 12, 2023 16:40 IST


By Nidhi Bhardwaj: An Israeli cybersecurity company on Wednesday claimed that hackers are exploiting the powerful chatbot developed by OpenAI to lure young girls and steal their personal information. According to the report, cybercriminals have used ChatGPT to create convincing personas that advance their schemes.

What makes ChatGPT questionable?

A recent report points to the malicious use of ChatGPT for personal gain, including the creation of chatbots that mimic young girls in order to ensnare targets.

Hold Security’s founder Alex Holden claimed he has observed dating scammers exploiting ChatGPT to create convincing personas. He said, “Scammers are creating female personas to impersonate girls to gain trust and have lengthier conversations with their targets.”

In the wake of this scam, the Israeli cybersecurity firm also claimed that scammers have used ChatGPT to build bots and sites that dupe users into sharing their personal information. The researchers warned that these highly targeted scams and phishing campaigns pose a serious threat to privacy.

Several users have also reported that hackers abused ChatGPT by using it to code features of Dark Web marketplaces similar to Silk Road and AlphaBay, the report stated.

How is ChatGPT being highly exploited?

Since its launch in November, ChatGPT has drawn widespread attention. As claimed by the San Francisco-based firm OpenAI, ChatGPT offers human-like conversational abilities. However, this same capability can also be used to code malicious software, which can be modified to encrypt a device without any user interaction, reported Check Point, an American-Israeli multinational provider of software and combined hardware products for IT security.

The company further stated, “This malicious software can monitor users’ keyboard strokes and create ransomware. An attacker could create an authentic-looking spear-phishing email to run a reverse shell that can accept commands in English.”
