
‘Godmode’ GPT-4o jailbreak released by hacker — powerful exploit was quickly banned


A jailbroken version of GPT-4o hit the ChatGPT website this week, lasting only a few precious hours before being taken down by OpenAI.

Twitter user “Pliny the Prompter,” who describes themselves as a white hat hacker and “AI red teamer,” shared their “GODMODE GPT” on Wednesday. Using OpenAI’s custom GPT editor, Pliny prompted the new GPT-4o model to bypass all of its restrictions, allowing the chatbot to swear, explain how to jailbreak cars, and give instructions for making napalm, among other dangerous outputs.


