
OpenAI, creator of the revolutionary ChatGPT, has launched a colossal bug bounty program. With over 4,500 cybersecurity enthusiasts hunting for vulnerabilities in its public-facing technology, OpenAI stands poised to rewrite the rulebook on AI security.
Updated Jun 26, 2023 | 11:24 AM IST
Hacker Haven: OpenAI Bug Hunting Bonanza Kicks Off in Silicon Valley
KEY HIGHLIGHTS
- OpenAI rallies over 4,500 hackers in an unprecedented bug bounty program targeting public-facing technology.
- The AI firm offers up to a whopping $20,000 per bug discovery, driving a massive response from the global cybersecurity community.
- Despite the hefty participation, a relatively small number of prizes are expected, showcasing OpenAI’s focus on quality over quantity.
Rules of Engagement: Exposing the Underbelly, Not the Brain
Operating under specific rules, the hackers’ target isn’t the AI’s code itself but the auxiliary systems: cloud resources, plugins, and third-party service connections. In essence, OpenAI is challenging them to probe its technical infrastructure while shielding the proprietary AI brain – the heart of ChatGPT.
Paying the Piper: Cash Rewards for the Swift and Skilled
OpenAI isn’t skimping on the rewards. Each bug disclosure could fetch up to a cool $20,000 – a tantalizing lure for any hacker. The response? An overwhelming surge of more than 4,500 researchers signing up, eclipsed only by Tesla’s program with about 5,000 participants. The program had already accepted 50 vulnerabilities as of mid-June, with an average payout of around $786.
(Image: OpenAI’s bug bounty program on Bugcrowd, showing over 4,500 participants.)
The Aftermath: Expectations and Reality
Despite the exciting participation numbers, don’t expect a shower of prizes. According to Bugcrowd founder Casey Ellis, OpenAI’s tech footprint simply isn’t large enough to warrant a plethora of payouts – it’s quality over quantity. As Ellis puts it, it’s not a “sprawling set of real estate.”