
OpenAI offers bug bounty for ChatGPT — but no rewards for jailbreaking its chatbot

OpenAI has launched a bug bounty, encouraging members of the public to find and disclose vulnerabilities in its AI services, including ChatGPT. Rewards range from $200 for “low-severity findings” to $20,000 for “exceptional discoveries,” and reports can be submitted via the crowdsourced cybersecurity platform Bugcrowd.

Notably, the bounty excludes rewards for jailbreaking ChatGPT or causing it to generate malicious code or text. “Issues related to the content of model prompts and responses are strictly out of scope, and will not be rewarded,” says OpenAI’s Bugcrowd page.

Jailbreaking ChatGPT usually involves feeding the system elaborate scenarios that allow it to bypass its own safety filters. These might include encouraging the chatbot to roleplay as its “evil twin,” letting the user elicit otherwise banned responses, like hate speech or instructions for making weapons.

OpenAI says that such “model safety issues do not fit well within a bug bounty program, as they are not individual, discrete bugs that can be directly fixed.” The company notes that “addressing these issues often involves substantial research and a broader approach” and that reports of such problems should be submitted via the company’s model feedback page.

Though such jailbreaks demonstrate the broader vulnerabilities of AI systems, they are likely less of a direct problem for OpenAI than traditional security failures. For example, last month a hacker known as rez0 was able to reveal 80 “secret plugins” for the ChatGPT API: as-yet-unreleased or experimental add-ons for the company’s chatbot. (Rez0 noted that the vulnerability was patched within a day of them disclosing it on Twitter.)

As one user replied to the tweet thread: “If they only had a paid #BugBounty program – I’m sure the crowd could help them catch these edge-cases in the future : )”


