chromacity 2 hours ago

Because it can't and it's a publicity stunt. It achieves three goals:

1) Underscores to the general public that the models are amazingly powerful, and that if you're not using them, your competitors will out-innovate you.

2) Sends the message to regulators that they don't need to do anything, because the companies are already diligent about preventing harm.

3) Sends the message to regulators that they sure should be regulating "open-source" models, because these hippies are not doing rigorous safety testing.

Both Anthropic and OpenAI have been playing that game for years.

jfrbfbreudh 2 hours ago

If it can’t, then it makes more sense to make the bounty as high as possible instead of a measly $25k

chromacity 25 minutes ago

If it's an existential threat to humanity, and if OpenAI is valued at nearly $1T, why set the bounty at a measly $25k? The going rate for an iPhone zero-day is six to seven figures. Some companies will pay you more than $25k for a website XSS.

Because this is not a serious effort to address a serious risk. It's a PR stunt: the bounty is for a simple jailbreak, not an actual bioweapon, and they don't particularly want to spend a lot of money or get people seriously invested in breaking their safety filters.

duchef 2 hours ago

They don't want anyone to actually do it.

IAmGraydon an hour ago

I'm glad people are starting to recognize this, but when will the general public? Ever?