diggan 21 hours ago

> If you have found a security vulnerability, we encourage you to report it via our bug bounty program

It seems like reporting bugs/issues via that program forces you to sign a permanent NDA preventing disclosure even after the reported issue has been fixed. I'm guessing the author of this disclosure isn't the only one who avoided it because of the NDA. Is that potentially something you can reconsider? Otherwise you'll probably continue to see people disclosing these things publicly, and as an OpenAI user that sounds like a troublesome approach.

ragona 15 hours ago | parent [-]

(Note: I also work for OpenAI Security, though I haven't worked on our bounty program for some time. These are just my own thoughts and experiences.)

I believe the author was referring to the standard BugCrowd terms, which as far as I know are themselves fairly common across the various platforms. In my experience we are happy for researchers to publish their work within the normal guidelines you’d expect from a bounty program — it’s something I’ve worked with researchers on without incident.

winstonhowes 12 hours ago | parent [-]

100%. We want to ensure we can fix real security issues responsibly before details are published. In practice, if a researcher asks to disclose after we've addressed the issue, we're happy for them to publish.

DANmode 8 hours ago | parent [-]

In practice, it sounds like you guys didn't accept this dude's valid vuln because he didn't register and sign his life away.

tptacek 3 hours ago | parent [-]

They just stated that it was all model hallucination, and was not in fact a valid vuln.

DANmode 2 hours ago | parent [-]

*shrugs* If you're convinced, I'm convinced!

tptacek 2 hours ago | parent [-]

I'm convinced.