dotancohen 10 hours ago

Money for a report and a patch, with convincing test cases, might be worthwhile. Even if a machine generates them.

josefx 9 hours ago | parent | next [-]

> Even if a machine generates them.

Why? If it is a purely machine generated report there is no need to have dozens of third parties that throw them around blindly. A project could run it internally without having to deal with the kind of complications third parties introduce, like duplicates, copy paste errors or nonsensical assertions that they deserve money for unrelated bugfixes.

A purely machine-generated report without any meaningful contribution by the submitter seems to be the first thing you would want to exclude from a bug bounty program.

TheDong 8 hours ago | parent | prev | next [-]

Not necessarily. Reviewing an issue report already takes enough developer time. Reviewing a patch takes even more.

The problem they had before was a financial incentive to send reports, which led to crap reports that wasted reviewers' time. Incentivizing sending reports + patches has the same failure mode, except now they have to waste even more time reviewing the larger volume of input.

Anyway, in most cases I'd bet that Daniel can write a correct patch for a given security bug and get it reviewed faster than the curl team can review a third-party patch for the same bug, especially if that patch is "correct, but AI-written".

jraph 10 hours ago | parent | prev | next [-]

I've seen this idea floated a few times here on HN: make people pay to submit security reports, and refund the money if the report is deemed good. That feels very wrong.

If I find a security issue, I'm willing to responsibly disclose it, but if you make me pay, I don't think I will bother.

Punishing bad behavior to disincentivize it seems more sensible.

yorwba 8 hours ago | parent | next [-]

For a person finding bugs for a living, an up-front fee to have their report reviewed by a maintainer would amount to an investment towards receiving a bug bounty if their report is valid and valuable. Just the cost of doing business.

It would discourage drive-by reports by people who just happened to notice a bug and want to let the maintainers know, but I think for a project that's high-profile enough to be flooded by bogus bug reports, bugs that random users just happen to notice will probably also get found by professional bug hunters at some point.

bluGill 7 hours ago | parent [-]

Only if the system is fair. If I as a maintainer want to run a scam, I can just close the report as invalid and collect the $$$. Then a week later I fix the issue with a commit that looks unrelated.

I wouldn't do the above, but it is easy to see how I could run that scam.

yorwba 6 hours ago | parent [-]

You can look at how the maintainer dealt with previous bug reports to decide whether you can trust them or not. If there haven't been any previous bug reports but they nonetheless ask for a fee to help deal with the large volume of bug reports, yeah, that might be a scam. If you're running their software, maybe also check whether it's full of malware.

bluGill an hour ago | parent [-]

Good scammers pay out once in a while. Casinos hate it when everybody loses, and love it when one person wins big - they need enough losers to make money and just enough winners to show off.

ufmace 4 hours ago | parent | prev | next [-]

I get what you're saying, but I don't think punishing bad behavior is practical here. It's like an "enumerating badness" problem - there are way more bad actors with nothing to lose and not much practical way to punish them. There are too many of them, and none of them has a reputation to damage.

Not saying I have a better solution, just that it's a hard problem. Maybe dissuading some good people who have genuine security issues but don't feel like paying just has to be a cost of doing business.

ezst 9 hours ago | parent | prev [-]

Punishing bad behaviour does close to nothing, because the problem at hand is one of high asymmetry between the low effort to submit vs the high effort to review. I do agree that paying for reports isn't ideal, and we should find other ways to level the playing field, but in the meantime I haven't heard of anything as effective.

jraph 9 hours ago | parent [-]

> the problem at hand is one of high asymmetry between the low effort to submit vs the high effort to review

Hence the threat to shame publicly I suppose.

Actually, Daniel Stenberg previously responded to this proposal the same way I did [1] (and maybe still would). Coincidentally, I was reading your answer at about the same time as this part of the talk.

[1] https://www.youtube.com/watch?v=6n2eDcRjSsk&t=1823s (via https://news.ycombinator.com/item?id=46717556#46717822)

ezst 9 hours ago | parent [-]

Doesn't work against throwaway accounts; the low effort gets only marginally higher.

creata 10 hours ago | parent | prev | next [-]

> Even if a machine generates them.

That sounds wonderfully meritocratic, but in the real world, a machine generating a report is a very strong signal that it's bullshit, and people are using those machines to flood maintainers. Maintainers don't have infinite time.

MBCook 6 hours ago | parent | prev | next [-]

What was a kind design to thank good contributors is now a lottery.

Throw enough AI crap at enough projects and you may get a payout.

The incentives fail in the face of no-effort flooding. They accidentally encourage it.

hobs 10 hours ago | parent | prev [-]

To be clear, no, it is not, because of the opportunity cost of all the other slop. That's what this is all about.

johnisgood 10 hours ago | parent [-]

Then no bug reports and no fixes. Sounds good enough.

latexr 10 hours ago | parent | next [-]

Of course there are still bug reports and fixes without financial compensation. The proof is all of open-source, including cURL.

mikkupikku 10 hours ago | parent | prev [-]

They'll still get bug reports and fixes from people who actually give a shit and aren't just trying to get some quick money.