john_strinlai (4 hours ago):
hope you are also blacklisting google's project zero, and practically every other major player in the vulnerability reporting space, as they all use roughly the same bog-standard 90+30 policy. this was a failure of the kernel security team and their stance on communicating security issues with their downstreams.

eaf7e281 (5 hours ago):
Same. They do become famous, but not in a wholly positive way.

esseph (3 hours ago):
I used to think the context of the fame mattered. At least in the US, it does not. Hell, CrowdStrike is still purchased.

selectively (6 hours ago):
Researchers are under no obligation to engage in coordinated disclosure and are free to sell 0day for profit. Just fyi. Be glad it was disclosed at all. Be glad a patch was available prior to release.

lambda (6 hours ago):
If they want to be seen as responsible rather than opportunistic, then yeah, they should do a proper coordinated disclosure. Sure, they have no legal obligation to disclose, but we also have no legal obligation to buy their services. Blacklisting bad actors like this is the right move to discourage this kind of behavior.

john_strinlai (3 hours ago):
> they should do a proper coordinated disclosure.

they did a proper coordinated disclosure, following the industry-standard 90+30 process. that is why the exploit dropped 30 days after the patch landed. the kernel team should have communicated with their downstreams about the importance of the patch. that is the kernel security team's responsibility -- and they are much better positioned to do it than a process of crossing your fingers and hoping every reporter will contact every distro every single time there is a vulnerability. there are very good reasons disclosure works this way, backed by a couple of decades of debate about it.

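The 90+30 timeline debated throughout this thread can be sketched as a small date calculation. This is a simplified model, not any vendor's exact policy: the function name, constants, and dates are illustrative assumptions, and real policies (such as Project Zero's) include grace-period and per-case exception rules not modeled here.

```python
from datetime import date, timedelta

# Simplified "90+30" coordinated-disclosure model (illustrative only).
PATCH_DEADLINE_DAYS = 90   # vendor has 90 days from the report to ship a fix
ADOPTION_DELAY_DAYS = 30   # technical details wait another 30 days for patch uptake

def disclosure_dates(reported: date, patched: date) -> dict:
    """Compute key dates under a simplified 90+30 policy."""
    vendor_deadline = reported + timedelta(days=PATCH_DEADLINE_DAYS)
    # Details go public 30 days after the fix lands, or 30 days after the
    # 90-day deadline if the fix was late (whichever date is earlier wins
    # as the starting point in this simplified model).
    details_public = min(patched, vendor_deadline) + timedelta(days=ADOPTION_DELAY_DAYS)
    return {"vendor_deadline": vendor_deadline, "details_public": details_public}

# Example: reported Jan 1, patched Mar 1 -> details may be published Mar 31.
dates = disclosure_dates(date(2025, 1, 1), date(2025, 3, 1))
print(dates["details_public"])  # 2025-03-31
```

In this model, an exploit write-up "dropping 30 days after the patch landed" is exactly the expected outcome, which is the point the comment above is making.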
selectively (6 hours ago):
Who cares about how you are seen when you are selling 0day for big bucks? The bad actor makes more money than the "legitimate" one without breaking any law. Punishing someone who didn't alert distros, despite a patch being available, encourages companies to simply find flaws and sell them for profit; it pays more to begin with.

_yttw (6 hours ago):
If they want to take advantage of disclosure for marketing, they're either going to need to accept the norms around responsible disclosure, or they're going to need to accept how shirking those norms will come off. That's life in society. Sometimes it's annoying and sometimes it doesn't feel rational, but these norms have been negotiated throughout the history of our industry and are the way they are for reasons good and bad.

I just don't see the point in complaining about how shirking the norms of your industry will make you look irresponsible. I don't really care that they could have decided to sell the vulnerability instead. It isn't material.

tptacek (5 hours ago):
It is absolutely not true that viable commercial vulnerability labs need to "accept the norms around responsible disclosure". There are no such norms. "Responsible disclosure" is an Orwellian term cooked up between @stake and Microsoft and other large vendors to coerce researchers into synchronizing with vendor release schedules. It was fantastically successful at that, and it's worth pushing back on at every opportunity.

Tavis Ormandy dropped Zenbleed right onto Twitter. He's doing fine. You can blacklist him if you want; I imagine he's not going to notice.

SCHiM (5 hours ago):
Microsoft's policy is: "if you contact us with a vulnerability, you automatically agree to the terms of our responsible disclosure policy", which includes waiting 30 days after a patch was created, and says nothing about how long that process takes. There is actually no way to give them a friendly heads-up and then do your own thing. The only way not to be bound is to not send them any notification at all...

prmoustache (2 hours ago):
Since no contract is signed, this is just pure fantasy on your part.

leni536 (4 hours ago):
I wonder if "if you contact us... you automatically agree" would stand up in court. That's just ridiculous.

_yttw (5 hours ago):
You're right, they don't need to. They have an alternative: to accept what people say or think about them in response. That's what I said.

expedition32 (19 minutes ago):
So how do we feel about Linux distributors who had their heads up their asses and sat on their hands for 30 days?

selectively (6 hours ago):
Those norms do not exist. Those are people asking companies to do stuff, for free, to benefit the person complaining, and many companies will not do that.

_yttw (6 hours ago):
It seems to me you're unaware of them, but there are strong norms around disclosure. They've been discussed for decades. The expectation is that vendors would be notified in a scenario like this.

selectively (5 hours ago):
No, there are users who want those to be norms. Qualified researchers happily sell substantive vulns to people who pay enough (governments, Cellebrite, and companies like that) to quell any complaint.

_yttw (5 hours ago):
Which is, again, irrelevant to the question of how disclosure works and what expectations there are around it, because that is not disclosure and is not what was being discussed.

dirasieb (5 hours ago):
it's called building and preserving a high-trust society, you wouldn't understand

DaSHacka (2 hours ago):
How does someone being incentivized to sell a vulnerability to a private organization, rather than disclosing it publicly, preserve a "high trust society"? Do you mean in the context of a "deceptively high-trust society"? Those private actors aren't planning to sit around and hold onto the exploits they've hoarded forevermore; they're obviously paying for them so they can one day use them.

lrvick (4 hours ago):
Unfortunately this is correct. As a security researcher, I have set millions in profit on fire by reporting vulns to projects that offer no bounties rather than selling to the highest bidder. I keep doing it because it is the right thing to do, but I would not blame someone who needs to feed their family for making a different choice. We must get public funds to reward ethical disclosure of big-impact vulns like this.

selectively (2 hours ago):
It gets harder and harder to get good policy like what you describe when tech-adjacent people loudly argue for criminal penalties for anything other than coordinated disclosure :(

bigbadfeline (3 hours ago):
> Researchers are under no obligation to engage in coordinated disclosure and are free to sell 0day for profit. Just fyi. Be glad it was disclosed at all.

I'm so glad these so-called "researchers" aren't totally evil, I'm so grateful they're only half evil, give them a lollipop. Whatever, the way they disclosed it isn't much different from no disclosure at all; the exploit would have been identified in the wild and fixed soon thereafter. "Researchers"...

john_strinlai (3 hours ago):
the way they disclosed it is the industry standard. think of the biggest security research teams you know (e.g. google's), and they follow the same process. non-security people always seem to get up in arms about it, but there are very good reasons why the industry has landed on the process it has, which has been hashed out over a few decades.

selectively (2 hours ago):
There are two options:

1. Status quo. Researchers are free to disclose to a vendor, free to sell vulns to legitimate companies, free to do full disclosure if they want. This situation benefits security. Researchers are able to pay their bills while also doing meaningful research into OSS projects that are unable to fund the kind of security audit they need. Harm reduction, of sorts.

2. Everyone is a bad actor. No one is going to do this work for free or for a bounty. Horrible flaws will be found and shared with ransomware gangs and the like. 0day will sell for a percentage of the ransom winnings. Researchers will live like kings; everyone else will suffer.

Which do you prefer?

jojomodding (5 hours ago):
> are free to sell 0day for profit.

This is not true in many jurisdictions.

lrvick (4 hours ago):
Anyone can sell a vuln in any jurisdiction and never be caught. Let's not pretend the law is actually worth a damn here. We need an anonymous bounty system.

selectively (2 hours ago):
Are you claiming that if I sell 0day through a broker to the national government of a given jurisdiction, that government is going to criminally penalize me? If so, that's a bit naive. In the actual world, that buyer wants to buy more stuff from me, not penalize me.

kelnos (5 hours ago):
I'm pretty sure they have a legal obligation in most jurisdictions not to sell 0days for profit. And they absolutely have a moral obligation to do things in a way that minimizes damage and impact to other people's systems. (I'm not saying "responsible disclosure" is the correct way to do that, but hoarding vulnerabilities and exploits and selling them to the highest bidder certainly isn't.) This is how society needs to work.

tptacek (an hour ago):
It is categorically false that there's a legal obligation not to sell vulnerabilities. There's an obligation not to knowingly sell them directly to ongoing criminal enterprises. That's it. Plenty of people make fuckloads of money selling vulnerabilities for exploitation rather than repair.

lrvick (4 hours ago):
Let me make you aware of Zerodium: a broker anyone can sell vulns to, which sells to unspecified buyers you do not need to know about.

selectively (2 hours ago):
(The buyers are the NSA, the IDF, Cellebrite, NSO and its successor corporation, and that kind of thing. Depends on what you are offering.)

You'll learn who the buyers are if you routinely have the really good stuff to sell! If you are offering iOS zero-click on a semi-regular basis, the buyer is going to want to deal with you directly and preferably offer you a more regular form of employment, if you are interested. Some national governments may offer certain benefits to you, depending on your situation. It all depends on what you have to offer.

If you were able to offer this https://arstechnica.com/security/2025/09/microsofts-entra-id... or something of that magnitude, a lot of problems in your life would just go away. The buyers would all be Five Eyes, and the intelligence gain of having that kind of access even briefly is priceless.

In a more Western-centric context, imagine if you had a flaw like that, same "no logs are generated" and "every single customer account is accessible", but the impacted vendor was Alibaba Cloud. The researcher would get to name their price. That's the real world, the world we share. We shouldn't be blind to it.

mschuster91 (5 hours ago):
> I'm pretty sure they have a legal obligation in most jurisdictions not to sell 0days for profit.

It wasn't sold for profit, it was openly disclosed.

> And they absolutely have a moral obligation to do things in a way to minimize damage and impact to other people's systems.

All that "responsible disclosure" does is keep people from demanding better.

ux266478 (4 hours ago):
mmmmmm, no, it would seem like they are absolutely under a social obligation not to do that.

grayhatter (5 hours ago):
> Researchers are under no obligation to engage in coordinated disclosure and are free to sell 0day for profit.

Uh... no? If you mean legally, some might be free to, depending on jurisdiction. But ethically? Yes, researchers are ethically obligated to disclose responsibly.

> Just fyi. ... Be glad it was disclosed at all. Be glad a patch was available prior to release.

I am glad that a patch was available. Equally, I can be glad that the Linux community is strong enough to respond quickly while also being angry that this person behaved unethically. Likewise, when people in my industry behave poorly or unethically, I'm now the person ethically obligated to both point it out and condemn it, not to become an apologist insisting I should be happy watching bad things happen when much of the fallout could have been prevented with a bit less incompetence and ignorance.

eschaton (5 hours ago):
They should have a legal obligation to engage in coordinated/responsible disclosure, and it should be a crime to sell or disclose a 0day to anyone other than a state-designated security organization or the vendor/provider. If it won't be handled through criminal law, then it'll be handled through civil litigation: anyone who was exploited as a result of this disclosure should sue the discloser for contributing to the damage they've suffered.

CSSer (6 hours ago):
Yes, exactly. Name and shame.

true_religion (6 hours ago):
Same. I did not know who they were, but now they have been named and shamed. Not all publicity is good.
