js4ever 2 days ago

It is highly irresponsible to disclose security vulnerabilities publicly, and in some jurisdictions it may even be illegal.

While I understand that the author attempted to contact Monster without receiving a response, publishing details of the vulnerabilities and how to exploit them only puts users at greater risk. This approach is reckless and harmful.

darth_avocado 2 days ago | parent | next [-]

It is common practice to give the company sufficient time and communicate, and then release the details once the vulnerability is patched. But it’s also common in practice to disclose the vulnerability after a set period of time if the company does not engage in any form of communication and refuses to patch the vulnerability. In this case they didn’t engage in any form of communication and then partially patched the problems. Nothing out of the ordinary here.

eclipticplane 2 days ago | parent | next [-]

What _isn't_ common practice is actually copying and posting company material on your blog. Just because a door is unlocked does not give you the right to take materials & post them.

93po 2 days ago | parent [-]

This requires you to have any amount of respect for intellectual property, which many find to be immoral

none_to_remain 2 days ago | parent | prev [-]

I have seen this in practice for vulnerabilities that affect many users of some software. If some Hackermann finds that Microsoft Windows version X or Oracle Database server version Y has a security flaw then disclosure is virtuous so that people using those can take measures. That reasoning doesn't seem to apply here.

dh2022 2 days ago | parent | prev | next [-]

My understanding is that this is the standard SOP for security vulnerabilities: 1. Report the security vulnerabilities to the “victim”. 2. Work with the “victim” on a schedule for mitigation and publication. 3. Publicize the vulnerabilities (the security researcher wants his findings to be publicly recognized).

If the victim does not acknowledge the issue, it is impossible to execute step 2. So the security researcher goes straight to step 3.

If the researcher has the emails sent at step 1, he will be fine.

jhanschoo 2 days ago | parent [-]

OP leaked internal business documents as part of their disclosure that had no business being in a disclosure. It looks like some employee details were leaked as well, which is very bad.

martin-t 2 days ago | parent | prev | next [-]

These companies treat fines as the cost of doing business and every time they lose people's personal information, they get slapped on the wrist and laugh it off while the execs get bonuses for having someone write a tearful apology to appear like victims.

I am happy every time somebody makes enough noise to make them notice and fix it because being polite and legal clearly is not working.

IlikeKitties 2 days ago | parent | prev [-]

Nah, fuck that noise. If the company reacts to a responsible disclosure notice that's nice but no one is under any obligation to help out mega corps to secure their shit. And the users aren't put at risk by the people finding the vulnerability but by the company not fixing it.

Fuck Responsible disclosure, companies should have to bid on 0 days like everyone else.

Ekaros 2 days ago | parent | next [-]

One probably should not release information from a company they hacked.

On the other hand, if it is some piece of software, immediate public disclosure is the only reasonable and prudent action. It allows every user to take necessary mitigation actions, like taking their services and servers offline.

pizzalife 2 days ago | parent | prev | next [-]

There is a market for capabilities, i.e. zero-days in widely used software. It has value, sometimes in the millions.

No one will buy some shitty XSS on a public website.

js4ever 2 days ago | parent | prev [-]

That argument misses the point. Yes, the company has the primary responsibility to fix their vulnerabilities, but that doesn’t justify recklessly publishing exploits. Once an exploit is public, it’s not just 'the company' that suffers, it’s every customer, employee, and partner who relies on that system.

Saying 'fuck responsible disclosure' is basically saying 'let’s hurt innocent users until the company caves.' That’s not activism, that's collateral damage.

If someone genuinely cares about accountability, there are legal and ethical ways to pressure companies. Dumping 0-days into the wild only helps criminals, not users.

IlikeKitties 2 days ago | parent | next [-]

> Saying 'fuck responsible disclosure' is basically saying 'let’s hurt innocent users until the company caves.' That’s not activism, that's collateral damage.

Correct. And I have good reasons for that. Activism has failed, consequences are required. The inevitable march towards the end of privacy due to the apathy of the unthinking majority of careless idiots will only be stopped when everyone feels deeply troubled by entering even the slightest bit of personal information anywhere because they've felt the consequences themselves.

> If someone genuinely cares about accountability, there are legal and ethical ways to pressure companies. Dumping 0-days into the wild only helps criminals, not users.

I could point to probably thousands of cases where there wasn't any accountability or it was trivial to the company compared to the damage to customers. There's no accountability for large corporations, the only solution is making people care.

93po 2 days ago | parent | prev [-]

let's be clear here, though: the root problem isn't someone finding sensitive papers accidentally left on a printer, it's the person who left them on the printer to begin with. that's the root failure, and any damage that results from it is the fault of the person who left them there.

the american system clearly agrees with this, too. you see it in insider trading laws. you're allowed to trade on insider information as long as it was, for example, overheard at a cafe when some careless blabbermouth was talking about the wrong things in public.