rainonmoon 2 days ago

Try working at a company of any remote public significance and see if your view changes.

Nextgrid a day ago | parent | next

There's a lot of performative "security" in such companies. You need to employ the right people (ideally a "CISO" who's never actually used a terminal in their life), you need to pay money for the right vendors, adopt the right buzzwords, and so on. The amount of money being spent on performative security is insane, all of it directed by people who can't even "hack" a base64-"encrypted" password.
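
(To make that last jab concrete, a minimal sketch with a made-up placeholder value: base64 is a reversible encoding, not encryption, so "decrypting" such a password takes one standard-library call and no key.)

    import base64

    # base64 is an encoding, not encryption: it reverses with no key at all.
    encoded = base64.b64encode(b"hunter2")   # made-up example "secret"
    print(encoded)                           # b'aHVudGVyMg=='
    print(base64.b64decode(encoded))         # b'hunter2'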

All while there's no budget for those who actually develop and operate the software (so you get insecure software), those who nevertheless do their best are slowed down by all the security theater, and customer service is outsourced to third-world boiler rooms, so exploiting vulnerabilities doesn't even matter when a $100 bribe will get you in.

It's "the emperor has no clothes" all the way down: because any root-cause analysis of a breach (including by regulators) will also be done by those without clothes, it "works" as far as the market and share price is concerned.

Source: been inside those "companies of public significance" or interacted with them as part of my work.

throwawaysleep 2 days ago | parent | prev

Equifax? Capital One? 23andMe? My basis for this is that you can leak everyone’s bank data and barely have it show up in your stock price chart, especially long term.

rainonmoon 2 days ago | parent | next

Stock price is an extremely narrow view of the total consequences of lax cybersecurity, but that aside, the notion that security doesn’t matter because those companies got hacked is ridiculous. The reason there isn’t an Equifax every minute is that an enormous amount of effort and talent goes into ensuring that’s the case. If your attitude is that we should vibe-code our way past the need for security, you aren’t responsible enough to hold a single user’s data.

ChrisMarshallNY 2 days ago | parent

I feel as if security is a much bigger concern than it ever was.

The main issue seems to be that our artifacts are now so insanely complex that there are too many holes, and modern hackers are quite different from the old skiddies.

In some ways, it’s possible that AI could be a huge boon for security, but I’m worried because its training data is brogrammer crap.

Nextgrid a day ago | parent

Security has become a big talking point, and industry vultures have zeroed in on that and will happily sell dubious solutions that claim to improve security. There is unbelievable money sloshing around in those circles, even now during the supposed tech downturn ("security" seems to be immune to this).

Actual security, on the other hand, has decreased. I think one of the worst things to happen to the industry is "zero trust": now any exposed token or lapse in security is exploitable by the whole world, instead of an attacker first having to get through a VPN layer (no matter how weak that layer is, it's better than not having it).

> quite different from the old skiddies

Disagreed - if you look at the worst breaches ("Lapsus$", Equifax, etc.), it always came down to something stupid - social engineering that conned a vendor into handing over the keys to the kingdom, a known-vulnerable version of a Java web framework, yet another compromised NPM package that they immediately updated to because the expensive, enterprise-grade Dependabot knockoff told them to, and so on.

I'm sure APTs and actual hacking exist in the right circles, but they're not behind the majority of breaches. You don't need an APT to breach most companies.

ChrisMarshallNY 2 days ago | parent | prev

I don't know that 23andMe has done so well, but many of their problems stem from a bad business model, as opposed to that awful breach.

I agree that we need to have "toothier" breach consequences.

The problem is that there's so much money sloshing around that we have regulatory capture.