lenerdenator 2 days ago

> Moreover, making them enormous (as you put it well "valuation-cratering") unfairly punishes people who are not directly responsible for the failure.

For better or for worse, that's how we've set up our system. The entire point of incorporation is to separate the people working at a company from the company, legally speaking. The most they can really do is fire you.

With respect to valuation-cratering, that's supposed to be considered fair in our system. If a bunch of shareholders elect a board that lets a C-suite operate a company with lax security culture, they're ultimately responsible for the losses they incur. It's only fair; they're the ones getting the profits, too.

I put emphasis on "supposed" because we don't really do this anymore. Instead of expecting shareholders to take the bath, we shift the loss onto the customers, who have to put up with the consequences of identity theft out of their own pockets.

jcgrillo a day ago

In some professions incorporation doesn't shield you from malpractice claims. We have a software quality problem in tech, to the point where buggy, broken software has become completely normal. Security vulnerabilities are so commonplace and routine that companies hire entire departments to respond to security breaches.

We don't have bridges falling down left and right, airplane crashes aren't common, trains don't leave the rails every day. Software is all grown up now, and we need to grow up as a discipline whether we're ready or not. It'll get a lot worse before this happens, but I'm convinced it will happen eventually.

lenerdenator a day ago

That would require us to have the equivalent of a bridge falling down.

I'd say some of that depends on the domain that the software is developed for. I've spent most of my 12 years developing software in healthcare IT. Typically, you don't see too many critical (meaning life-threatening) bugs in EMR/EHR software, which is one of those domains where you'd think it'd be easy to run into that sort of thing. Most of the problems in the domain have to do more with data access being granted or obtained by someone who shouldn't have it. You won't die or get seriously injured as a result of the software, but some guy in a dank basement outside Moscow might know you need your knee replaced.

A lot of that comes from the fact that software for systems that could have a bridge-collapse level of impact is already certified as part of a larger regulatory scheme for the domain in which the software operates. Healthcare and avionics software instantly come to mind. A lot of people in the KC area make their living writing software for those domains, and while they aren't required to have an engineering license to do so, their wares have to be vetted thoroughly enough that they effectively work at that level anyway.

You'd need to convince lawmakers to set up a regulatory body that tracks business and consumer software for security risks the same way we track EHRs for patient safety risks.

jcgrillo a day ago

It's more of a slow burn than any big event you can point to, so it's possible nothing will get done. Hopefully the political pressure will develop as the political class becomes more tech-savvy, which will happen over time as younger people who grew up with technology replace the older generation. If the AI bubble pops in a way that causes a financial crisis, I think we'll see a lot more scrutiny of the tech industry in general, which could lead to progress in quality. Maybe it won't happen, but the current state of things getting shipped before they're done, constant outages, everything being kind of broken all the time... it just doesn't seem sustainable.