adrianN 2 days ago

There already are areas where such standards exist, e.g. safety critical applications in aviation. Arguably the defect rate there _is_ lower, but I still think the method for achieving it is quite inefficient. And defining a process for writing aviation software that doesn't crash is a lot easier than defining one for writing software that is difficult to hack.

jcgrillo 2 days ago | parent

The missing piece is the requirement for a certified Professional Engineer to sign off on the system. That decouples the engineer's incentives from corporate objectives and makes it personal. We need that kind of professional accountability in software; otherwise it'll continue to be bad.

adrianN 2 days ago | parent

It is my understanding that personal responsibility already exists in safety critical software development.

jcgrillo 2 days ago | parent

I hadn't actually heard of this (I have never worked on safety critical systems), but doing some googling I found a reference to a "Designated Engineering Representative" in the Wikipedia article on DO-178B, the standard the FAA uses to certify airborne software: https://en.wikipedia.org/wiki/DO-178B

I wasn't able to find much information about U.S. P.E. certification for SWEs, although there is at least one state that offers it. Nor could I find any indication that a compliance process requires a P.E. to sign off on software. That doesn't mean it doesn't exist, though!

One major problem is that now that software is "everywhere", it's escaping the boundaries of safety critical standards. Nobody will be killed directly by a bank getting hacked, but it could result in mortal harm to an individual whose identity is stolen. There are all kinds of systems that aren't labeled safety critical in the kinetic sense but are nonetheless very load-bearing: software that runs on phones, for example. Surely people have died due to buggy phone software. Nobody is being held meaningfully accountable, so it will continue to happen.

To be clear, I'm not saying we should heap a whole lot more pressure onto security teams. Instead, we need to find better ways to make security every engineer's professional ethical responsibility, either directly because they're signing off on the system or indirectly because their respected senior colleague is. I just don't see fines getting us there.

adrianN a day ago | parent

I understand your argument, but I feel that good development practices are essentially a company culture question. Putting additional burden on engineers does not sit well with me: in my experience, many engineers already care more about quality and security than their management is willing to allocate manpower for. If we make it easier to turn engineers into scapegoats for management failures, this might actually backfire.
