wavemode | 2 days ago
I think, rather, you're romanticizing what "real" engineering looks like. Real engineering doesn't mean that mistakes are never made or that there are never bugs. Rather, it means that systems are tested thoroughly enough, and designed with enough failsafes and redundancy, that safety concerns are mitigated.

The problem in the Boeing case was not that the software had bugs. Lots of aviation software has bugs; it's actually very common. Rather, the problem was that they did not design the system to be safe in the event that a bug occurred. What that looks like exactly tends to differ depending on the system. As a common example, many aircraft systems have other, independent systems which monitor them and emit a warning if they detect something that doesn't make sense (see the sketch below).

But adding that kind of warning would've required Boeing to create technical material telling pilots how to respond to it, which would've required training updates, which would've required recertification of their plane design, a cost Boeing desperately wanted to avoid. Fortunately (unfortunately), FAA oversight had become very lax, so Boeing instead just downplayed the safety concerns and nobody asked any questions.
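To make the monitor idea concrete, here's a toy sketch. This is not Boeing's actual logic; the sensor pairing and the threshold value are assumptions for illustration only. The point is just the shape of the failsafe: cross-check redundant inputs and annunciate disagreement, rather than letting one bad sensor silently drive automation.

    # Toy cross-check monitor. Assumptions (not from the thread):
    # two redundant angle-of-attack sensors and a made-up threshold.
    AOA_DISAGREE_THRESHOLD_DEG = 5.0  # hypothetical value

    def aoa_monitor(left_deg: float, right_deg: float) -> str | None:
        """Warn when redundant sensors disagree, instead of silently
        feeding a single possibly-bad reading to downstream systems."""
        if abs(left_deg - right_deg) > AOA_DISAGREE_THRESHOLD_DEG:
            return "AOA DISAGREE"  # crew alert; automation can be inhibited
        return None

    # e.g. aoa_monitor(14.2, 2.1) -> "AOA DISAGREE"

The monitor itself can be buggy too, of course; the design property that matters is that a single failed input no longer fails silently.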
marcosdumay | 2 days ago
> Rather, it is that systems are tested thoroughly enough, and designed with enough failsafes and redundancy

Yeah... that's one of the main reasons engineers from most other disciplines have so much difficulty creating large, reliable software. Testing systems and adding failsafes are not nearly enough for software reliability, and they're not the best way to achieve it. They're almost enough for mechanical engineering... almost, because they're not enough to make human interactions safe.
| |||||||||||||||||