codebje 17 hours ago
Specifications that are formally verified can definitely cover real-time guarantees, behaviour under error returns from operations like allocation, and similar things. Hardware failures can be accounted for in hardware verification, which is much like software verification: specification + hardware design = verified design; if the spec covers it, the verification guarantees it.

Considering software alone isn't useless, either. Nor is the guarantee that "inc x = x - 1" will always go from an Int to an Int: even if the function isn't "fully right", at least an attempt to increment a string or a complex number will be rejected at compile time.

Giving up on any improvement in the correctness of code because it doesn't get you all the way to 100% correct is, IMO, defeatist. (Giving up on it because it has diminishing returns and isn't worth the effort is reasonable, of course!)
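The "type-correct but not fully right" point can be sketched in Python with type hints standing in for the Haskell-ish snippet above (the function name and values are illustrative, not from either comment):

```python
def inc(x: int) -> int:
    # Deliberately buggy body: decrements instead of incrementing.
    # The types still check -- "not fully right", yet type-correct.
    return x - 1

# A static checker such as mypy rejects a call like inc("hello")
# before the program ever runs: the type error is caught at check
# time, while the logic bug (x - 1 instead of x + 1) is not.
print(inc(41))
```

This is the asymmetry the comment describes: the checker guarantees the function maps ints to ints, which rules out a whole class of misuse, without guaranteeing the arithmetic is correct.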
charcircuit 15 hours ago | parent
Hardware verification doesn't prevent hardware failures. There is a reason RAM comes with ECC, and it's not that RAM designers are too lazy to do formal verification. Even with ECC RAM, bit flips can still get through if multiple bits flip at the same time. There are also rarer failures, like a CPU occasionally taking the wrong branch. You can't assume the hardware will work perfectly in the real world; you have to design for failure.