PeterWhittaker 5 hours ago
I enjoyed the article, though I do have to pick nits with:

> Software used to be deterministic

Ah, someone fortunate enough never to have coded a heisenbug or tripped over UB of various causes. I've written plenty of well-structured, well-thought-out, mostly-deterministic software, then spent hours or days figuring out what oversight summoned the gremlins.

(There is one low-priority bug I've occasionally returned to over the last two to three years, in case experience and back-burner musing might yield insight. Nope. Use gcc, no bug; use clang, bug, always, regardless of -O level, debug level, etc. Everything else, all of it far more complex, works 100% reliably; it's just that one display update that fails.)

(It occurs to me that this is a bad example, because it IS deterministic, but none of us can pinpoint the "determiner".)
grayhatter 5 hours ago | parent
All code is deterministic for a given input; if you don't understand the behavior, it's because you don't understand the complete set of inputs into the system. Assuming you're not tripping over a hardware defect, it sounds like you're relying on gcc-specific behavior that LLVM doesn't guarantee for that display update; possibly a memory-ordering issue.