dzaima 19 hours ago
Even denormals and NaNs should be perfectly consistent, at least on CPUs (as long as you're not inspecting the bit patterns of the NaNs). Irrational stdlib functions (trig/log/exp; not sqrt, which IEEE-754 requires to be correctly rounded) should really be basically the only source of non-reproducibility in typical programs — assuming they don't explicitly do different things depending on non-controlled properties, don't use libraries that do (also a non-trivial ask), and aren't compiled by an overly-aggressive optimizer that applies value-changing transformations. I'd hope that languages/libraries providing seeded random sources with a guarantee of equal behavior across systems would explicitly note which operations aren't reproducible, though; otherwise seeding is rather pointless. No clue if R does that.
AlotOfReading 16 hours ago | parent
Denormals are consistent when they're handled, but there's hardware variability in whether or not they're handled at all (even between different FPUs on the same processor, in some cases), and depending on codegen and compiler settings (e.g. flush-to-zero modes) you might get different results. NaN handling is ambiguous in two different ways. First, a binary operation given two NaN inputs is specified by IEEE-754 to return a NaN, but which one's payload survives (if they differ) isn't defined, and different hardware makes different choices. Second, I'm not aware of any language where NaN propagation has simple and well-defined semantics in practice. LLVM essentially says "you'll get one of four behaviors, chosen arbitrarily, in each function", for example.