▲ | invalidptr 5 days ago |
> All previous programming abstractions kept correctness

That's not strictly true, since most (all?) high-level languages have undefined behavior, and their behavior varies between compilers/architectures in unexpected ways. We did lose a level of fidelity. It's still smaller than the loss of fidelity from LLMs, but it is there.
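To illustrate the kind of variance meant here, a minimal C sketch (the function name is just for illustration): signed integer overflow is undefined, so the observable result can legitimately differ between compilers and optimization levels.

    #include <limits.h>
    #include <stdio.h>

    /* Signed overflow is undefined behavior in C. An optimizing compiler
       may assume x + 1 > x always holds for signed int and fold this to 1,
       while an unoptimized build may compute the wrapped value and return 0. */
    static int succ_is_greater(int x) {
        return x + 1 > x;   /* UB when x == INT_MAX */
    }

    int main(void) {
        printf("%d\n", succ_is_greater(INT_MAX));  /* output depends on compiler/flags */
        return 0;
    }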
▲ | pnt12 5 days ago | parent | next [-]
That's a bit pedantic: lots of Python programs will work the same way on all the major OSs, and if they don't, someone will likely debug the specific error and fix it. LLMs, on the other hand, frequently hallucinate in non-deterministic ways.

Also, there seems to be little chance for knowledge transfer. If I work with dictionaries in Python all the time, I'm eventually better prepared to go under the hood and understand their implementation. If I'm prompting an LLM, what's the bridge from prompt engineering to software engineering? There's no such direct connection, surely!
▲ | ashton314 5 days ago | parent | prev | next [-]
Undefined behavior does not violate correctness. Undefined behavior is just wiggle room for compiler engineers so they don't have to worry so much about certain edge cases. "Correctness" must always be considered with respect to something else. If we take e.g. the C specification, then yes, there are plenty of compilers that are correct according to that spec, UB and all, in almost every way people will encounter. Yes, there are bugs, but they are bugs and they can be fixed. The LLVM project has a very neat tool called Alive2 [1] that can verify optimization passes for correctness.

I think there's a very big gap between the kind of reliability we can expect from a deterministic, verified compiler and the approximating behavior of a probabilistic LLM.
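A standard sketch of that "wiggle room" (an illustration only, unrelated to Alive2 itself): because dereferencing a null pointer is UB, an optimizer may assume the pointer is non-null afterwards and delete a later null check, and either translation remains correct with respect to the C spec.

    /* Dereferencing a null pointer is undefined behavior, so after the
       load the compiler may assume p != NULL and remove the check below.
       Both the checked and the check-free translations are "correct"
       with respect to the C standard. */
    int read_len(const int *p) {
        int len = *p;           /* UB if p == NULL */
        if (p == NULL)          /* may be optimized away */
            return -1;
        return len;
    }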
▲ | ndsipa_pomu 5 days ago | parent | prev | next [-]
However, the undefined behaviours are specified and known about (or at least some people know about them). With LLMs, there's no way to know ahead of time that a particular prompt will lead to hallucinations.
▲ | sieabahlpark 5 days ago | parent | prev [-]
[dead]