| ▲ | lukeify 6 hours ago |
Most humans also write plausible code.
| ▲ | tartoran 6 hours ago | parent | next [-]
LLMs piggyback on the human knowledge encoded in the texts they were trained on, without understanding what they're doing. A human would execute that code and validate it: "plausible" becomes "hey, it does this, and this is what I want." LLMs skip that part. They have no understanding beyond the statistical patterns they infer from their training data, and for what they are, they don't need any.
| ▲ | gitaarik an hour ago | parent | prev | next [-]
All code is plausible by design.
| ▲ | 6 hours ago | parent | prev [-]
[deleted]