devmor 7 days ago
> I've started to think of LLMs as a form of lossy compression of available knowledge which, when prompted, produces "facts".

That is almost exactly what they are and how you should treat them: a lossily compressed corpus of publicly available information with a degree of randomness added. The most fervent skeptics like to call LLMs "autocorrect on steroids", and they are not really wrong.
uh_uh 7 days ago | parent
An LLM is "autocorrect" inasmuch as humans are "replicators". Something important gets lost in that explanation.