uh_uh 7 days ago
An LLM is an autocorrect inasmuch as humans are replicators. Something gets seriously lost in this "explanation".
devmor 6 days ago
Humans do much more than replicate; that is one function of many. What does an LLM do, other than output a weighted prediction of tokens based on its training data? Everything you can use an LLM for is a manipulation of that functionality.
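The "weighted prediction of tokens" in the comment above can be made concrete with a toy sketch. This is not any real model's code: the vocabulary, logit values, and context are invented for illustration. The core step it shows is real, though: a model scores every candidate next token, softmax turns those scores into a probability distribution, and one token is sampled from it.

```python
import math
import random

def softmax(logits):
    """Turn raw scores into a probability distribution."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical candidate tokens and the scores a trained model
# might assign them after the context "The cat sat on the ..."
vocab = ["mat", "moon", "carburetor"]
logits = [4.0, 1.5, -2.0]

probs = softmax(logits)

# Sampling: likelier tokens are chosen more often, but any token
# with nonzero probability can appear.
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(vocab, [round(p, 3) for p in probs], next_token)
```

Everything an LLM produces, in this framing, is repeated applications of this one step: append the sampled token to the context and predict again.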
andsoitis 6 days ago
> An LLM is an autocorrect in as much as humans are replicators.

An autocorrect... on steroids.
xwolfi 6 days ago
What are humans, fundamentally, then?