devmor 7 days ago

> I've started to think of LLMs as a form of lossy compression of available knowledge which, when prompted, produces "facts".

That is almost exactly what they are and what you should treat them as.

A lossily compressed corpus of publicly available information with a dose of randomness. The most fervent skeptics like to call LLMs "autocorrect on steroids," and they are not really wrong.

uh_uh 7 days ago | parent [-]

An LLM is an autocorrect inasmuch as humans are replicators. Something important gets lost in that "explanation".

devmor 6 days ago | parent | next [-]

Humans do much more than replicate; replication is just one of our many functions.

What does an LLM do, other than output a weighted prediction of tokens based on its training data? Everything you can use an LLM for is a manipulation of that functionality.
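The "weighted prediction of tokens" being described can be sketched in a few lines. This is a toy illustration only; the vocabulary and scores are made up, and a real LLM computes its logits with a large neural network rather than a hardcoded list:

```python
import math
import random

# Hypothetical vocabulary and next-token scores (logits) -- invented for
# illustration, not from any real model.
vocab = ["the", "cat", "sat", "mat"]
logits = [2.0, 1.0, 0.5, 0.1]

# Softmax: turn raw scores into a probability distribution.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The "weight of randomness": sample the next token from that distribution
# rather than always taking the single most likely one.
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(next_token)
```

Generation is just this step in a loop: append the sampled token to the context and predict again.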

andsoitis 6 days ago | parent | prev | next [-]

> An LLM is an autocorrect inasmuch as humans are replicators.

an autocorrect... on steroids.

xwolfi 6 days ago | parent | prev [-]

What are humans, fundamentally, then?

vrotaru 6 days ago | parent [-]

That is a good question, and I guess we have made some progress since Plato, whose definition was: a man is a featherless biped.

But I think we still do not know.