martin-t a day ago

Programmers are willfully blind to this, at least until it's their code being stolen or their job being lost.

_LLMs are lossily compressed archives of stolen code_.

Trying to achieve AI through compression is nothing new.[0] The key innovation[1] is that the model[2] outputs not only the first-order input data but also the higher-order patterns in that data.

That is certainly one component of intelligence, but we need to recognize that the tech companies didn't build AI; they built a compression algorithm which, combined with the stolen input text, can reproduce that data and its patterns in an intelligent-looking way.
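To make the first-order vs. higher-order distinction concrete, here is a minimal sketch (my own toy illustration, not anything from the linked prize or from any actual LLM): a character-level unigram sampler reproduces only first-order statistics (letter frequencies), while an order-2 Markov sampler also reproduces higher-order patterns, i.e. recognizable fragments of the training text. The corpus and parameters are arbitrary examples.

    # Toy illustration: first-order statistics vs. higher-order patterns.
    import random
    from collections import Counter, defaultdict

    corpus = "the quick brown fox jumps over the lazy dog. " * 20

    # First-order: sample each character independently from its frequency.
    unigram = Counter(corpus)
    def sample_unigram(n):
        chars, weights = zip(*unigram.items())
        return "".join(random.choices(chars, weights=weights, k=n))

    # Higher-order: condition each character on the two preceding ones.
    trigram = defaultdict(Counter)
    for i in range(len(corpus) - 2):
        trigram[corpus[i:i+2]][corpus[i+2]] += 1

    def sample_trigram(n, seed="th"):
        out = seed
        for _ in range(n):
            nxt = trigram[out[-2:]]
            if not nxt:
                break
            chars, weights = zip(*nxt.items())
            out += random.choices(chars, weights=weights)[0]
        return out

    print(sample_unigram(60))  # letter soup with the right frequencies
    print(sample_trigram(60))  # chunks of the training text reappear

Scale the conditioning context up and make the statistics learned and lossy rather than counted exactly, and you have, loosely speaking, the kind of higher-order reproduction described above.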

[0]: http://prize.hutter1.net/

[1]: Oh, god, this phrase is already triggering my generated-by-LLM senses.

[2]: Model of what? Of the stolen text. If 99.9999% of the work needed to achieve AI hadn't been done by the people whose work was stolen, they wouldn't be called models.