lunar_mycroft 3 hours ago

LLMs can regurgitate almost all of the Harry Potter books, among others [0]. These models clearly can reproduce large amounts of their training data verbatim, and filling in the occasional gap in memorized material would be far less impressive than implementing the project truly from scratch.

(I'm not claiming this is what actually happened here, just pointing out that memorization is a lot more plausible/significant than you say)
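
(For context on how work like [0] measures this: the usual probe feeds the model a verbatim prefix from the text and checks how much of the true continuation comes back word-for-word. Here's a minimal sketch of that idea in Python, assuming an OpenAI-style chat API; the model name and prompt wording are illustrative, not taken from the paper:)

    import difflib
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def memorization_probe(prefix, true_continuation, model="gpt-4o-mini"):
        # Ask the model to continue a verbatim excerpt of the work.
        resp = client.chat.completions.create(
            model=model,  # illustrative choice, not the paper's setup
            messages=[{"role": "user",
                       "content": "Continue this text exactly:\n\n" + prefix}],
            max_tokens=200,
            temperature=0,  # greedy decoding makes regurgitation easiest to spot
        )
        output = resp.choices[0].message.content or ""
        # Longest verbatim run shared with the real continuation, as a
        # fraction of the continuation's length (find_longest_match with
        # no arguments needs Python 3.9+).
        match = difflib.SequenceMatcher(None, output,
                                        true_continuation).find_longest_match()
        return match.size / max(len(true_continuation), 1)

A score near 1.0 on many prefixes suggests the passage was memorized rather than reconstructed.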

[0] https://www.theregister.com/2026/01/09/boffins_probe_commerc...

StilesCrisis 3 hours ago

The training data doesn't contain a Rust-based C compiler that can build Linux, though.