sosodev a day ago

Nope. Pretraining runs have been moving forward with internet snapshots that include plenty of LLM content.

sejje a day ago | parent [-]

Sure, but not all of them are stupid enough to keep doing that while watching the model degrade, if it indeed does.