gavinray 7 hours ago

Why have we been fed the narrative that training models on their own output progressively degrades quality?

It's the first thing anyone would think of (like a self-hosting compiler), yet everything I've read says "it doesn't work."

EDIT: For context:

  > Shumailov et al., "AI models collapse when trained on recursively generated data," Nature, 2024
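
For what it's worth, the core mechanism is easy to reproduce in a toy setting (this is my own sketch, not the paper's exact setup): fit a distribution to samples drawn from the previous generation's fit, and watch the tails disappear. With a Gaussian, the MLE variance estimate is biased low by a factor of (n-1)/n, and finite-sample noise compounds across generations:

```python
import numpy as np

# Toy illustration of "model collapse": repeatedly fit a Gaussian to
# samples drawn from the previous generation's own fitted model.
rng = np.random.default_rng(0)
n = 100                  # "training set" size per generation
mu, sigma = 0.0, 1.0     # generation 0: the true data distribution
for gen in range(5000):
    data = rng.normal(mu, sigma, n)      # sample from the current "model"
    mu, sigma = data.mean(), data.std()  # refit (MLE, ddof=0) on its own output
print(sigma)  # far below the original 1.0: the fitted distribution has collapsed
```

No single generation looks broken; the degradation only shows up when you iterate, which is presumably why it isn't obvious from any one round of fine-tuning.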