Barbing 5 hours ago
Anthropic distills GPT?
yorwba 4 hours ago | parent
Everybody training models on large amounts of lightly filtered internet text is partially distilling every other model that had its output posted verbatim to the internet.
beAbU 3 hours ago | parent
And OpenAI probably distills Anthropic. Who wouldn't? It's all one big incestuous mess. In a couple of years we'll be talking about AI brainrot.