sigmonsays 10 hours ago:
When AI accidentally starts training itself on AI-generated content, we all lose...
whycombinetor 9 hours ago (reply):
Meanwhile on the HN front page right now: "Embarrassingly Simple Self-Distillation Improves Code Generation" https://news.ycombinator.com/item?id=47637757
sesm 10 hours ago (reply):
Don't we already have "RLHF on synthetic data"?