fragmede 2 days ago
> AI gets trained on knowledge generated by AI? This sounds like the ouroboros eating its own tail, and it is, but tool use changes the picture. Because the model can compile and run code, it can generate, say, Rust that does a thing, iterate until the borrow checker stops being angry, then run the code to verify it does what it claims, and feed the working code into the training set as good code (and the non-working code as bad). Even using only the recipe books you already had, a lot of cooking practice would make you a better cook. And once you know the recipes in the book well, mixing and matching them (egg preparation from one, flour ratios from another) is just something a good cook develops a feel for, what works and what doesn't, even if they only ever used that one book.
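The loop being described, generate, compile, run, then label by outcome, can be sketched in a few lines. This is a minimal illustration only (Python standing in for Rust, with `compile()` as the "borrow checker" stand-in; all names here are made up for the example, not anyone's actual pipeline):

```python
def label_candidates(candidates):
    """Split generated snippets into (good, bad) by compiling and running them.

    Each candidate is a (source, test_source) pair; both are illustrative.
    """
    good, bad = [], []
    for source, test in candidates:
        try:
            compile(source, "<candidate>", "exec")  # does it even compile?
            ns = {}
            exec(source, ns)  # run the candidate code
            exec(test, ns)    # run its assertions: does it do what it claims?
        except Exception:
            bad.append(source)   # failing code is still signal, as negative data
        else:
            good.append(source)
    return good, bad


candidates = [
    ("def double(x):\n    return 2 * x", "assert double(3) == 6"),
    ("def double(x):\n    return x + 1", "assert double(3) == 6"),  # wrong: fails its test
    ("def double(x) return 2 * x", ""),                             # syntax error: never compiles
]
good, bad = label_candidates(candidates)
# only the first candidate survives into the "good" set
```

The point is that the verifier (compiler plus tests) supplies the label, so no human has to judge each sample, which is what breaks the pure snake-eats-tail circularity.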
ethanwillis 2 days ago | parent
The original recipe books don't cover everything that could be created, not even across all their combinations. And most importantly, even within the subset of novel combinations the books do enable, something is missing: the judgement call of a human saying whether newly created information makes sense to us, is useful to us, etc. The question above is not about whether new information can be created, or how to navigate it. It's about the applicability of what is created to human ends.