nsnzjznzbx 9 hours ago

We will get to the point where you can quickly bootstrap, i.e. an LLM trains a better LLM in a loop; leave it running and it can really learn. Like learn learn.

"Train yourself to solve this problem see OBJECTIVE.md"

nine_k 7 hours ago | parent [-]

This is the kind of runaway self-improving development that proponents of the singularity keep talking about.

The problem is that training appears to be really slow and expensive. Some quality thinking is required to improve the training approach and the architecture before committing resources to training a new large model. And even the largest models are still nowhere near as good at quality thinking as the best humans.
