nextos | 15 hours ago

AFAIK, it's a bit more than hyper-parameter tuning, as it can also make non-parametric (structural) changes. Non-parametric optimization is not a new idea; I guess the hype is partly because people hope it will be less brute-force now.
gwerbin | 15 hours ago

It's an LLM-powered evolutionary algorithm.
ainch | 15 hours ago

I'd like to see a system like this take more inspiration from the ES literature, similar to AlphaEvolve. Let's see an archive of solutions, novelty scoring, and some crossover rather than purely mutating the same file in a linear fashion.
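To make the contrast concrete, here is a minimal sketch of the archive/novelty/crossover loop being described. Everything here is a toy stand-in: `llm_mutate` and `llm_crossover` are hypothetical placeholders for model calls, and the string-matching fitness is made up purely for illustration.

```python
import random

# Hypothetical stand-ins for LLM calls; a real system would prompt a model here.
def llm_mutate(code: str) -> str:
    # Pretend the LLM rewrites one "gene" (character) of the program.
    i = random.randrange(len(code))
    return code[:i] + random.choice("abcxyz") + code[i + 1:]

def llm_crossover(a: str, b: str) -> str:
    # Pretend the LLM merges two parent programs at a cut point.
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def fitness(code: str, target: str = "abcxyz") -> int:
    # Toy objective: number of characters matching a target string.
    return sum(c == t for c, t in zip(code, target))

def novelty(code: str, archive: list) -> float:
    # Mean Hamming distance to archived solutions.
    dist = lambda a, b: sum(x != y for x, y in zip(a, b))
    return sum(dist(code, a) for a in archive) / len(archive)

def evolve(seed: str, generations: int = 200, cap: int = 20, w: float = 0.1):
    archive = [seed]
    for _ in range(generations):
        if len(archive) >= 2:
            child = llm_crossover(*random.sample(archive, 2))  # crossover, not linear mutation
        else:
            child = archive[0]
        child = llm_mutate(child)
        archive.append(child)
        if len(archive) > cap:
            # Evict the entry with the lowest blended fitness + novelty score.
            archive.remove(min(archive, key=lambda c: fitness(c) + w * novelty(c, archive)))
    return max(archive, key=fitness)

random.seed(0)
best = evolve("------")
print(best, fitness(best))
```

The novelty term keeps the archive from collapsing onto one lineage, which is the failure mode of repeatedly mutating a single file.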
nextos | 14 hours ago

Exactly, that's the way forward. There are lots of old ideas from evolutionary search worth revisiting, given that LLMs can make smarter proposals.
UncleOxidant | 14 hours ago

That was my impression. That includes evolutionary programming, which would normally operate at the AST level; with an LLM, mutation can happen at the source level.
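The AST-level mutation being contrasted with LLM source edits can be sketched in a few lines with Python's `ast` module. The operator swap here is an arbitrary toy mutation, not any particular system's operator set:

```python
import ast

# Classic evolutionary programming mutates the syntax tree directly,
# e.g. by swapping a binary operator, rather than editing source text.
class SwapAddMul(ast.NodeTransformer):
    def visit_BinOp(self, node):
        self.generic_visit(node)
        if isinstance(node.op, ast.Add):
            node.op = ast.Mult()  # mutate `+` into `*`
        return node

src = "def f(x):\n    return x + 3\n"
tree = ast.fix_missing_locations(SwapAddMul().visit(ast.parse(src)))

ns = {}
exec(compile(tree, "<mutant>", "exec"), ns)
print(ns["f"](4))  # original f(4) == 7; the mutant returns 4 * 3 == 12
```

An LLM-based mutator skips this machinery and proposes edits on the source string itself, which is what makes richer, semantics-aware changes cheap.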
coppsilgold | 15 hours ago

Perhaps LLM-guided superoptimization: <https://en.wikipedia.org/wiki/Superoptimization>. I recall reading about a stochastic superoptimizer years ago: <https://github.com/StanfordPL/stoke>
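For readers unfamiliar with the stochastic flavor: STOKE-style search is roughly Metropolis sampling over instruction sequences against a correctness-plus-cost objective. The sketch below uses a made-up three-op "ISA" and says nothing about STOKE's actual internals; it only illustrates the acceptance rule.

```python
import math
import random

# Toy "instruction set": each op transforms an integer accumulator.
OPS = {"inc": lambda x: x + 1, "dec": lambda x: x - 1, "dbl": lambda x: x * 2}
TESTS = range(-5, 6)

def run(program, x):
    for op in program:
        x = OPS[op](x)
    return x

def target(x):
    return 2 * x + 1  # the spec the synthesized program should implement

def cost(program):
    # Correctness term (mismatches on test inputs) plus a size penalty,
    # loosely mirroring a correctness + performance objective.
    wrong = sum(run(program, x) != target(x) for x in TESTS)
    return 10 * wrong + len(program)

def propose(program):
    p, op, kind = list(program), random.choice(list(OPS)), random.random()
    if kind < 0.4 and p:                       # replace a random op
        p[random.randrange(len(p))] = op
    elif kind < 0.7:                           # insert an op
        p.insert(random.randrange(len(p) + 1), op)
    elif p:                                    # delete an op
        del p[random.randrange(len(p))]
    return p

def search(steps=5000, temperature=1.0):
    current = best = ["inc"]
    for _ in range(steps):
        cand = propose(current)
        delta = cost(cand) - cost(current)
        # Metropolis acceptance: always take improvements, sometimes
        # take regressions so the walk can escape local minima.
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current = cand
            if cost(current) < cost(best):
                best = current
    return best

random.seed(1)
prog = search()
print(prog, cost(prog))
```

Swapping the random `propose` for LLM-generated candidates is presumably what "LLM-guided superoptimization" would mean: same acceptance loop, much smarter proposal distribution.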