dboreham | 7 hours ago
This comment is wrong in two ways: 1. Current LLMs do much better than produce "small programs that already exist in multiple forms in the training data". Of course the knowledge they use does need to exist somewhere in the training data, but they operate at a far higher level of abstraction than simply spitting out whole-cloth programs they've already seen. 2. Inventing a new compression algorithm is beyond the expectations of all but the most wild-eyed LLM proponents, today.
rabf | 5 hours ago | parent
"the knowledge they use does need to exist somewhere in training data", I'm not to sure about that. The current coding enviroments for AI give the models a lot of reasoning power with tooling to test, iterate and web search. They frequently look at the results of their code runs now and try different approaches to get the desired result. Its common for them to write their own tests unprompted and re-evaluate accordingly. | ||
blauditore | 7 hours ago | parent
Point 2 is not really true. There are famous people claiming that AI will fix climate change, so we as humans should stop bothering.