alanbernstein 14 hours ago
Just try to imagine what you would have thought about this technology if you saw it with no warning, 10 years ago. Would "a few tens of thousands of lines of code" still seem small?
pron 13 hours ago
I'm not saying it's not impressive or that it doesn't show great promise, but there are clearly challenges, and we don't yet know when or how they'll be solved.

From some big LLM fans I've heard that one major problem is that of trust. Unlike tools and machines, LLMs cannot be trusted to reliably succeed or to fail in an obvious way; unlike people, LLMs cannot be trusted to communicate back useful feedback, such as important insights or pitfalls. So while in some respects LLMs are superior to both humans and existing automation, in others they're inferior to both. Maybe we'll be able to fix these problems within current LLM technology, and maybe we'll be able to do that soon, but neither of these is obviously inevitable.

My pet issue with one form of inevitability, as I mentioned above, is that if we get to a point where software can reliably write other software for us, then we're also at a point where we don't need any other software to actually be written, at least not in some human-readable form. There will just be one (kind of) program that does what we ask it to; why would we ask it to write programs?
badRNG 13 hours ago
The OG ChatGPT was released less than three years ago. Prior to that, an AI writing even 20 lines of working code would have seemed wild. Does anyone remember leetcode?