hn_throwaway_99 | 2 hours ago
> the LLMs will ship code the LLMs understand, and whether any human specifically understands any particular part will mostly not matter.

I find this particularly funny. There was more than one Star Trek episode where some alien planet depends on an advanced AI or other technology its people no longer understand, and it turns out the AI is actually slowly killing them, making them sterile, etc. (e.g. https://en.wikipedia.org/wiki/When_the_Bough_Breaks_(Star_Tr... )

Sure, Star Trek is fiction, but "humans rely on a technology they've forgotten how to make" is a pretty recurrent theme in real human history. The FOGBANK saga was pretty recent: https://en.wikipedia.org/wiki/Fogbank

It just amazes me that people think, "Sure, this AI-generated code is kinda broken now, but all we need is just more AI code to fix it at some unknowable point in the future, because humans won't be able to understand it!"
randallsquared | 42 minutes ago | parent
If you'd told me 20-30 years ago that we'd actually get the Star Trek computer in the mid-2020s, and that it still wouldn't actually be AGI, I would have thought that very strange and unlikely. So, who knows?