harrisi · 2 hours ago
The aspect of "potentially secure/stable code" is very interesting to me. There's an enormous amount of code that isn't secure or stable already (I'd argue virtually all code in existence). This has already been a problem, and there are no real ramifications for it. Even Cloudflare taking down a significant amount of Internet traffic for some period of time is not (as far as I know) independently investigated, and nobody potentially faces charges. In other civil engineering endeavors, there absolutely are consequences: regular inspections, government agencies that audit systems, penalties for causing harm, and so on are expected.

LLM-generated code is the continuation of the bastardization of software "engineering." Now not only is nobody accountable, but a black-box cluster of computers can't even reasonably be held accountable. If someone makes a tragic mistake today, it can be understood who caused it. If a "Cloudflare2" comes about that is entirely (or significantly) generated, whoever is in charge can just throw up their hands and say, "hey, I don't know why it did this, and the people who made the system that made this mistake don't know why it did this either." It has been, and will continue to be, very concerning.
feastingonslop · 2 hours ago (reply)
Nobody is saying to skip testing the software. Testing is still important. What the code itself looks like isn't.