onion2k 3 hours ago:
If your tool regularly lies, gaslights, and produces wrong results, that's a tooling issue. It's a human issue if you don't recognise that the code it's generated is wrong. That will never change no matter how good the tooling gets.
kiba 3 hours ago:
The tooling is the issue because humans designed the tooling wrong. It's a chatbot interface fine-tuned for sycophancy. That's not a coincidence.
Hamuko 2 hours ago:
Isn't part of the problem that these tools are advertised as allowing non-coders to code? How are you going to recognise that the code is wrong when you don't know how to code and the product is telling you that you don't even need to?