| ▲ | exe34 6 days ago |
| what we should do and what we are forced to do are very different things. if I can get a machine to do the stuff I hate dealing with, I'll take it every time. |
|
| ▲ | mgaunard 6 days ago | parent | next [-] |
| who's going to be held accountable when the boilerplate fails? the AI? |
| |
| ▲ | danielbln 6 days ago | parent | next [-] |
The buck stops with the engineer, always. AI or no AI.
| ▲ | mgaunard 5 days ago | parent [-] |
I've seen juniors send AI code for review; when I comment on weird things in it, the answer is just "I don't know, the AI did that."
| ▲ | danielbln 5 days ago | parent [-] |
Oh, me too. And I reject them, same as if they had copied code from Stack Overflow that they can't explain.
|
| |
| ▲ | exe34 6 days ago | parent | prev [-] |
no, I'm testing it the same way I test my own code!
|
|
| ▲ | skydhash 6 days ago | parent | prev [-] |
| It's like the xkcd on automation: https://xkcd.com/1205/ After a while, it just makes sense to redesign the boilerplate and build some abstraction instead. Duplicated logic and data are hard to change and fix. The frustration is a clear signal to take a step back and take a holistic view of the system. |
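To make that concrete, here's a toy sketch of the kind of refactor I mean (the names, tables, and handlers are made up for illustration, not from any real codebase):

    # Copy-pasted boilerplate: two handlers that differ only in the table queried.
    def get_user(conn, user_id):
        cur = conn.execute("SELECT * FROM users WHERE id = ?", (user_id,))
        return cur.fetchone()

    def get_order(conn, order_id):
        cur = conn.execute("SELECT * FROM orders WHERE id = ?", (order_id,))
        return cur.fetchone()

    # One small abstraction replaces both, so a change to the lookup logic
    # (logging, caching, error handling) now lives in exactly one place.
    def fetch_by_id(conn, table, key):
        # table comes from a fixed internal set of names, never from user input
        return conn.execute(f"SELECT * FROM {table} WHERE id = ?", (key,)).fetchone()

    def get_user(conn, user_id):
        return fetch_by_id(conn, "users", user_id)

    def get_order(conn, order_id):
        return fetch_by_id(conn, "orders", order_id)

The point isn't the helper itself; it's that the duplicated shape now has a single place to change instead of N generated copies.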
| |
| ▲ | gibbitz a day ago | parent [-] |
And this is a great example of something I rarely see LLMs doing. I think we're approaching a point where we will use LLMs to manage code the way we use React to manage the DOM. You need an update to a feature? The LLM will just recode it wholesale. All of the problems we have in software development will dissolve in mountains of disposable code. I could see enterprise systems being replaced hourly for security reasons: there's less chance of a vulnerability being abused if it only exists for an hour before it can be found and exploited. Since the popularity of LLMs proves that as a society we've stopped caring about quality, I have a hard time seeing any other future.
|