slopinthebag 2 hours ago:
So you struggled to improve velocity without AI tools. Are you worried that using the AI tools as a crutch will just lead to a death spiral of bad code being shipped increasingly faster? I've only ever seen the AI adoption approach work on fully functional teams. The other concern is that by forcing AI onto developers, they eventually throw their hands up, say "well, they don't care about code quality anymore, so neither should I," and start shipping absolute vibeslop.
9dev 2 hours ago:
> I've only ever seen the AI adoption approach work on fully functional teams.

It's not that the team isn't functioning; it's that it's a pretty diverse team in terms of experience, which means things just used to take a while to finish.

> The other concern is that by forcing AI onto developers, they eventually throw their hands up, say "well, they don't care about code quality anymore, so neither should I," and start shipping absolute vibeslop.

This is IMHO avoidable by emphasising code reviews and automated tooling; my general policy is still that everyone is responsible for what they push, period. So absolute vibeslop isn't what I'm seeing; rather, an efficiency miscalculation about which parts should be written by humans and which by the AI.
nz an hour ago:
The vast majority of workplaces have never cared about code quality (with the exception being the actual engineers who write the code). Everyone else has no clue what programmers do, other than "they write arcane symbols, and our product works, and our business continues to function." They do not know that code can even _have_ quality.

It does not help that they only ever have to interact with engineering when something is going _wrong_, which conditions them to associate engineers with stress, failure, and angry customers. Nobody ever thinks of engineers when everything is going well.

The LLM mandates stem from a combination of mistrust and resentment. I know, from secondhand experience, that long before coding LLMs became a thing, engineers would ship slop when it became clear that their superiors cared about deadlines über alles (i.e. not shipping slop would be the same thing as quitting, but without the paycheck; slop code is often a form of quiet quitting).

Most people would _prefer_ to be able to "program" their entire business from a spreadsheet. LLMs have enabled them to get involved, and they cannot understand why engineers reject this "help." (It is for the same reason that a pilot would reject a copilot who thinks he knows how to fly because he played a flight simulator or read Jonathan Livingston Seagull; flight simulators are used in training too, but they are not a substitute for actual piloting experience.) This refusal and resistance feeds the mistrust and resentment.

We live in a world where managers and administrators do not understand what they are managing and administrating, nor do they think that this is part of their job description. In the worst cases, they believe their job is to extract compliance from their subordinates. There is a _lot_ of alpha in being part of a company where the authorities understand how the internals of the business (including software and IT!) _actually_ function.
(One engineer told me that clueless yet demanding managers are, for all intents and purposes, unwitting saboteurs, and that the best a company can do about this is get him a job interview at a competitor.)

In some sense, the economy is just a machine for transferring wealth from those who do not know something essential to those who do. This can veer uncomfortably close to exploitation. If we want to avoid crossing that line, we need to cultivate an economy where a lack of understanding is seen not as an _opportunity for profit_, but as an _opportunity for illumination_.