brookst 2 days ago
They’re not dissimilar to human devs, who also often feel the need to replatform, refactor, over-generalize, etc. The key thing in both cases, human and AI, is to be super clear about goals. Don’t say “how can this be improved”, say “what can we do to improve maintainability without major architectural changes” or “what changes would be required to scale to 100x volume” or whatever. Open-ended, poorly-defined asks are bad news in any planning/execution-based project.
strls 2 days ago
A senior programmer does not suggest adding more complexity/abstraction layers just to say something. An LLM absolutely does, every single time in my experience.
| ||||||||||||||
exitb 2 days ago
There are, however, human developers who have built enough general and project-specific expertise to be able to answer these open-ended, poorly-defined requests. In fact, given how often that happens, maybe that’s at the core of what we’re being paid for.
| ||||||||||||||
awesome_dude 2 days ago
Most definitely; asking the LLM those things is the same as asking people on Reddit, Stack Overflow, IRC, or even Hacker News.