brookst 2 days ago

They’re not dissimilar to human devs, who also often feel the need to replatform, refactor, over-generalize, etc.

The key thing in both cases, human and AI, is to be super clear about goals. Don’t say “how can this be improved”, say “what can we do to improve maintainability without major architectural changes” or “what changes would be required to scale to 100x volume” or whatever.

Open-ended, poorly-defined asks are bad news in any planning/execution based project.

strls 2 days ago | parent | next [-]

A senior programmer does not suggest adding more complexity/abstraction layers just to say something. An LLM absolutely does, every single time in my experience.

awesome_dude 2 days ago | parent [-]

You might not, but every "senior" programmer I have met on my journey has given bad answers just like the LLMs do — and because of them I have an inbuilt verifier: I check whatever's being proposed (by "seniors" or LLMs).

exitb 2 days ago | parent | prev | next [-]

There are however human developers that have built enough general and project-specific expertise to be able to answer these open-ended, poorly-defined requests. In fact, given how often that happens, maybe that’s at the core of what we’re being paid for.

brookst a day ago | parent | next [-]

But if the business doesn’t know the goals, is it really adding any value to go fulfill poorly defined requests like “make it better”?

AI tools can also take a swing at that kind of thing. But without a product/business intent it’s just shooting in the dark, whether human or AI.

awesome_dude 2 days ago | parent | prev [-]

I have to be honest, I've heard of these famed "10x" developers, but when I come close to one I only ever find "hacks" with a brittle understanding of a single architecture.

awesome_dude 2 days ago | parent | prev [-]

Most definitely. Asking the LLM those things is much like asking people on Reddit, Stack Overflow, IRC, or even Hacker News.