vanrohan 9 hours ago

I know LLM-generated code comes with its own challenges, but the absolutists are definitely clinging to a time that has passed. I saw a recent discussion on Immich where a maintainer flat-out denied a PR, saying: "That diff looks LLM-generated to me; is that indeed the case? If so, we'd prefer not to receive a PR for it." The PR was from a professional software engineer who spent weeks of his free time on a big feature. Well structured + tested. Dismissed just because AI was used. https://github.com/immich-app/immich/discussions/23745#discu...

59nadir 9 hours ago | parent [-]

"Well structured + tested". Who would know? The diff is almost 200k changed lines. Good on them for saying no to this nonsense.

There's a good chance the actual needed implementation is less than 20k lines (I've found that LLM bloat grows exponentially), but even that's a stretch to review and accept wholesale.

Deeds67 8 hours ago | parent [-]

I'm the person working on that fork. Yes, it has now diverged by 200k+ lines, but half of that is specs, research, and documentation, and it includes a month's worth of work.

The PR in question was a small feature of about 1.5k lines changed, and it was solidly tested.

59nadir 8 hours ago | parent [-]

Eh, fair enough. 1.5k is reasonable. Have you tried just writing it yourself instead? It's likely to come in under 1k lines, and you should have no problem writing an implementation yourself if you understand the structure of the LLM version.

Deeds67 8 hours ago | parent [-]

[dead]

59nadir 6 hours ago | parent [-]

Heh, fair enough. To me this comes off as "I'm unable to write it myself [possibly because I've outsourced my thinking too much]", to be honest, but I'm not going to argue; you're the one who presumably wants this code to end up in that repository.

I wouldn't really consider (what is likely) sub-1kloc a "large feature", but to each their own.

Deeds67 6 hours ago | parent [-]

I don't want it to end up in that repo anymore, hence the fork. I've got a growing community of people who have been eagerly awaiting this feature and a ton more that I built.

I definitely could write this by hand - the stuff I built in the last 10 years before LLMs was more complex than this - but there's no way I'm spending all my free time slowly crafting something if I can just use AI and get the same results much faster.