drnick1 2 days ago

I don't understand this anti-AI stance. Either the code works and is useful, and it should be accepted, or it doesn't work and it should be rejected. Does it really matter who wrote it?

q3k a day ago | parent | next [-]

The code is only a projection of someone's mental model, which is what actually allows the project to succeed, especially in the long term.

That's why codebases die when they lose their maintainers, and why forks often don't make it past the first few months.

LLM-generated code might work, but it's not backed by anyone's mental model. And the industry has long had a term for code that exists but that no one understands, nor the reasoning behind it: legacy code.

miningape a day ago | parent | prev | next [-]

Because the quality of the code matters more than the raw quantity.

Having code that compiles and runs is the bare minimum - we should (and do) hold ourselves to a higher standard of professionalism.

cuu508 a day ago | parent | prev | next [-]

The linked AI policy lists specific concerns in three categories: copyright, quality, and ethics. Which one do you not understand?

drnick1 16 hours ago | parent [-]

I don't care about "ethics" in the abstract if the code works and is of good "quality" (however you choose to define that). AIs don't hold copyright over anything they generate, so that's a non-issue. In fact, if the code is any good, it should be impossible to tell whether it was written by AI at all.

sensanaty 2 days ago | parent | prev | next [-]

LLMs give idiots the power to effectively DDoS repos with useless slop PRs that maintainers then have to expend time and effort to triage and reject. As the curl maintainers have said, the burden of reviewing mountains of AI-generated crap is horrifically time-consuming.

globular-toast 2 days ago | parent | prev [-]

Putting together a GNU/Linux distribution and maintaining it for 23 years requires a bit more than "works for me and is useful".