vultour 6 hours ago

Are you ashamed of other people finding out you used Claude? I think the co-authored-by bit should not be a setting at all, AI-generated code should be clearly identified.
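(For context, the setting being argued about is, to my understanding, a boolean in Claude Code's settings file — the exact key name and path may differ by version, so treat this as an assumed sketch, not a reference:)

```json
{
  "includeCoAuthoredBy": false
}
```

With the flag set to false, commits made through the tool omit the `Co-authored-by: Claude …` trailer entirely.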

sieve 5 hours ago | parent | next [-]

> AI-generated code should be clearly identified.

Let AI autonomously produce code of a quality I care about and I might consider giving it credit. I don't know how other people write code, but I come up with an idea and use a multitude of LLMs to brainstorm a reasonably comprehensive spec that any reasonably competent person could read and produce a working program from, including a locally running Q2 quant of Qwen 3.6. Even Kimi is as good as Claude at most coding tasks, and I don't see why any single agent deserves credit for my design.

Let artists and filmmakers start watermarking their output with the tools they use and I might reconsider my decision.

Paracompact 5 hours ago | parent [-]

> Let artists and filmmakers start watermarking their output with the tools they use and I might reconsider my decision.

They do, though, in the form of metadata.

sieve 5 hours ago | parent [-]

Do Adobe or Arri or Red get authorship credit for the work their hardware and software do on projects? After all, artists would not be able to produce a single pixel without them. In a similar vein, you could make the argument that modern farming is sitting on your ass in your modern tractor while software handles most of the work. Does John Deere get rights over a quarter/half your harvest?

I am stuck between the luddites and the "artisanal" coders on this one. LLMs are neither as smart/useful nor as dumb/useless as people think. Unless your job involves producing useless garbage every single day, good software requires a lot of thought before the first line of code is even written. For those with serious domain knowledge, that thinking time can be compressed into minutes or hours rather than the days or weeks it might otherwise take.

LLMs are a tool. You either pay for one or run the freely available ones on your own hardware. As long as the output is directed by my thinking, it belongs to me. If it were up to me, I would abolish IPR (and even permanent ownership of land) as a category altogether, but that is a different discussion.

NateEag 5 hours ago | parent | prev | next [-]

I think the Linux kernel's standard of disclosure via the "Assisted-By" trailer is the right move.

Makes it clear you used a bullshit machine, without implying it's an author.

...assuming you think using them at all is a good move - I won't deny they have some utility (though I'd argue much lower than many seem to think), but I do presently believe they're a disaster for humanity.

The ruination of the Internet with slop, the massive propagation of propaganda, and the insanely easy-to-wield tools for abuse are in no way worth the ability to accrue tech debt at 10x velocity (though to be clear, accruing tech debt can absolutely be a useful strategy, if one I personally dislike).

dangus 5 hours ago | parent | prev | next [-]

Basically what you’re saying is that if AI does anything on your computer, you should lose control over whatever it impacts. If the AI touched it at all, in any way, big or small, you now lose ownership of the actions your computer takes (with open source tools, I might add).

In case you need reminding of common sense, I’m supposed to be allowed to decide what my commit messages are because it’s my fucking computer.

I prefer that my software not act as morality police.

bdangubic 5 hours ago | parent | prev [-]

Mind-boggling that people are trying to hide this; it tells you all you need to know about our “profession.” The presence of that hook or the like in a place of business should be a fireable offense.