w10-1 5 hours ago

They do require that you allow them to use your name publicly.

They are silent on whether you can prohibit them from training on your input, so I assume you can.

My guess is, if even 10% of maintainers forget to disable training, then Anthropic will have an excellent source of data on how really good developers approach problems, which can be fed back into the model. That could improve things for everyone.

Of course, the whole purpose of a trial is to induce dependence on the service. Let’s hope that doesn’t reduce the skill of those maintainers. If open source code gets better as a result, that’s good for all.

TuxSH 3 hours ago | parent | next [-]

> By accepting a Program subscription, you grant Anthropic permission to identify you publicly as a Program recipient, including by referencing your name, GitHub username, and associated open source project(s).

I was tempted to apply, but that part is anything but nice, and I think I'll just pass

saulpw 2 hours ago | parent [-]

There's no non-disparagement clause, so how about you let them use your name etc., and then you can come out in public, say mean things, and shame/embarrass them.

TuxSH an hour ago | parent [-]

Sure, but what I'm slightly worried about is people easily resolving my username to my real name. Maybe I worry too much, dunno

trollbridge 4 hours ago | parent | prev [-]

Of course they're going to train on open-source input (not like you could stop them).

And of course they're also going to train on your private inputs. It's right there in the TOS.

lostmsu 3 hours ago | parent [-]

> And of course they're also going to train on your private inputs. It's right there in the TOS.

Anthropic actually says they won't train on your private inputs on paid plans as long as you opt out, unlike Google and OpenAI.