drdaeman 4 days ago

[Edit: turns out I got it wrong, and the 5-year retention only applies to data they're allowed to train on. This changes things for me.]

Personally, I don't mind training, as long as I have a say in the matter - and they do have a switch for this. Opt-out is not exactly cool, but I got the popup in my face almost a month before the changes take effect, and that's respectful enough for me.

That said, I've just canceled my subscription, because this new 5-year mandatory data retention is a deal breaker for me. I don't mind 30 or 60 or even 90 days - I can understand the need to briefly persist the data. But for anything long-term (and 5 years is effectively permanent), I want to be respected with a choice, and I'm given none except "don't use".

A shame, but fortunately they're not a monopoly.

some_random 4 days ago | parent [-]

Data retention appears to be predicated on opting in to allowing training. If you don't opt in, they retain it for the same 30 days they were already retaining it for. https://www.anthropic.com/news/updates-to-our-consumer-terms

drdaeman 4 days ago | parent [-]

Oh! Thank you!

That popup was confusing as hell then, because I read and understood it as two separate points: that they're making training opt-out, and that they're changing data retention to 5 years, independently of each other. I got upset about it and didn't really research the nuances - and it turns out I had it all wrong.

I appreciate your comment, it's really helpful!

I hope they change the language to make it clear that the 5 years only applies to the chats they're allowed to train models on.

(Weirdly, I can't find the word "years" anywhere in their Privacy Policy, and the only instance on the Consumer Terms of Service page is about being of legal age, over 18 years old.)