unvalley 5 hours ago

Anthropic’s models have almost certainly gorged on an enormous amount of OSS, and if they think they can settle that debt with only six months of perks for the maintainers who’ve kept that ecosystem alive, it comes across as pretty arrogant.

LaurensBER 4 hours ago | parent | next [-]

It's amazing how quickly Anthropic is turning into the "bad" guys.

First we couldn't use our Claude subscription with anything but Claude Code, then the limits seemed to change every week without any communication, then they banned a bunch of people (including some prominent names). Then they complained about Chinese labs distilling from their API (which I'm partly sympathetic to, but let's not pretend that Anthropic invented their training data from scratch).

Then there's this half-baked offer. Sure, it looks nice on paper, but given how incredibly valuable open source has been for them, and given their budget, it does seem a bit tight.

upmind 4 hours ago | parent | prev | next [-]

Six months is so low. From the title I thought it'd be unlimited, tbh, especially considering they'll keep crawling the content long after those six months are up.

cloverich 5 hours ago | parent | prev [-]

Uncharitably, I think this is a strategy to gorge further, especially if they select for higher-quality open source. They're embracing the best projects to train on the iteration patterns of the best, with a semi-self-correcting mechanism against slop.

Charitably, this will be great for open source software, so long as they never moat up and lock things down.

_flat20 3 hours ago | parent [-]

Can't they just keep scraping these repositories for new data anyway? Or has that changed?