fsckboy a day ago
> Also, while I'm sure that Anthropic has benefited from training its models on this dataset

I thought they didn't use this data for training; the "crime" here was making the copies.

> I think that's fair, considering that two of those books received advances under $20K and never earned out.

I don't understand your logic here. If the books never earned out, that means you were already "overpaid" relative to what they were worth in the market. Shouldn't fairness mean this extra bonus first goes toward covering the unmet earnout?