gabriel666smith 3 days ago
Thanks! That's really helpful.

> What contradictions do you see? I don't see any.

I guess us seeing very different things is also what a settlement might be for :-). But I think I was wrong. I think others in the thread are debating the contradictions I saw. I tried typing them out when I made my earlier comment, but couldn't get them to resolve into any kind of logic that made sense to me. They just seemed contradictory at the time. I think the same arguments have since been made much more clearly by others - specifically around whether a corporation downloading this work is the same as a human downloading it - and the responses have been very clear as well.

The settlement figure was tied implicitly to Anthropic's valuation in the Ars article [0] where I think I originally posted my comment. Those comments were moved here, so I've linked the article below. Tying the settlement sum specifically to the valuation of a corporation is what caught me in a loop - that valuation assumes Anthropic will do certain things in the future.

I was thinking too much, maybe, about things like: "Would a teenager get the same treatment? What about a teenager with a private company? What about a teenager who seemed dumber than that teenager to the person deciding their company's valuation? What about a teenager who had not opened the files themselves, but had spun up a model from them? What about a teenager who had done both?" Etc.

I think I was getting fixated on the idea that the valuation assumes future performance, and that downloading the files was possibly necessary for that performance, but I was missing the obvious answers to some of my questions because of that.

I do think that some of the more anthropomorphising language - "training data" is an example - trips people up a lot in the same way. And I think that if the settlement sum reflects anything to do with the valuation of that corporation, it does create some interesting questions, but maybe not contradictions.

[0] https://arstechnica.com/tech-policy/2025/09/judge-anthropics...