GeoAtreides 2 hours ago

he doesn't have solid points: he conflates fair use with free use (?), ignores thousands of years of attribution history, and equates normal human-to-human learning with corporate LLMs training on original content (without consent). Great presentation, like you said, to cover the logical defects.

steveklabnik 2 hours ago | parent | next [-]

I did say "free use" instead of "fair use," yeah. That's my mistake, thank you for the correction. If I could edit my original comment, I would, mea culpa. Typos happen.

GeoAtreides an hour ago | parent [-]

I see. I must congratulate you on your rhetorical prowess, it's nice seeing a professional at work.

sillysaurusx 2 hours ago | parent | prev [-]

Fair use of training data hasn't yet been settled in court. People here are treating it like it has been. But no amount of wishful thinking or moral argument will change a verdict holding that it's fine for training data to be used as it has been.

Until that question is settled, it’s disingenuous to dismiss his points out of hand as conflating fair use or ignoring consent.

steveklabnik an hour ago | parent | next [-]

Even beyond that, the initial legal opinion we do have did in fact point to training being fair use: https://www.reuters.com/legal/litigation/anthropic-wins-key-...

However, I don't feel comfortable suggesting that this is settled just yet: one district judge's opinion isn't binding, future cases may disagree, or we may at some point get explicit legislation one way or the other.

GeoAtreides an hour ago | parent | prev [-]

I was just enumerating some of the issues with the '''solid''' points OP made. Actually addressing them would take too long and be an exercise in futility, here, on HN, in April 2026. Why would I put in the effort, only for my comment to be flagged and sent to the void? Or worse, persisted forever and used for training without my consent?

And yes, you are right, the legal and moral question of fair use in training data hasn't been settled yet; we agree here.