CuriouslyC 10 months ago

[flagged]

Xelynega 10 months ago | parent [-]

If we really want to be technical, in common law systems anything is legal as long as the highest court to challenge it decides it's legal.

I guess I should have used the phrase "common sense stealing in any other context" to be more precise?

krisoft 10 months ago | parent [-]

> I guess I should have used the phrase "common sense stealing in any other context" to be more precise?

Clearly not common sense stealing. The Intercept was not deprived of their content. If OpenAI had sneaked into their office and server farm and taken all the hard drives and paper copies containing the content, that would be "common sense stealing".

TheOtherHobbes 10 months ago | parent [-]

Very much common sense copyright violation though.

Copyright means you're not allowed to copy something without permission.

It's that simple. There is no "Yes but you still have your book" argument, because copyright is a claim on commercial value, not a claim on instantiation.

There's some minimal wiggle room for fair use, but clearly making an electronic copy and creating a condensed electronic version of the content - no matter how abstracted - and using it for profit is not fair use.

chii 10 months ago | parent [-]

> Copyright means you're not allowed to copy something without permission.

but is training an AI copying? And if so, why isn't someone learning from said work considered copying in their brain?

throw646577 10 months ago | parent | next [-]

> but is training an AI copying?

If the AI produces chunks of training set nearly verbatim when prompted, it looks like copying.

> And if so, why isn't someone learning from said work considered copying in their brain?

Well, their brain, while learning, is not someone's published work product, for one thing. This should be obvious.

But their brain can violate copyright by producing work as the output of that learning, and be guilty of plagiarism, etc. If I memorise a passage of your copyrighted book when I am a child, and then write it in my book when I am an adult, I've infringed.

The fact that most jurisdictions don't consider the work of an AI to be copyrightable does not mean it cannot ever be infringing.

pera 10 months ago | parent | prev | next [-]

A product from a company is not a person. An LLM is not a brain.

If you transcode a CD to mp3 and build a business around selling those files without the author's permission, you'd be in serious legal trouble.

Tech products that "accidentally" reproduce materials without the owners' permission (e.g. someone uploading La La Land to YouTube) have processes to remove them by simply filling out a form. Can you do that with ChatGPT?

lelanthran 10 months ago | parent | prev | next [-]

Because the law considers scale.

In some jurisdictions it's legal for you to possess a single joint. It's not legal for you to possess a warehouse holding 400 tons of weed.

The line between legal and not legal is sometimes based on scale; being able to ingest a single book and learn from it is not the same scale as ingesting the entire published works of mankind and learning from it.

nkrisc 10 months ago | parent | prev | next [-]

Because AI isn’t a person.

hiatus 10 months ago | parent | prev [-]

Is training an AI the same as a person learning something? You haven't shown that to be the case.