zephen 2 hours ago

You seem set on conflating "training" an LLM with "learning" by a human.

LLMs don't "learn," but they _do_, in some cases, faithfully regurgitate what they have been trained on.

Legally, we call that "making a copy."

But don't take my word for it. There are plenty of lawsuits for you to follow on this subject.

shkkmo 2 hours ago

> You seem set on conflating "training" an LLM with "learning" by a human.

"Learning" is an established word for this, happy to stick with "training" if that helps your comprehension.

> LLMs don't "learn" but they _do_ in some cases, faithfully regurgitate what they have been trained on.

> Legally, we call that "making a copy."

Yes, when you use an LLM to make a copy... that is making a copy.

When you train an LLM... that isn't making a copy; that is training. No copy is created until output is generated that contains a copy.

zephen an hour ago

> Learning" is an established word for this

Only by people attempting to muddy the waters.

> happy to stick with "training" if that helps your comprehension.

And supercilious dickheads (though that is often redundant).

> No copy is created until output is generated that contains a copy.

The copy exists, albeit not in human-discernible form, inside the LLM; otherwise it could not be generated on demand.

Despite your claim that "It works exactly the same for a LLM," no, it doesn't.
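
For what it's worth, "generated on demand" is something you can probe directly: prompt the model with the opening of a passage and check whether the continuation reproduces the rest verbatim. A minimal sketch in Python; `generate` here is a hypothetical placeholder for whatever inference API serves the model being tested, not any specific library:

    # Probe for verbatim regurgitation of a known passage.
    # `generate` is a hypothetical placeholder; swap in a real
    # inference call for the model under test.
    def generate(prompt: str) -> str:
        """Hypothetical model call; no specific API is implied."""
        raise NotImplementedError

    def regurgitates(passage: str, prefix_chars: int = 200,
                     match_chars: int = 100) -> bool:
        """Prompt with the opening of `passage` and report whether the
        continuation contains the next `match_chars` characters verbatim."""
        prefix = passage[:prefix_chars]
        expected = passage[prefix_chars:prefix_chars + match_chars]
        return expected in generate(prefix)

The substring check keeps the sketch deliberately crude; a token-level exact match over a longer continuation would be a stricter test, but the idea is the same either way.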