zephen 2 hours ago
You seem set on conflating "training" an LLM with "learning" by a human. LLMs don't "learn," but in some cases they _do_ faithfully regurgitate what they have been trained on. Legally, we call that "making a copy." But don't take my word for it: there are plenty of lawsuits on this subject for you to follow.
shkkmo 2 hours ago | parent
> You seem set on conflating "training" an LLM with "learning" by a human.

"Learning" is an established word for this; happy to stick with "training" if that helps your comprehension.

> LLMs don't "learn," but in some cases they _do_ faithfully regurgitate what they have been trained on. Legally, we call that "making a copy."

Yes, when you use an LLM to make a copy, that is making a copy. When you train an LLM, that isn't making a copy; that is training. No copy is created until output is generated that contains one.
| ||||||||