necovek 2 days ago

There is an easy fix already in widespread use: "open weights".

It is very much a valuable thing already; no need to taint it with a false promise.

Though I disagree that it wouldn't be used if it were indeed open source: I might not do it in my home lab today, but at least Qwen and DeepSeek would use and build on what e.g. Facebook was doing with Llama, and they might be pushing the open-weights model frontier forward faster.

JumpCrisscross 2 days ago | parent | next

> There is an easy fix already in widespread use: "open weights"

They're both correct given how the terms are actually used. We just have to deduce what's meant from context.

There was a moment, around when Llama was first being released, when the semantics hadn't yet settled. The nutter wing of the FOSS community, to my memory, put forward a hard-line and unworkable definition of open source and seemed to reject open weights, too. So the definition got punted to the closest thing at hand, which was open weights with limited (unfortunately, not zero) use restrictions. At this point, it's a personal preference that's at most polite to respect if you know your audience has one.

necovek 2 days ago | parent

The point is that "open source" by now has an established and widespread definition, and the "source" part hints that what is open is the thing something is built from.

Is this really a debate we still need to be having today? It sounds like grumpiness about the Open Source Initiative having defined the term ~25 years ago, when it was rarely used in this sense.

If we do not accept a well-defined term and instead treat it as a personal preference, we could say the same about any word in a natural language.

JumpCrisscross 2 days ago | parent

> "open source" by now has an established and widespread definition

For code, yes. For LLMs, the most commonly used definition is synonymous with open weights (plus, I think, a lack of major use restrictions).

> If we do not accept a well defined term and want to keep it a personal preference, we can say that about any word in a natural language

Plenty of people do. It's generally polite to entertain their preferences, but only to a limit, and certainly not as a forcing function. The practical reality is that describing DeepSeek's models as open source is today the mainstream usage.

necovek 2 days ago | parent

https://www.merriam-webster.com/dictionary/open-source

Perhaps you are right and this LLM-specific usage enters a dictionary at some point.

As I believe it is very misleading, I am doing my part to discourage it: it is not, imho, impolite to point out the established meaning of words when people misuse them. We all create the language together, and all sides have their say.

JumpCrisscross 2 days ago | parent

I think the debate has been about what constitutes the source code, and the mode has settled on weights. The spirit of the dictionary definition seems fine with excluding a reading that is only practical if you own a multimillion-dollar ersatz mainframe.

SV_BubbleTime a day ago | parent

You don’t need to defend a silly argument.

These models aren’t open source, they’re open weights, and some people will confuse the two.

It doesn't make the wrong word the right one; it's just a lazy conflation, and people don't need to mind.

JumpCrisscross 17 hours ago | parent

> doesn't make the wrong word the right one; it's just a lazy conflation, and people don't need to mind

That's a fair interpretation. I'm going one step further: if most people use the term "wrong," including experts and industry leaders, that eventually becomes the correct use. Defining "open source" as requiring open training data is impractical to the point of being virtually useless outside philosophical contexts. This debate is on the same plane as folks who like to argue that tomatoes aren't vegetables, when the truth is that botanically they aren't while culinarily they are. DeepSeek's model not being open source is only true under the FOSS-jargon definition of open source; in non-jargon use, it's open source.

dannyw 2 days ago | parent | prev

Yeah, open weights are really good, especially when the base-model weights (not just the instruction-tuned ones) are released, as here.
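
To make that concrete: a minimal sketch of why the base weights matter, using the Hugging Face transformers library (the repo name "example-org/example-base" is hypothetical, not a real release). The base weights give you raw next-token completion you can fine-tune yourself, rather than only the packaged chat behavior.

    # Minimal sketch, assuming the Hugging Face transformers library.
    # "example-org/example-base" is a hypothetical repo name for illustration.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "example-org/example-base"

    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo)

    # A base model does plain next-token completion; there is no chat template,
    # which is exactly what you want as a starting point for your own tuning.
    inputs = tokenizer("Open weights let researchers", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))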