floathub 3 days ago

Free software has never mattered more.

All the infrastructure that runs the whole AI-over-the-internet juggernaut is essentially all open source.

Heck, even Claude Code would be far less useful without grep, diff, git, head, etc., etc., etc. And one can easily see a day when something like a local, sort-of Claude Code talking to open-weight and open-source models is the core dev tool.

CobrastanJorji 3 days ago | parent | next [-]

It's not just that open source code is useful in an age of AI, it's that the AI could only have been made because of the open source code.

TacticalCoder 3 days ago | parent | prev | next [-]

> All the infrastructure that runs the whole AI-over-the-internet juggernaut is essentially all open source.

Exactly.

> Heck, even Claude Code would be far less useful without grep, diff, git, head, etc.

It wouldn't even work. It's constantly using those.

I remember reading a Claude Code CLI install doc and the first thing was "we need ripgrep" with zero shame.

All these tools basically run on top of Linux, too: on Windows, Claude Code has effectively required a full Linux environment (WSL) on the system.

It's all open-source command-line tools on an open-source OS, piping one program into another. I've been on Linux on the desktop (and servers, of course) since the Slackware days... and I was right all along.

gesis 3 days ago | parent [-]

The primary selling point of unix and unix-like operating systems has always been composability.

Without the ability to string together the basic utilities into a much greater sum, Unix would have been another blip.
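A toy illustration of that composability, using only the standard tools mentioned upthread (the input text is made up):

```shell
# Each stage does one small thing; the pipe composes them into a
# word-frequency count that no single tool provides on its own.
printf 'banana\napple\nbanana\ncherry\n' |
  sort |        # group identical lines together
  uniq -c |     # collapse runs, prefixing each with its count
  sort -rn |    # order by count, highest first
  head -n 1     # keep only the most frequent line
```

That's the "much greater sum": four tiny programs, none of which knows about the others, combined into a histogram on the fly.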

PaulDavisThe1st 3 days ago | parent | prev | next [-]

> Free software has never mattered more.

But the libre part of Free Software has never mattered less, or at least so TFA argues, and while I could quibble with the point, it's not wrong.

pwdisswordfishy 3 days ago | parent | prev | next [-]

Wow, some corps get to offload some of their costs to "the community" (unpaid labor), while end users are as disenfranchised as ever! How validating!

andoando 3 days ago | parent | prev [-]

Why isn't LLM training itself open-sourced? With all the compute in the world, something like Folding@home here would be killer.

DesaiAshu 3 days ago | parent | next [-]

Data bandwidth limits distributed training under current architectures. Really interesting implications if we can make progress on that.
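A back-of-envelope sketch of why (all numbers here are illustrative assumptions, not measurements): naive data-parallel training has every worker synchronize every gradient on every optimizer step.

```shell
# Assumed: a 7B-parameter model with fp16 gradients (2 bytes each).
params=7000000000
bytes_per_param=2
gb_per_step=$(( params * bytes_per_param / 1000000000 ))
echo "${gb_per_step} GB of gradients to sync per worker per step"
# Over a ~100 Mbit/s home link (~12.5 MB/s) that is on the order of
# 1000+ seconds per step, versus sub-second over datacenter interconnects.
```

That gap between home-internet and datacenter bandwidth is why Folding@home-style volunteer compute doesn't transfer directly to training under current architectures.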

dogcomplex 2 days ago | parent | next [-]

Limits but doesn't prohibit. See https://www.primeintellect.ai/blog/intellect-3 - still useful and can scale enormously. Takes a particular shape and relies heavily on RL, but still big.

andoando 2 days ago | parent | prev [-]

What bandwidth limits? I'm assuming the forward and backward passes have to be done sequentially?

DesaiAshu 16 hours ago | parent [-]

Yes, and also passing data within each layer.

mike_hearn 2 days ago | parent | prev | next [-]

It is in some cases. NVIDIA's models are open source, in the truest sense that you can download the training set and training scripts and make your own.

throwaway27448 2 days ago | parent | prev | next [-]

It's either illegal or extremely expensive to source quality training material.

m4rtink 2 days ago | parent [-]

Yeah, turns out that training a model without scraping and overloading the whole Internet, while ignoring all the licenses and basic decency, is actually hard & expensive!

doctorwho42 2 days ago | parent | prev [-]

Well it is, it's in the name "OpenAI". /S