pdpi 11 hours ago

That’s throwing the baby out with the bath water.

The Open Source movement has been a gigantic boon to the whole of computing, and it would be a terrible shame to lose that as a knee-jerk reaction to genAI

blibble 11 hours ago | parent | next [-]

> That’s throwing the baby out with the bath water.

it's not

the parasites can't train their shitty "AI" if they don't have anything to train it on

simonw 11 hours ago | parent | next [-]

You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

It will however reduce the positive impact your open source contributions have on the world to 0.

I don't understand the ethical framework for this decision at all.

lunar_mycroft 8 hours ago | parent | next [-]

> You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

There's also plenty of other open source contributors in the world.

> It will however reduce the positive impact your open source contributions have on the world to 0.

And it will reduce your negative impact through helping to train AI models to 0.

The value of your open source contributions to the ecosystem is roughly proportional to the value they provide to LLM makers as training data. Any argument you could make that one is negligible would also apply to the other, and vice versa.

blibble 9 hours ago | parent | prev | next [-]

> You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

if true, then the parasites can remove ALL code where the license requires attribution

oh, they won't? I wonder why

bwfan123 7 hours ago | parent | prev | next [-]

> there's plenty of other training data in the world.

Not if most of it is machine generated. The machine would start eating its own shit. The nutrition it gets is from human-generated content.

> I don't understand the ethical framework for this decision at all.

The question is not one of ethics but that of incentives. People producing open source are incentivized in a certain way and it is abhorrent to them when that framework is violated. There needs to be a new license that explicitly forbids use for AI training. That may encourage folks to continue to contribute.

azakai 7 hours ago | parent [-]

Saying people shouldn't create open source code because AI will learn from it, is like saying people shouldn't create art because AI will learn from it.

In both cases I get the frustration - it feels horrible to see something you created be used in a way you think is harmful and wrong! - but the world would be a worse place without art or open source.

Juliate 10 hours ago | parent | prev | next [-]

The ethical framework is simply this one: what is the worth of doing +1 for everyone, if the very thing you wish didn't exist (because you believe it is destroying the world) benefits 10x more from it?

If bringing fire to a species lights and warms them, but also gives the means and incentives to some members of this species to burn everything for good, you have every ethical freedom to ponder whether you contribute to this fire or not.

simonw 10 hours ago | parent [-]

I don't think that a 10x estimate is credible. If it was I'd understand the ethical argument being made here, but I'm confident that excluding one person's open source code from training has an infinitesimally small impact on the abilities of the resulting model.

For your fire example, there's a difference between being Prometheus teaching humans to use fire compared to being a random villager who adds a twig to an existing campfire. I'd say the open source contributions example here is more the latter than the former.

simonwsays 9 hours ago | parent | next [-]

Your argument applies to everything that requires a mass movement to change. Why do anything about the climate? Why do anything about civil rights? Why do anything about poverty? Why try to make any change? I'm just one person. Anything I could do couldn't possibly have any effect. You know what, since all the powerful interests say it's good, it's a lot easier to jump on the bandwagon and act like it is. All of those people who disagree are just Luddites anyway. And the Luddites didn't even have a point, right? They were just idiots who hated metallic devices for no reason at all.

Juliate 9 hours ago | parent | prev [-]

The ethical issue is consent and normalisation: asking individuals to donate to a system they believe is undermining their livelihood and the commons they depend on, while the amplified value is captured somewhere else.

"It barely changes the model" is an engineering claim. It does not imply "therefore it may be taken without consent or compensation" (an ethical claim), nor "therefore it has no meaningful impact on the contributor or their community" (a moral claim).

bgwalter 10 hours ago | parent | prev | next [-]

Guilt-tripping people into providing more fodder for the machine. That is really something else.

I'm not surprised that you don't understand ethics.

simonw 9 hours ago | parent | next [-]

I'm trying to guilt-trip them into using their skills to improve the world through continuing to release open source software.

I couldn't care less if their code was used to train AI - in fact I'd rather it wasn't since they don't want it to be used for that.

blibble 9 hours ago | parent | next [-]

given the "AI" industry's long term goals, I see contributing in any way to generative "AI" to be deeply unethical, bordering on evil

which is the exact opposite of improving the world

you can extrapolate to what I think of YOUR actions

simonw 7 hours ago | parent | next [-]

I imagine you think I'm an accelerant of all of this, through my efforts to teach people what it can and cannot do and provide tools to help them use it.

My position on all of this is that the technology isn't going to be uninvented, and I very much doubt it will be legislated away, which means the best thing we can do is promote the positive uses and disincentivize the negative uses as much as possible.

trinsic2 an hour ago | parent | next [-]

You know, I'm realizing that in my head I'm comparing this to Nazism and Hitler. I'm sure many people thought he was bringing change to the world, and since it was going to happen anyway we should all get on board with it. In the end there was a reckoning.

IMHO there are going to be consequences of these negative effects, regardless of the positives.

Looking at it in this light, you might want to get out now, while you still can. I'm sure it's going to continue, and it's not going to be legislated away, but it's still wrong to be using this technology in the way it's being used right now, and I will not be associated with the harmful effects it is being used for just because a few corporations feel justified in pushing evil onto the world wrapped in positives.

blibble an hour ago | parent | prev [-]

I don't see you as an accelerant

they're using your exceptional reputation as an open-source developer to push their proprietary, parasitic products and business models, with you thinking you're doing good

I don't mean to be rude, but I suspect "useful idiot" is probably the term they use to describe open source influencers in meetings discussing early access

bryan_w 8 hours ago | parent | prev [-]

Your post, full of well-formed English sentences, is also going to contribute to generative AI, so thanks for that.

blibble 7 hours ago | parent [-]

oh I've thought of that :)

my comments on the internet are now almost exclusively anti-"AI", and anti-bigtech

pdpi 11 hours ago | parent | prev | next [-]

Yes, that's the bath water. The baby is all the communal good that has come from FLOSS.

afavour 11 hours ago | parent [-]

OP is asserting that the danger posed by AI is far bigger than the benefit of FLOSS. So to OP AI is the bath water.

seanclayton 11 hours ago | parent [-]

Yes, and they are okay with throwing the baby out with it, which is what the other commenter is pointing out. Throwing the baby out with the bathwater is a bad thing; that's what the idiom implies.

Kirth 10 hours ago | parent | prev | next [-]

Surely that cat's out of the bag by now, and it's too late to make an active difference by boycotting the production of more public(ly indexed) code?

franktankbank 9 hours ago | parent [-]

Kind of, kind of not. Form a guild and distribute via SaaS or some other channel that keeps the knowledge undistributable. Most code out there is terrible, so relying on AI trained on it will lose out.

ekianjo 11 hours ago | parent | prev | next [-]

If we end up with only proprietary software, we are the ones who lose

Juliate 11 hours ago | parent [-]

GenAI would be decades away (if not more) with only proprietary software, which would never have reached the quality, coordination, and volume that open source enabled in such a relatively short time frame.

xdavidliu 11 hours ago | parent | prev | next [-]

open source code is a minuscule fraction of the training data

TheCraiggers 10 hours ago | parent | next [-]

I'd love to see a citation there. We already know from a few years ago that they were training AI based on projects on GitHub. Meanwhile, I highly doubt software firms were lining up to have their proprietary code bases ingested by AI for training purposes. Even with NDAs, we would have heard something about it.

xdavidliu 5 hours ago | parent [-]

I should have clarified what I meant. The training data includes roughly speaking the entire internet. Open source code is probably a large fraction of the code in the data, but it is a tiny fraction of the total data, which is mostly non-code.

My point was that the hypothetical of "not contributing to any open source code" to the extent that LLMs had no code to train on, would not have made as big of an impact as that person thought, since a very large majority of the internet is text, not code.

maplethorpe 11 hours ago | parent | prev [-]

Where did most of the code in their training data come from?

dvfjsdhgfv 11 hours ago | parent | prev | next [-]

It is. If not you, other people will write that code, maybe of worse quality, and the parasites will train on it. And you cannot forbid other people from writing open source software.

blibble 11 hours ago | parent [-]

> If not you, other people will write their code, maybe of worse quality, and the parasites will train on this.

this is precisely the idea

add into that the rise of vibe-coding, and that should help accelerate model collapse

everyone that cares about quality of software should immediately stop contributing to open source

garciasn 11 hours ago | parent | prev [-]

Free software has always been about standing on the shoulders of giants.

I see this as doing so at scale, and thus giving up on its inherent value is most definitely throwing the baby out with the bathwater.

blibble 11 hours ago | parent [-]

I'd rather the internet ceased to exist entirely than contribute in any way to generative "AI"

srpinto 11 hours ago | parent | next [-]

This is just childish. This is a complex problem that requires nuance and adaptability, just like programming. Yours is literally the reaction of an angsty 12-year-old.

DiscourseFan 11 hours ago | parent | prev | next [-]

Such a reactionary position is no better than nihilism.

user____name 11 hours ago | parent [-]

If God is Dead, do we have to rebuild It in the megacorps of the world whilst maximizing shareholder value?

DiscourseFan 11 hours ago | parent | next [-]

I think you aren't recognizing the power that comes from organizing thousands, hundreds of thousands, or millions of workers into vast industrial combines that produce the wealth of our society today. We must go through this, not against it. People will not know what could be, if they fail to see what is.

keeganpoppen 4 hours ago | parent | prev [-]

this just sounds like some memes smashed together in the LHC. what is this even supposed to mean? AI is a technology that will inevitably be developed by humankind. all of this appeal to... populism? socialism?... is completely devoid of meaning in response to a discussion whose sine qua non is pragmatism at the very least.

Marha01 11 hours ago | parent | prev [-]

Ridiculous overreaction.

ironman1478 11 hours ago | parent | prev | next [-]

Open source has been good, but I think the expanded use of highly permissive licences has completely left the door open for one sided transactions.

All the FAANGs have the ability to build all the open source tools they consume internally. Why give it to them for free and not have the expectation that they'll contribute something back?

undeveloper 10 hours ago | parent [-]

Even the GPL allows companies to simply use code without contributing back, as long as it's unmodified or only used across a network boundary; the AGPL closes only the latter loophole.

ironman1478 6 hours ago | parent | next [-]

At least the contribution back can happen. You're right though, it's not perfect.

Avamander 7 hours ago | parent | prev [-]

This goes against what Stallman believes in, but there's a need for an AGPL variant with a clause against closed-weight models.

lwhi 11 hours ago | parent | prev | next [-]

The promise and freedom of open source has been exploited by the least egalitarian and most capitalist forces on the planet.

I would never have imagined things turning out this way, and yet, here we are.

pdpi 11 hours ago | parent | next [-]

FLOSS is a textbook example of economic activity that generates positive externalities. Yes, those externalities are of outsized value to corporate giants, but that’s not a bad thing unto itself.

Rather, I think this is, again, a textbook example of what governments and taxation are for: tax the people taking advantage of the externalities to pay the people producing them.

lwhi 9 hours ago | parent [-]

Yes, but unfortunately this never happens; and depressingly, I can't imagine it happening.

The open source movement has been exploited.

ThrowawayR2 10 hours ago | parent | prev [-]

Open Source (as opposed to Free Software) was intended to be friendly to business and early FOSS fans pushed for corporate adoption for all they were worth. It's a classic "leopards ate my face" moment that somehow took a couple of decades for the punchline to land: "'I never thought capitalists would exploit MY open source,' sobs developer who advocated for the Businesses Exploiting Open Source movement."

lwhi 9 hours ago | parent [-]

I'm not sure I follow your line of reasoning.

The exploited are in the wrong for not recognising they're going to be exploited?

A pretty twisted point of view, in my opinion.

ThrowawayR2 8 hours ago | parent [-]

Perhaps you are unfamiliar with the "leopards ate my face" meme? https://knowyourmeme.com/memes/leopards-eating-peoples-faces... The parallels between the early FOSS advocates energetically seeking corporate adoption of FOSS and the meme are quite obvious.

lwhi 5 hours ago | parent [-]

I don't misunderstand what you're saying; I just think it's a twisted point of view.

mvdtnz 4 hours ago | parent | prev [-]

How dare you chastise someone for making the personal decision not to produce free work anymore? Who do you think you are?