blibble 11 hours ago

> And by the way, training your monster on data produced in part by my own hands, without attribution or compensation.

> To the others: I apologize to the world at large for my inadvertent, naive if minor role in enabling this assault.

this is my position too, I regret every single piece of open source software I ever produced

and I will produce no more

pdpi 11 hours ago | parent | next [-]

That’s throwing the baby out with the bath water.

The Open Source movement has been a gigantic boon to the whole of computing, and it would be a terrible shame to lose that as a knee-jerk reaction to genAI

blibble 11 hours ago | parent | next [-]

> That’s throwing the baby out with the bath water.

it's not

the parasites can't train their shitty "AI" if they don't have anything to train it on

simonw 11 hours ago | parent | next [-]

You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

It will however reduce the positive impact your open source contributions have on the world to 0.

I don't understand the ethical framework for this decision at all.

lunar_mycroft 8 hours ago | parent | next [-]

> You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

There's also plenty of other open source contributors in the world.

> It will however reduce the positive impact your open source contributions have on the world to 0.

And it will reduce your negative impact through helping to train AI models to 0.

The value of your open source contributions to the ecosystem is roughly proportional to the value they provide to LLM makers as training data. Any argument you could make that one is negligible would also apply to the other, and vice versa.

blibble 9 hours ago | parent | prev | next [-]

> You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.

if true, then the parasites can remove ALL code where the license requires attribution

oh, they won't? I wonder why

bwfan123 7 hours ago | parent | prev | next [-]

> there's plenty of other training data in the world.

Not if most of it is machine generated. The machine would start eating its own shit. The nutrition it gets is from human-generated content.

> I don't understand the ethical framework for this decision at all.

The question is not one of ethics but that of incentives. People producing open source are incentivized in a certain way and it is abhorrent to them when that framework is violated. There needs to be a new license that explicitly forbids use for AI training. That may encourage folks to continue to contribute.
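
Something like the following rider on top of an existing license might capture the intent. A rough sketch only: the wording is illustrative and not lawyer-reviewed, and pinning down what counts as "AI training" is the hard part:

    Restriction: Notwithstanding the permissions granted above, the
    Software and its source code may not be used, in whole or in part,
    as training data for any machine learning model, including but not
    limited to large language models, without separate written
    permission from the copyright holder.

Whether such a clause is even enforceable against training claimed as fair use is, of course, an open question.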

azakai 7 hours ago | parent [-]

Saying people shouldn't create open source code because AI will learn from it, is like saying people shouldn't create art because AI will learn from it.

In both cases I get the frustration - it feels horrible to see something you created be used in a way you think is harmful and wrong! - but the world would be a worse place without art or open source.

Juliate 10 hours ago | parent | prev | next [-]

The ethical framework is simply this one: what is the worth of doing +1 to everyone, if the very thing you wish didn't exist (because you believe it is destroying the world) benefits x10 more from it?

If bringing fire to a species lights and warms them, but also gives the means and incentives to some members of this species to burn everything for good, you have every ethical freedom to ponder whether you contribute to this fire or not.

simonw 10 hours ago | parent [-]

I don't think that a 10x estimate is credible. If it was I'd understand the ethical argument being made here, but I'm confident that excluding one person's open source code from training has an infinitesimally small impact on the abilities of the resulting model.

For your fire example, there's a difference between being Prometheus teaching humans to use fire compared to being a random villager who adds a twig to an existing campfire. I'd say the open source contributions example here is more the latter than the former.

simonwsays 9 hours ago | parent | next [-]

Your argument applies to everything that requires a mass movement to change. Why do anything about the climate? Why do anything about civil rights? Why do anything about poverty? Why try to make any change? I'm just one person. Anything I could do couldn't possibly have any effect. You know what, since all the powerful interests say it's good, it's a lot easier to jump on the bandwagon and act like it is. All of those people who disagree are just luddites anyway. And the luddites didn't even have a point, right? They were just idiots who hated metallic devices for no reason at all.

Juliate 9 hours ago | parent | prev [-]

The ethical issue is consent and normalisation: asking individuals to donate to a system they believe is undermining their livelihood and the commons they depend on, while the amplified value is captured somewhere else.

"It barely changes the model" is an engineering claim. It does not imply "therefore it may be taken without consent or compensation" (an ethical claim) nor "there it has no meaningful impact on the contributor or their community" (moral claim).

bgwalter 10 hours ago | parent | prev | next [-]

Guilt-tripping people into providing more fodder for the machine. That is really something else.

I'm not surprised that you don't understand ethics.

simonw 9 hours ago | parent | next [-]

I'm trying to guilt-trip them into using their skills to improve the world through continuing to release open source software.

I couldn't care less if their code was used to train AI - in fact I'd rather it wasn't since they don't want it to be used for that.

blibble 9 hours ago | parent | next [-]

given the "AI" industry's long term goals, I see contributing in any way to generative "AI" to be deeply unethical, bordering on evil

which is the exact opposite of improving the world

you can extrapolate to what I think of YOUR actions

simonw 7 hours ago | parent | next [-]

I imagine you think I'm an accelerant of all of this, through my efforts to teach people what it can and cannot do and provide tools to help them use it.

My position on all of this is that the technology isn't going to be uninvented, and I very much doubt it will be legislated away, which means the best thing we can do is promote the positive uses and disincentivize the negative uses as much as possible.

trinsic2 an hour ago | parent | next [-]

You know, I'm realizing that in my head I'm comparing this to Nazism and Hitler. I'm sure many people thought he was bringing change to the world, and that since it was going to happen anyway, we should all get on board with it. In the end there was a reckoning.

IMHO there are going to be consequences of these negative effects, regardless of the positives.

Looking at it in this light, you might want to get out now, while you still can. I'm sure it's going to continue, and it's not going to be legislated away, but it's still wrong to be using this technology in the way it's being used right now, and I will not be associated with the harmful effects this technology is being used for because a few corporations feel justified in pushing evil onto the world wrapped in positives.

blibble an hour ago | parent | prev [-]

I don't see you as an accelerant

they're using your exceptional reputation as an open-source developer to push their proprietary parasitic products and business models, with you thinking you're doing good

I don't mean to be rude, but I suspect "useful idiot" is probably the term they use to describe open source influencers in meetings discussing early access

bryan_w 8 hours ago | parent | prev [-]

Your post, full of well-formed English sentences, is also going to contribute to generative AI, so thanks for that.

blibble 7 hours ago | parent [-]

oh I've thought of that :)

my comments on the internet are now almost exclusively anti-"AI", and anti-bigtech

pdpi 11 hours ago | parent | prev | next [-]

Yes, that's the bath water. The baby is all the communal good that has come from FLOSS.

afavour 11 hours ago | parent [-]

OP is asserting that the danger posed by AI is far bigger than the benefit of FLOSS. So to OP AI is the bath water.

seanclayton 11 hours ago | parent [-]

Yes, and they are okay with throwing the baby out with it, which is what the other commenter is pointing out. Throwing babies out of buckets full of bathwater is a bad thing, is what the idiom implies.

Kirth 10 hours ago | parent | prev | next [-]

surely that cat's out of the bag by now; and it's too late to make an active difference by boycotting the production of more public(ly indexed) code?

franktankbank 9 hours ago | parent [-]

Kind of, kind of not. Form a guild and distribute via SaaS or some other form of undistributable knowledge. Most code out there is terrible, so relying on AI trained on it will lose out.

ekianjo 11 hours ago | parent | prev | next [-]

If we end up with only proprietary software, we are the ones who lose

Juliate 11 hours ago | parent [-]

GenAI would be decades away (if not more) with only proprietary software (which would never have reached the quality, coordination, and volume open source enabled in such a relatively short time frame).

xdavidliu 11 hours ago | parent | prev | next [-]

open source code is a minuscule fraction of the training data

TheCraiggers 10 hours ago | parent | next [-]

I'd love to see a citation there. We already know from a few years ago that they were training AI based on projects on GitHub. Meanwhile, I highly doubt software firms were lining up to have their proprietary code bases ingested by AI for training purposes. Even with NDAs, we would have heard something about it.

xdavidliu 5 hours ago | parent [-]

I should have clarified what I meant. The training data includes roughly speaking the entire internet. Open source code is probably a large fraction of the code in the data, but it is a tiny fraction of the total data, which is mostly non-code.

My point was that the hypothetical of "not contributing to any open source code" to the extent that LLMs had no code to train on, would not have made as big of an impact as that person thought, since a very large majority of the internet is text, not code.

maplethorpe 11 hours ago | parent | prev [-]

Where did most of the code in their training data come from?

dvfjsdhgfv 11 hours ago | parent | prev | next [-]

It is. If not you, other people will write their code, maybe of worse quality, and the parasites will train on this. And you cannot forbid other people from writing open source software.

blibble 11 hours ago | parent [-]

> If not you, other people will write their code, maybe of worse quality, and the parasites will train on this.

this is precisely the idea

add to that the rise of vibe-coding, and that should help accelerate model collapse

everyone that cares about quality of software should immediately stop contributing to open source

garciasn 11 hours ago | parent | prev [-]

Free software has always been about standing on the shoulders of giants.

I see this as doing so at scale, and thus giving up on its inherent value is most definitely throwing the baby out with the bathwater.

blibble 11 hours ago | parent [-]

I'd rather the internet ceased to exist entirely than contribute in any way to generative "AI"

srpinto 11 hours ago | parent | next [-]

This is just childish. This is a complex problem and requires nuance and adaptability, just as programming does. Yours is literally the reaction of an angsty 12-year-old.

DiscourseFan 11 hours ago | parent | prev | next [-]

Such a reactionary position is no better than nihilism.

user____name 11 hours ago | parent [-]

If God is Dead, do we have to rebuild It in the megacorps of the world whilst maximizing shareholder value?

DiscourseFan 11 hours ago | parent | next [-]

I think you aren't recognizing the power that comes from organizing thousands, hundreds of thousands, or millions of workers into vast industrial combines that produce the wealth of our society today. We must go through this, not against it. People will not know what could be, if they fail to see what is.

keeganpoppen 4 hours ago | parent | prev [-]

this just sounds like some memes smashed together in the LHC. what is this even supposed to mean? AI is a technology that will inevitably be developed by humankind. all of this appeal to... populism? socialism?... is completely devoid of meaning in response to a discussion whose sine qua non is pragmatism at the very least.

Marha01 11 hours ago | parent | prev [-]

Ridiculous overreaction.

ironman1478 11 hours ago | parent | prev | next [-]

Open source has been good, but I think the expanded use of highly permissive licences has completely left the door open for one-sided transactions.

All the FAANGs have the ability to build all the open source tools they consume internally. Why give it to them for free and not have the expectation that they'll contribute something back?

undeveloper 10 hours ago | parent [-]

Even the GPL allows companies to simply use code without contributing back, as long as it's unmodified, or it's used through a network boundary. The AGPL has the former issue.

ironman1478 6 hours ago | parent | next [-]

At least the contribution back can happen. You're right though, it's not perfect.

Avamander 7 hours ago | parent | prev [-]

This goes against what Stallman believes in, but there's a need for an AGPL with a clause against closed-weight models.

lwhi 11 hours ago | parent | prev | next [-]

The promise and freedom of open source has been exploited by the least egalitarian and most capitalist forces on the planet.

I would never have imagined things turning out this way, and yet, here we are.

pdpi 11 hours ago | parent | next [-]

FLOSS is a textbook example of economic activity that generates positive externalities. Yes, those externalities are of outsized value to corporate giants, but that’s not a bad thing unto itself.

Rather, I think this is, again, a textbook example of what governments and taxation are for: tax the people taking advantage of the externalities, to pay the people producing them.

lwhi 9 hours ago | parent [-]

Yes, but unfortunately this never happens; and depressingly, I can't imagine it happening.

The open source movement has been exploited.

ThrowawayR2 10 hours ago | parent | prev [-]

Open Source (as opposed to Free Software) was intended to be friendly to business and early FOSS fans pushed for corporate adoption for all they were worth. It's a classic "leopards ate my face" moment that somehow took a couple of decades for the punchline to land: "'I never thought capitalists would exploit MY open source,' sobs developer who advocated for the Businesses Exploiting Open Source movement."

lwhi 9 hours ago | parent [-]

I'm not sure I follow your line of reasoning.

The exploited are in the wrong for not recognising they're going to be exploited?

A pretty twisted point of view, in my opinion.

ThrowawayR2 8 hours ago | parent [-]

Perhaps you are unfamiliar with the "leopards ate my face" meme? https://knowyourmeme.com/memes/leopards-eating-peoples-faces... The parallels between the early FOSS advocates energetically seeking corporate adoption of FOSS and the meme are quite obvious.

lwhi 5 hours ago | parent [-]

I don't misunderstand what you're saying, but I think it's a twisted point of view.

mvdtnz 4 hours ago | parent | prev [-]

How dare you chastise someone for making the personal decision not to produce free work anymore? Who do you think you are?

bilekas 11 hours ago | parent | prev | next [-]

Unfortunately, as I see it, even if you want to contribute to open source out of pure passion or enjoyment, the "training" companies don't respect the licenses on the code they consume. And they are not being held liable.

Are there any proposals to nail down an open source license which would explicitly exclude use with AI systems and companies?

rpdillon 10 hours ago | parent | next [-]

All licenses rely on the power of copyright and what we're still figuring out is whether training is subject to the limitations of copyright or if it's permissible under fair use. If it's found to be fair use in the majority of situations, no license can be constructed that will protect you.

Even if you could construct such a license, it wouldn't be OSI open source because it would discriminate based on field of endeavor.

And it would inevitably catch benevolent behavior that is AI-related in its net. That's because these terms are ill-defined and people use them very sloppily. There is no agreed-upon definition for something like gen AI or even AI.

MonkeyClub 11 hours ago | parent | prev | next [-]

Even if you license it prohibiting AI use, how would you litigate against such uses? An open source project can't afford the same legal resources that AI firms have access to.

bilekas 10 hours ago | parent [-]

I won't speak for all, but companies I've worked for, large and small, have always respected licenses and were always very careful when choosing open source.

The fact that they could litigate you into oblivion doesn't make it acceptable.

y-curious 11 hours ago | parent | prev | next [-]

Where is this spirit when AWS takes a FOSS project, puts it in the cloud and monetizes it?

Snild 11 hours ago | parent | next [-]

It exists, hence e.g. AGPL.

But for most open source licenses, that example would be within bounds. The grandparent comment objected to not respecting the license.

fweimer 10 hours ago | parent [-]

The AGPL does not prevent offering the software as a service. It's got a reputation as the GPL variant for an open-core business model, but it really isn't that.

Most companies trying to sell open-source software probably lose more business if the software ends up in the Debian/Ubuntu repository (and the packaging/system integration is not completely abysmal) than when some cloud provider starts offering it as a service.

mrwrong 11 hours ago | parent | prev | next [-]

you are saying X, but a completely different group of people didn't say Y that other time! I got you!!!!

y-curious 11 hours ago | parent [-]

It’s fair to call out that both aspects are two sides of the same coin. I didn’t try to “get” anyone

mrwrong 7 hours ago | parent [-]

um, no it's not. you have fallen into the classic web forum trap of analyzing a heterogeneous mix of people with inconsistent views as one entity that should have consistent views

oblio 11 hours ago | parent | prev [-]

Fairly sure it's the same problem, and the main reason stronger licenses are appearing and formerly-OSS companies are closing down their sources.

muldvarp 11 hours ago | parent | prev [-]

> Unfortunately, as I see it, even if you want to contribute to open source out of pure passion or enjoyment, the "training" companies don't respect the licenses on the code they consume.

Because it is "transformative" and therefore "fair" use.

candiddevmike 11 hours ago | parent | next [-]

Running things through lossy compression is transformative?

muldvarp 9 hours ago | parent [-]

The quotation marks indicate that _I_ don't think it is. Especially given that modern deep learning is over-parameterized to the point that it interpolates training examples.

terminalshort 11 hours ago | parent | prev [-]

Fair use is an exception to copyright, but a license agreement can go far beyond copyright protections. There is no fair use exception to breach of contract.

zeroonetwothree 10 hours ago | parent [-]

I imagine a license agreement would only apply to using the software, not merely reading the code (which is what AI training claims to do under fair use).

As an analogy, you can’t enforce a “license” that anyone that opens your GitHub repo and looks at any .cpp file owes you $1,000,000.

skybrian 9 hours ago | parent | prev | next [-]

If you're unhappy that bad people might use your software in unexpected ways, open source licenses were never appropriate for you in the first place.

Anyone can use your software! Some of them are very likely bad people who will misuse it to do bad things, but you don't have any control over it. Giving up control is how it works. It's how it's always worked, but often people don't understand the consequences.

lunar_mycroft 7 hours ago | parent | next [-]

People do not have perfect foresight, and the ways open source software is used have significantly shifted in recent years. As a result, people are reevaluating whether or not they want to participate.

skybrian 5 hours ago | parent [-]

Yes, very true.

Barrin92 8 hours ago | parent | prev | next [-]

>Giving up control is how it works. It's how it's always worked,

no, it hasn't. Open source software, like any open and cooperative culture, existed on a bedrock of what we used to call norms, back when we still had some in our societies and people acted, not always but at least most of the time, in good faith. Hacker culture (the word's in the name of this website), which underpinned so much of it, had many unwritten rules that people respected, even in companies, when there were still enough people in charge who shared at least some of the values.

Now it isn't just an exception but the rule that people will use what you write in the most abhorrent, greedy and stupid ways and it does look like the only way out is some Neal Stephenson Anathem-esque digital version of a monastery.

skybrian 8 hours ago | parent | next [-]

Open source software is published to the world and used far beyond any single community where certain norms might apply.

If you care about what people do with your code, you should put it in the license. To the extent that unwritten norms exist, it's unfair to expect strangers in different parts of the world to know what they are, and it's likely unenforceable.

This recently came up for the GPLv2 license, where Linus Torvalds and the Software Freedom Conservancy disagree about how it should be interpreted, and there's apparently a judge that agrees with Linus:

https://mastodon.social/@torvalds@social.kernel.org/11577678...

jama211 8 hours ago | parent | prev [-]

Inside open source communities maybe. In the corporate world? Absolutely not. Ever. They will take your open source code and do what they want with it, always have.

skybrian 8 hours ago | parent [-]

This varies. The lawyers for risk-averse companies will make sure they follow the licenses. There are auditing tools to make sure you're not pulling in code you shouldn't. An example is Google's go-licenses command [1].

But you can be sure that even the risk-averse companies are going to go by what the license says, rather than "community norms."

Other companies are more careless.

[1] https://github.com/google/go-licenses
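
For what it's worth, the workflow with such a tool looks roughly like this (sketched from memory of the project's README; exact subcommands and flags may differ between versions):

    # install the auditing tool
    go install github.com/google/go-licenses@latest

    # fail if any dependency carries a forbidden license
    go-licenses check ./...

    # list each dependency with its detected license
    go-licenses csv ./...

Risk-averse shops typically wire the check into CI so a disallowed license fails the build.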

conradfr 9 hours ago | parent | prev [-]

It's not really people, and they don't really use the software.

skybrian 8 hours ago | parent [-]

People training LLMs on source code is sort of like using newspaper for wrapping fish. It's not the expected use, but people are still using it for something.

As they say, "reduce, reuse, recycle." Your words are getting composted.

Vegenoid 6 hours ago | parent [-]

Nothing says reduce and reuse like building huge quantities of GPUs and massive data centers to run AI models. It’s like composting!

2026iknewit 11 hours ago | parent | prev | next [-]

I learned what I learned due to all the openness in software engineering, not because everyone put it behind a paywall.

Might be because most of us got/get paid well enough that this philosophy works, or because our industry is so young, or because people writing code share good values.

It never worried me that a corp would make money out of some code I wrote, and it still doesn't. After all, I'm able to write code because I get paid well writing code, which I do well because of open source. Companies have always benefited from open source code, attributed or not.

Now I use it to write more code.

I would argue, though, for laws forcing models to be opened up after x years; I'm fine with that. But I would just prefer the open source / open community coming together and creating just better open models overall.

indigoabstract 11 hours ago | parent | prev | next [-]

It's kind of ironic, since AI can only grow by feeding on data, and open source, with its good intentions of sharing knowledge, is absolutely perfect for this.

But AI is also the ultimate meat grinder, there's no yours or theirs in the final dish, it's just meat.

And open source licenses are practically unenforceable for an AI system, unless you can maybe get it to cough up verbatim code from its training data.

At the same time, we all know they're not going anywhere, they're here to stay.

I'm personally not against them, they're very useful obviously, but I do have mixed or mostly negative feelings on how they got their training data.

Findecanor 11 hours ago | parent | prev | next [-]

I've been feeling much the same way, but removing your source code from the world does not feel like a constructive solution either.

Some Shareware used to be individually licensed with the name of the licensee prominently visible, so if you had got an illegal copy you'd be able to see whose licensed copy it was that had been copied.

I wonder if something based on that idea of personal responsibility for your copy could be adapted to source code. If you wanted to contribute to a piece of software, you could ask a contributor and then get a personally licensed copy of the source code with your name in every source file... but I don't know where to take it from there. Has there ever been a system like that one could take inspiration from?

whateverboat 6 hours ago | parent | prev | next [-]

That's a weird position to take. Open source software is actually what is mitigating this stupidity in my opinion. Having monopolistic players like Microsoft and Google is what brought us here in the first place.

Trasmatta 11 hours ago | parent | prev | next [-]

And then having vibe coders constantly lecture us about how the future is just prompt engineering, and that we should totally be happy to desert the skills we spent decades building (the skills that were stolen to train AI).

"The only thing that matters is the end result, it's no different than a compiler!", they say as someone with no experience dumps giant PRs of horrific vibe code for those of us that still know what we're doing to review.

terminalshort 11 hours ago | parent | prev | next [-]

What a miserable attitude. When you put something out in the world it's out there for anyone to use and always has been before AI.

blibble 11 hours ago | parent [-]

it is (... was) there to use for anyone, on the condition that the license is followed

which they don't

and no self-serving sophistry about "it's transformative fair use" counts as respecting the license

rpdillon 10 hours ago | parent | next [-]

The license only has force because of copyright. For better or for worse, the courts decide what is transformative fair use.

Characterizing the discussion behind this as "sophistry" is a fundamentally unserious take.

For a serious take, I recommend reading the copyright office's 100-plus-page document that they released in May. It makes it clear that there are a bunch of cases that are non-transformative, particularly when they affect the market for the original work and compete with it. But there are also clearly cases that are transformative, when no such competition exists and the training material was obtained legally.

https://www.copyright.gov/ai/Copyright-and-Artificial-Intell...

I'm not particularly sympathetic to voices on HN that attempt to remove all nuance from this discussion. It's a challenging enough topic as is.

blibble 9 hours ago | parent | next [-]

> For better or for worse, the courts decide what is transformative fair use.

thankfully, I don't live under the US regime

there is no concept of fair use in my country

Capricorn2481 8 hours ago | parent | prev [-]

> Characterizing the discussion behind this as "sophistry" is a fundamentally unserious take

What a joke. Sorry, but no. I don't think it is unserious at all. What's unserious is saying this.

> and the training material was obtained legally

And assuming everyone should take it at face value. I hope you understand that going on a tech forum and telling people they aren't being nuanced because a Judge in Alabama that can barely unlock their phone weighed in on a massively novel technology with global implications, yes, reads deeply unserious. We're aware the U.S. legal system is a failure and the rest of the world suffers for it. Even your President routinely steals music for campaign events, and stole code for Truth Social. Your copyright is a joke that's only there to serve the fattest wallets.

These judges are not elected, they are appointed by people whose pockets are lined by these very corporations. They don't serve us, they are here to retrofit the law to make illegal things corporations do, legal. What you wrote is thought terminating.

rpdillon 7 hours ago | parent [-]

What I wrote is an encouragement to investigate the actual state of the law when you're talking about legal topics. That's the opposite of thought-terminating.

jama211 8 hours ago | parent | prev [-]

*in your opinion

cmrdporcupine 11 hours ago | parent | prev | next [-]

> and I will produce no more

Nah, don't do that. Produce shitloads of it using the very same LLM tools that ripped you off, but license it under the GPL.

If they're going to thief GPL software, least we can do is thief it back.

naasking 11 hours ago | parent | prev | next [-]

Why? The core vision of free software and many open source licenses was to empower users and developers to make things they need without being financially extorted, to avoid having users locked in to proprietary systems, to enable interoperability, and to share knowledge. GenAI permits all of this to a level beyond just providing source code.

Most objections like yours are couched in language about principles, but ultimately seem to be about ego. That's not always bad, but I'm not sure why it should be compelling compared to the public good that these systems might ultimately enable.

mrcwinn 10 hours ago | parent | prev | next [-]

Was it ever open source if there was an implied refusal to create something you don't approve of? Was it only for certain kinds of software, certain kinds of creators? If there was some kind of implicit approval process or consent requirement, did you publish it? Where can that be reviewed?

gtirloni 11 hours ago | parent | prev | next [-]

> and I will produce no more

Thanks for your contributions so far but this won't change anything.

If you want to have a positive impact on this matter, it's better to pressure the government(s) to prevent GenAI companies from using content they don't have a license for, so they behave like any other business that came before them.

maplethorpe 11 hours ago | parent | prev [-]

What people like Rob Pike don't understand is that the technology wouldn't be possible at all if creators needed to be compensated. Would you really choose a future where creators were compensated fairly, but ChatGPT didn't exist?

user____name 11 hours ago | parent | next [-]

> What people like Abraham Lincoln don't understand is that the technology wouldn't be possible at all if slaves needed to be compensated. Would you really choose a future where slaves were compensated fairly, but plantations didn't exist?

I fixed it... Sorry, I had to, the quote template was simply too good.

MonkeyClub 11 hours ago | parent | prev | next [-]

"Too expensive to do it legally" doesn't really stand up as an argument.

alpha_squared 11 hours ago | parent | prev | next [-]

Unequivocally, yes. There are plenty of "useful" things that can come out of doing unethical things, but that doesn't make it okay. And, arguably, ChatGPT isn't nearly as useful as it is good at convincing you it is.

rkomorn 11 hours ago | parent | prev | next [-]

Absolutely. Was this supposed to be some kind of gotcha?

makerofthings 6 hours ago | parent | prev | next [-]

Yes, very much so. I am in favour of pushing into the future as fast as we can, so to speak, but I think ChatGPT is a temporary boost that is going to slow us in the long run.

kentm 8 hours ago | parent | prev | next [-]

> Would you really choose a future where creators were compensated fairly, but ChatGPT didn't exist?

Yes.

I don't see how "We couldn't do this cool thing if we didn't throw away ethics!" is a reasonable argument. That is a hell of a thing to write out.

tedious-coder 9 hours ago | parent | prev | next [-]

Yes, what a wild position to prefer the job loss, devaluation of skills, and environmental toll of AI to open source creators having been compensated in some better manner.

caem 11 hours ago | parent | prev | next [-]

That would be like being able to keep my cake and eat it too. Of course I would. Surely you're being sarcastic?

Trasmatta 11 hours ago | parent | prev | next [-]

Very much yes, how can I opt into that timeline?

kenferry 11 hours ago | parent | prev | next [-]

Uh, yeah, he clearly would prefer it didn’t exist even if he was compensated.

dmd 11 hours ago | parent | prev | next [-]

Er... yes? Obviously? What are you even asking?

Xiol 11 hours ago | parent | prev | next [-]

Yes.

nocman 11 hours ago | parent | prev | next [-]

Um, please let your comment be sarcastic. It is ... right?

adhamsalama 8 hours ago | parent | prev | next [-]

Yes.

monsieurgaufre 9 hours ago | parent | prev | next [-]

Yes.

metronomer 11 hours ago | parent | prev | next [-]

Well yeah.
