| ▲ | simonw 12 hours ago |
| You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world. It will however reduce the positive impact your open source contributions have on the world to 0. I don't understand the ethical framework for this decision at all. |
|
| ▲ | lunar_mycroft 10 hours ago | parent | next [-] |
| > You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.
|
| There's also plenty of other open source contributors in the world.
|
| > It will however reduce the positive impact your open source contributions have on the world to 0.
|
| And it will reduce your negative impact through helping to train AI models to 0. The value of your open source contributions to the ecosystem is roughly proportional to the value they provide to LLM makers as training data. Any argument you could make that one is negligible would also apply to the other, and vice versa. |
|
| ▲ | blibble 10 hours ago | parent | prev | next [-] |
| > You refusing to write open source will do nothing to slow the development of AI models - there's plenty of other training data in the world.
|
| if that were true, then the parasites could remove ALL code whose license requires attribution
|
| oh, they won't? I wonder why |
|
| ▲ | bwfan123 9 hours ago | parent | prev | next [-] |
| > there's plenty of other training data in the world.
|
| Not if most of it is machine generated. The machine would start eating its own shit. The nutrition it gets is from human-generated content.
|
| > I don't understand the ethical framework for this decision at all.
|
| The question is not one of ethics but one of incentives. People who produce open source are incentivized in a certain way, and it is abhorrent to them when that framework is violated. There needs to be a new license that explicitly forbids use for AI training. That might encourage folks to continue contributing. |
| |
| ▲ | azakai 9 hours ago | parent [-] | | Saying people shouldn't create open source code because AI will learn from it is like saying people shouldn't create art because AI will learn from it. In both cases I get the frustration - it feels horrible to see something you created be used in a way you think is harmful and wrong! - but the world would be a worse place without art or open source. |
|
|
| ▲ | Juliate 11 hours ago | parent | prev | next [-] |
| The ethical framework is simply this: what is the worth of doing +1 for everyone, if the very thing you wish didn't exist (because you believe it is destroying the world) benefits 10x more from it? If bringing fire to a species lights and warms it, but also gives some of its members the means and incentives to burn everything for good, you have every ethical freedom to ponder whether you contribute to that fire or not. |
| |
| ▲ | simonw 11 hours ago | parent [-] | | I don't think that a 10x estimate is credible. If it were, I'd understand the ethical argument being made here, but I'm confident that excluding one person's open source code from training has an infinitesimally small impact on the abilities of the resulting model. For your fire example, there's a difference between being Prometheus teaching humans to use fire and being a random villager who adds a twig to an existing campfire. I'd say the open source contributions example here is more the latter than the former. | | |
| ▲ | simonwsays 10 hours ago | parent | next [-] | | Your argument applies to everything that requires a mass movement to change. Why do anything about the climate? Why do anything about civil rights? Why do anything about poverty? Why try to make any change? I'm just one person; anything I could do couldn't possibly have any effect. You know what, since all the powerful interests say it's good, it's a lot easier to jump on the bandwagon and act like it is. All of those people who disagree are just Luddites anyway. And the Luddites didn't even have a point, right? They were just idiots who hated metallic devices for no reason at all. | |
| ▲ | Juliate 10 hours ago | parent | prev [-] | | The ethical issue is consent and normalisation: asking individuals to donate to a system they believe is undermining their livelihood and the commons they depend on, while the amplified value is captured somewhere else. "It barely changes the model" is an engineering claim. It does not imply "therefore it may be taken without consent or compensation" (an ethical claim), nor "therefore it has no meaningful impact on the contributor or their community" (a moral claim). |
|
|
|
| ▲ | bgwalter 11 hours ago | parent | prev | next [-] |
| Guilt-tripping people into providing more fodder for the machine. That is really something else. I'm not surprised that you don't understand ethics. |
| |
| ▲ | simonw 11 hours ago | parent | next [-] | | I'm trying to guilt-trip them into using their skills to improve the world through continuing to release open source software. I couldn't care less if their code was used to train AI - in fact I'd rather it wasn't since they don't want it to be used for that. | | |
| ▲ | blibble 10 hours ago | parent | next [-] | | given the "AI" industry's long-term goals, I see contributing in any way to generative "AI" as deeply unethical, bordering on evil - which is the exact opposite of improving the world. you can extrapolate what I think of YOUR actions | | |
| ▲ | simonw 8 hours ago | parent | next [-] | | I imagine you think I'm an accelerant of all of this, through my efforts to teach people what it can and cannot do and to provide tools that help them use it. My position on all of this is that the technology isn't going to be uninvented, and I very much doubt it will be legislated away, which means the best thing we can do is promote the positive uses and disincentivize the negative uses as much as possible. | |
| ▲ | trinsic2 3 hours ago | parent | next [-] | | You know, I'm realizing in my head that I'm comparing this to Nazism and Hitler. I'm sure many people thought he was bringing change to the world, and since it was going to happen anyway, we should all get on board with it. In the end there was a reckoning. IMHO there are going to be consequences from these negative effects, regardless of the positives. Looking at it in this light, you might want to get out now, while you still can. I'm sure it's going to continue, and it's not going to be legislated away, but it's still wrong to be using this technology the way it's being used right now, and I will not be associated with the harmful effects it is being used for just because a few corporations feel justified in pushing evil onto the world wrapped in positives. | |
| ▲ | blibble 3 hours ago | parent | prev [-] | | I don't see you as an accelerant. they're using your exceptional reputation as an open-source developer to push their proprietary, parasitic products and business models, with you thinking you're doing good. I don't mean to be rude, but I suspect "useful idiot" is probably the term they use for open source influencers in meetings discussing early access |
| |
| ▲ | bryan_w 9 hours ago | parent | prev [-] | | Your post, full of well-formed English sentences, is also going to contribute to generative AI, so thanks for that. | |
| ▲ | blibble 9 hours ago | parent [-] | | oh I've thought of that :) my comments on the internet are now almost exclusively anti-"AI", and anti-bigtech |
|
| |
| |
|
|