| ▲ | kstenerud 4 hours ago |
| > The giant plagiarism machines have already stolen everything. Copyright is dead. Licenses are washed away in clean rooms. Isn't this what the free software movement wanted? Code available to all? Yes, code is cheap now. That's the new reality. Your value lies elsewhere. You can lament the loss of your usefulness as a horse buggy mechanic, or you can adapt your knowledge and experience and use it towards those newfangled automobiles. |
|
| ▲ | probably_wrong 3 hours ago | parent | next [-] |
| > Isn't this what the free software movement wanted? Code available to all? But this is not that. The current situation is closer to "what's yours is mine and what's mine is mine". I have been releasing my writings under a Creative Commons Attribution-ShareAlike license, which requires attribution and that anything built upon the material be distributed "under the same license as the original". And yet I have no access to OpenAI's built-upon material (I know for a fact they scrape my posts) while they get my data for free. This is so far legal, but it's probably not ethical and definitely not what the free software movement wanted. |
|
| ▲ | lmm 4 hours ago | parent | prev | next [-] |
| > Isn't this what the free software movement wanted? Code available to all? Available to all yes. Not available to the giant corpos while the lone hobbyist still fears getting sued to oblivion. In fact that's pretty much the opposite of what the free software movement wanted. Also the other thing the free software movement wanted was to be able to fix bugs in the code they had to use, which AI is pulling us further and further away from. |
|
| ▲ | mmustapic 4 hours ago | parent | prev | next [-] |
| No, the free software movement wants the source code of the software you use to be available to you, so you can modify it if you wish. AI does not necessarily do that. |
| |
| ▲ | kstenerud 4 hours ago | parent [-] | | AI makes the entirety of the software engineering profession available to you. All you have to do is ask the right way, and you can build in days what once took months or years. Decompiling and re-engineering proprietary code has never been easier. You almost don't even need the source code anymore. The object code can be examined by your LLM, and binary patches applied. Closed source is no longer the moat it was, and so keeping the source code to yourself is only going to hurt you as people pass you over for companies who realize this, and strive to make it easier for your LLM to figure their systems out. | | |
| ▲ | mmustapic 4 hours ago | parent | next [-] | | But I can't have the weights of the LLM model I'm using for this. | | | |
| ▲ | doctorwho42 an hour ago | parent | prev | next [-] | | Reengineering from scratch is different from being able to modify an existing piece of software. | |
| ▲ | Arkhaine_kupo 4 hours ago | parent | prev [-] | | > Decompiling and re-engineering proprietary code has never been easier. You almost don't even need the source code anymore. The object code can be examined by your LLM, and binary patches applied. Jesus Christ. "The people who wanted everyone to have a home should be happy with the invention of the lockpick. You can just find a nice house, open the lock, and move in. Ignore the lockpick company charging essentially whatever it wants for lockpicks, or how it got access to everyone's keyfob, or the danger of someone breaking into your house." That is basically your argument. AI is a copyright theft machine, with companies owning the entire stack and being able to take it away at will, and committing crimes like decompiling code instead of clean-room reverse engineering is not a selling point either... The open source community wants people to upskill, people to become tech literate, free solutions that grow organically out of people who care, features the community needs and wants, and people having the freedom to modify that code to solve their own circumstances. | | |
| ▲ | Supermancho 4 hours ago | parent | next [-] | | > That is basically your argument. AI is a copyright theft machine, with companies owning the entire stack and being able to take it away at will, and committing crimes like decompiling code instead of clean-room reverse engineering is not a selling point either... Stop trying to make this into some abstract argument. It's not an argument anymore. It's already happened. How one might choose to characterize the reality is irrelevant. A vast (and growing) amount of source code is more open, for better or worse. Granted, this is to the chagrin of subgroups that had been pushing different strategies. | | |
| ▲ | simoncion 4 hours ago | parent | next [-] | | > It's already happened. Agreed. > Stop trying to make this into some abstract argument. As you mentioned, it's not an abstract argument. It's statements of fact. > A vast (and growing) amount of source code is more open... No, not at all. 1) If you honestly believe that major tech companies will permit both copyright- and license-washing of their most important proprietary code simply because someone ran it through an LLM, you're quite the fool. If someone "trained" an LLM on -say- both Windows 11 and ReactOS, and then used that to produce "ReactDoze" while being honest about how it was produced, Microsoft would permanently nail them to the wall. 2) The LLMs that were trained on the entirety of The Internet are very, very much not open. If "Open"AI and Anthropic were making available the input data, the programs and procedures used to process that data, and all the other software, input data, and procedures required to reproduce their work, then one could reasonably entertain the claim that the system produced was open. | | |
| ▲ | kstenerud 3 hours ago | parent [-] | | This is looking at the current situation through the old lens. That ship has sailed. The revolution is happening. We live in a new reality now, one where we're still trying to figure out what the rules should even be. There will be winners and losers, and copyright and patent law will be modified in an attempt to tame the chaos, with mixed results because of all the powerful players on both ends. You can live at the front of it for high risk/reward, or at the back for safety. But either way, you're going to exist in this new reality and you need to decide your risk appetite. | | |
| |
| ▲ | Arkhaine_kupo 4 hours ago | parent | prev [-] | | > Stop trying to make this into some abstract argument. It's not an argument anymore. It's already happened. Yes, and lockpicks also exist. Promoting the ability to break into homes while people are talking about the housing crisis is a crazy, short-sighted and frankly embarrassing position to take. And mischaracterising the people in the open source community as belonging to that ideology is insulting. > A vast (and growing) amount of source code is more open You are misusing the word open here, for accessible. Having an open house and breaking into someone's home are not the same thing, even if the door ends up open either way. > Granted, this is to the chagrin of subgroups that had been pushing different strategies. Taking unethical shortcuts that ultimately lead to an even worse outcome is not a cause of chagrin; it's a cause of deep and utter terror and embarrassment. Wanting people to own their skills and tech stack and be informed, smart and engaged is a goal that "just ask the robot you don't control to break into a corporate codebase and copy it" is not even remotely close to helping achieve. |
| |
| ▲ | satvikpendem 2 hours ago | parent | prev [-] | | This argument commits the same fallacy as the argument against piracy: copying is not stealing, because the original still remains. A lockpicked and squatted house means someone else does not have that house; it's a zero-sum game, which freely copyable information is not. | |
| ▲ | Arkhaine_kupo 5 minutes ago | parent [-] | | That only works if you assume that the exclusive value is in the object and not the labour. The reproduction of the object is essentially free on the internet, but the labour to produce it isn't. If I spent 3 years making my codebase and you copy-paste the git repo, yeah, your access to the information is not going to replace the original. But your labour cost is 0 and you can undercut the 3 years of expense, loans or debt I acquired to produce it. Btw, the FBI hounded Aaron Swartz to his death for attempting to open access to research papers, while Mark Zuckerberg admitted to taking those same papers through libgen, showed off the results in Llama, and his stock price went up. I think the piracy argument falls apart when class warfare and a two-tier justice system are openly weaponised against open access. |
|
|
|
|
|
| ▲ | sdevonoes 4 hours ago | parent | prev | next [-] |
| Progress is good. But why on earth should we support Anthropic/OpenAI/etc? What the planet needs is fewer multibillion-dollar corporations, not more. |
| |
| ▲ | kstenerud 4 hours ago | parent | next [-] | | You don't have to. Just like you don't have to support Amazon for web services and file stores. Or Oracle for databases. Or Microsoft for operating systems. Or DEC for computers. There are perfectly good open source LLMs and agents out there, which are getting better by the day (especially after the recent leak!) | |
| ▲ | farfatched 4 hours ago | parent | prev [-] | | I want to support local models and compute over SaaS models. I want to support RISC V over Intel. I want other things too, and on balance, Intel+Anthropic is most compliant with my various preferences, even if they're not perfect. |
|
|
| ▲ | toofy 2 hours ago | parent | prev | next [-] |
| i can say with a pretty high confidence level that few people in the free software movement want the closed-off black boxes these companies are locking away. they're not free in any sense of the word, from price to openness of the models. would openai cry if every bit of their models were wide open for us to use however we see fit? if so, then it's not free, again, by any definition of the word. |
|
| ▲ | 4 hours ago | parent | prev [-] |
| [deleted] |