| ▲ | pjc50 3 hours ago |
| This seems like a "we've banned you and will ban any account deemed to be ban-evading" situation. OSS and the whole culture of open PRs requires a certain assumption of good faith, which is not something that an AI is capable of on its own and is not a privilege which should be granted to AI operators. I suspect the culture will have to retreat back behind the gates at some point, which will be very sad and shrink it further. |
|
| ▲ | bayindirh 3 hours ago | parent | next [-] |
| > I suspect the culture will have to retreat back behind the gates at some point, which will be very sad and shrink it further. I'm personally contemplating not publishing the code I write anymore. The things I write are not world-changing and are GPLv3+ licensed only, but I was putting them out just in case somebody would find them useful. However, I don't want my code scraped and remixed by AI systems. Since I'm doing this for personal fun and utility, who cares about my code being in the open? I can just write and use it myself. Putting it out there for humans to find was fun while it lasted. Now everything is up for grabs, and I don't play that game. |
| |
| ▲ | 20k 3 hours ago | parent [-] | | It's astonishing the way that we've just accepted mass theft of copyright. There appears to be no way to stop AI companies from stealing your work and selling it on for profit. On the plus side: it only takes a small fraction of people deliberately poisoning their work to significantly lower the quality, so perhaps consider publishing it with deliberate AI poisoning built in. | | |
| ▲ | thephyber 2 hours ago | parent | next [-] | | In practice, the real issue is how slow and subjective the legal enforcement of copyright is. The difference between copyright theft and copyright derivatives is subjective and takes a judge/jury to decide. There’s zero possibility the legal system can handle the bandwidth required to solve the volume of potential violations. This is all downstream of the default of “innocent until proven guilty”, which vastly benefits us all. I’m willing to hear out your ideas to improve on the situation. | |
| ▲ | jbreckmckye 3 hours ago | parent | prev | next [-] | | Would publishing under AGPL count as poisoning? Or even with an explicit "this is not licensed" license | | |
| ▲ | thephyber 2 hours ago | parent | next [-] | | Your licensing only matters if you are willing to enforce it. That costs lawyer money and a will to spend your time. This won't be solved by individuals withholding their content. Everything you have already contributed to (including GitHub, StackOverflow, etc.) has already been trained on. The most powerful thing we can do is band together, lobby Congress, and get intellectual property laws changed to support Americans. There's no way the courts have the bandwidth to handle this case by case. | |
| ▲ | 2 hours ago | parent | prev [-] | | [deleted] |
| |
| ▲ | pjc50 2 hours ago | parent | prev [-] | | Eh, the Internet has always been kinda pro-piracy. We've just ended up with the inverse situation where if you're an individual doing it you will be punished (Aaron Swartz), but if you're a corporation doing it at a sufficiently large scale with a thin figleaf it's fine. | | |
| ▲ | bayindirh 2 hours ago | parent [-] | | While it was pro-piracy, nobody deliberately closed GPL or MIT code, because there was an unwritten ethical agreement between everyone, and that agreement had benefits for everyone. The batch spoiled when companies started to abuse developers and their MIT code for exposure points and cookies. ...and here we are. |
|
|
|
|
| ▲ | functionmouse 2 hours ago | parent | prev | next [-] |
| Better my gates than Bill Gates. The moment Microsoft bought GitHub, it was over. |
|
| ▲ | jcgrillo 2 hours ago | parent | prev | next [-] |
| The tooling amplifies the problem. I've become increasingly skeptical of the "open contributions" model GitHub and its ilk default to. I'd rather the tooling default to "look but don't touch": fully gatekept. If I want someone to collaborate with me, I'll reach out to that person and solicit their assistance in the form of pull requests or bug reports. I absolutely never want random internet entities "helping". Developing in the open seems like a great way to do software; developing with an "open team" seems like the absolute worst. We are careful when we choose colleagues; we test them, interview them... so why would we let just anyone start slinging trash at our code review tools and issue trackers? A well-kept gate keeps the rabble out. |
|
| ▲ | casey2 2 hours ago | parent | prev | next [-] |
| We have webs of trust; just swap router/packet with PID/PR. Then the maintainer can see something like 10-1 accepted/rejected for the first layer (direct friends), 1000-40 for layer two (friends of friends), and so on. Then you can directly message any public ID or see any PR. This can help agents too: since they can see that all their agent buddies have a 0% success rate, they won't bother. |
|
| ▲ | midnitewarrior 3 hours ago | parent | prev [-] |
| Do that and the AI might fork the repo, address all the outstanding issues and split your users. The code quality may not be there now, but it will be soon. |
| |
| ▲ | sethops1 3 hours ago | parent | next [-] | | This is a fantasy that virtually never comes to fruition. The vast majority of forks are dead within weeks when the forkers realize how much effort goes into building and maintaining the project, on top of starting with zero users. | | |
| ▲ | dormento an hour ago | parent | next [-] | | This might be true today, but think about it. This is a new scenario, where a giga-brain-sized <insert_role_here> works tirelessly 24/7 improving code. Imagine it starts to fork repos. Imagine it can eventually outpace human contributors, not only in volume (which it already can), but in attention to detail and usefulness of the resulting code. Now imagine the forks overtake the original projects. This is not just "Will Smith eating spaghetti", it's a real breaking point. I'm equal parts frightened and amazed. | |
| ▲ | thephyber 2 hours ago | parent | prev | next [-] | | While true, there are projects which surmount these hurdles because the people involved realize how important the project is. Given projects which are important enough, the bots will organize and coordinate. This is how that Anthropic developer got several agents to work in parallel to write a C compiler in Rust; granted, he created the coordination framework. | |
| ▲ | newswasboring 2 hours ago | parent | prev [-] | | I think the difference now (if code quality is solved with LLMs) is that the cost of effort is approaching zero. | | |
| ▲ | PurpleRamen 26 minutes ago | parent [-] | | Good enough AI is not cheap (yet). So at the moment it's more a scenario for people who are rich enough. Though small projects with little maintenance burden might be at risk here. But thinking about it, this might be a new danger that gets us into another xz-utils situation. The big malicious actors have enough money to waste and can scale up the number of projects they attack and hijack, or even build themselves. |
|
| |
| ▲ | bayindirh an hour ago | parent | prev | next [-] | | > The code quality may not be there now, but it will be soon. I've been hearing this exact argument since 2002 or so. Even Duke Nukem Forever was released in that time frame. I bet even Tesla might solve its Autopilot(TM) problems before this becomes a plausible reality. | |
| ▲ | acedTrex an hour ago | parent | prev [-] | | I am perfectly willing to take that risk. Hell, I'll even throw ten bucks on it while we're here. |
|