kay_o 2 hours ago
> However, there are lots of people in the world who live their whole life by vibing

Why are they often so desperate to lie and non-consensually harass others with their vibing rather than be honest about it? Why do they think they are "helping" with hallucinated rubbish that can't even build?

I use LLMs. It is not difficult to: ethically disclose your use, double-check all of your work, ensure things compile without errors, not lie to others, not ask it to generate ten paragraphs of rubbish when the answer is one sentence, and respect the project's guidelines. But for so many people this seems like an impossible task.
automatic6131 2 hours ago
> Why do they think they are "helping" with hallucinated rubbish that can't even build?

Because they can't tell the difference between what the machine is outputting and what people have built. All they see is the superficial resemblance (long lines of incomprehensible code) and the reward that the people writing the code have got, and they want that reward too.
pjc50 2 hours ago
"Main character energy". What they're really doing is protecting their view of themselves as smart, and they're contributing in order to perform being an OSS dev rather than out of need or altruism. AI is absolutely terrible for people like that, as it's the perfect enabler.
drchickensalad 2 hours ago
You're asking why oil doesn't act like water. It's not really an impossible task; it's just not one they agree with.
MattDaEskimo 26 minutes ago
I wonder how many are account farming.
ramon156 2 hours ago
It's the same as cheating in a game. You are given an """advantage""", so lying about it seems like the best option.
jcgrillo an hour ago
LLMs are in this case enabling bad behavior, but open source software has always been vulnerable to this. The people who use LLMs to do this kind of thing are the kind of people who would have done it without LLMs, but for the large effort it would have taken. We're just learning now how large that group is. This is a good thing: it's an opportunity to make open source development processes robust to this kind of sabotage.