BobaFloutist a day ago

> if you use LLMs to make bombs or spam hate speech, you’re responsible.

What if LLMs make it sufficiently easier to make bombs or spam hate speech that the resulting volume DDoSes law enforcement and the other mechanisms that otherwise prevent bombings and harassment? Is there any place for regulation limiting the availability or capabilities of tools that make crimes vastly easier and more accessible than they would otherwise be?
3np 21 hours ago

The same argument could be made about computers. Do you prefer a society where CPUs are regulated like guns and you can't buy anything freer than an iPhone off the shelf?
BriggyDwiggs42 20 hours ago

I mean, this stuff is already easy to do. An extremist doesn’t even need to make a bomb; he or she already drives a car that can kill many people. In the US it’s easy to get a firearm that could do the same. If capacity plus randomness were a sufficient model of human behavior, we’d never gather in crowds, since a solid minority would be rammed, shot up, bombed, etc. People don’t want to do that stuff; that’s our real security. We can prevent some of the most egregious cases with censorship and banning, but what actually works is the fuzzy shit: giving people opportunities, social connections, and so on, so they don’t fall into extremism.