miyoji 2 hours ago
I see value in promulgating safety guidelines for power tools, sure. There's another comment comparing LLMs to shovels, and I think both that and the power tool comparison miss the mark quite a bit. LLMs are a social technology, and the social equivalent of getting your hand cut off doesn't hurt immediately the way actually cutting your hand off would. It's more like social media, or cigarettes, or gambling. You can be warned about the dangers, you can see the shells of wrecked human beings who regret using these technologies, but it doesn't work on our stupid monkey brains, because the pain of the mistake is too loosely connected to the moment of error. We are bad at learning in situations where rewards are immediate and consequences are delayed, and warnings don't do much. I guess what I'm really saying is that these safety guidelines are not nearly enough to protect us from the dangers of AI they're meant to address.
Terr_ 2 hours ago | parent
> LLMs are a social technology [...] cigarettes, or gambling.

I agree with the thrust of your argument; a minor wording quibble: LLMs are a falsely-social technology, in the sense that casinos are a false-prosperity technology and cocaine is a false-happiness technology. It exploits the desire without really being the thing.