Epitaque 5 hours ago
For the sake of me seeing if people like you understand the other side, can you try steelmanning the argument that open weight AI can allow bad actors to cause a lot of harm?
wavemode an hour ago
The steelman argument is that super-intelligent AGI could allow any random person to build destructive technology, so companies on the path toward creating it ought to be very careful about alignment, safety, and, indeed, access to weights.

The obvious assumed premise of this argument is that Anthropic is actually on the path toward creating super-intelligent AGI. Many people, myself included, are skeptical of this. (In fact I would go further: in my opinion, cosplaying as though their AI is so intelligent that it's dangerous has become a marketing campaign for Anthropic, and their rhetoric around this topic should usually be taken with a grain of salt.)
thenewnewguy 4 hours ago
I would not consider myself an expert on LLMs, at least not compared to the people who actually create them at companies like Anthropic, but I can have a go at a steelman:

LLMs allow hostile actors to do wide-scale damage to society by significantly decreasing the marginal cost, and increasing the ease, of spreading misinformation, propaganda, and other fake content. While this was already possible before, it required creating large troll farms of real people, semi-specialized skills like Photoshop, etc. I personally don't believe that AGI/ASI is possible through LLMs, but if you do, that would magnify the potential damage tenfold.

Closed-weight LLMs can be controlled to prevent, or at least reduce, the harmful actions they are used for. Even if you don't trust Anthropic to do this alone, they are a large company beholden to the law, and the government can audit their performance. A criminal or hostile nation state downloading an open-weight LLM is not going to care about the law.

This would not be a particularly novel idea; a similar reality already holds for other products and services that can be used to do widespread harm. Google "Invention Secrecy Act".
10xDev 5 hours ago
"please do all the work to argue my position so I don't have to".