| ▲ | _alternator_ 4 hours ago |
What about people who want help building a bio weapon? | ||||||||
| ▲ | sgjohnson 3 hours ago | parent | next |
The cat is out of the bag and there’s no defense against that. There are several open source models with no built-in (or trivial to escape) safeguards. Of course they can afford that, because they are non-commercial; Anthropic can’t afford a headline like “Claude helped a terrorist build a bomb”.

And this whataboutism is completely meaningless. See P. A. Luty’s Expedient Homemade Firearms (https://en.wikipedia.org/wiki/Philip_Luty), or the FGC-9 for 3D printing. It’s trivial to build guns or bombs, and there’s a strong inverse correlation between wanting to cause mass harm and being willing to learn how to do so. I’m certain that _everyone_ seeking AI assistance, even with your example, would either be learning about it for academic reasons or out of sheer curiosity, or would kill themselves in the process.

“What safeguards should LLMs have?” is the wrong question. Models with no safeguards at all are an inevitability; perhaps not among widespread commercial products, but definitely among widely accessible ones.
| ▲ | jazzyjackson 4 hours ago | parent | prev | next |
What about libraries and universities that do a much better job than a chatbot at teaching chemistry and biology? | ||||||||
| ▲ | ReptileMan 4 hours ago | parent | prev |
The chances of them surviving the process are zero; same with explosives. If you have to ask, you are most likely either to kill yourself in the process or to achieve something harmless. Think of it this way: the hard part of a nuclear device is enriching the uranium. If you have that, a chimp could build the bomb.