STRiDEX 6 hours ago

Dumb question, but why are chemical weapons always cited as a risk with LLMs? Is the idea that they contain instructions for making chemical weapons, or that they would guide someone through the process?

Would there not already be websites that contain that information? How is an LLM different, I guess, from some sort of anarchist cookbook thing?

Philpax 6 hours ago

Both. There's the risk of them instructing a user on how to produce a known formulation (the Anarchist Cookbook solution, as you say), which is irritating but not that problematic.

The bigger issue is that they are potentially capable of devising novel harmful formulations, and guiding someone through the production process. That is, consider a world in which someone with malicious desires has access to a model as capable at chemistry / biology as Mythos is at offensive cybersecurity.

This is obviously limited by the fact that the models don't operate in the physical world, but there's plenty of written material out there.

rogerrogerr 6 hours ago

The world has been blessed by two connected things:

1. Smart people have economic opportunities that align them away from being evil

2. People who are evil tend not to be smart.


We're breaking both of these assumptions.

chrisweekly 5 hours ago

"Smart people have economic opportunities that align them away from being evil"

For some definition of evil, some of the time, OK. But as economic opportunities compound (look at the behavior of the ultra-rich), there seems to be at least a strong correlation in the other direction, if not full-on "root of all evil" causation.

rogerrogerr 5 hours ago

Sure, but that’s not “slaughter a stadium of people with drones” evil or “poison the water supply” evil or “take out unprotected electrical substations” evil.

So much infrastructure is very soft because the evil people aren't smart enough to conceive of or carry out an attack.

fwip 4 hours ago

I think, if you reconsider who the 'evil' people are, you might find that we're already doing that sort of thing.

Der_Einzige 6 hours ago

Good. This is how we will force the world to reckon with the isolated, the disgruntled, and "lone wolf" terrorists. Real "sigma males" actually exist, and when they decide "society has to pay," we are all worse off for it. If Ted Kaczynski (the quintessential example of a real, actual sigma) had been in his prime and operating right now, he'd have mail-bombed NeurIPS and ICLR already. I'm not comfortable being in crowds of AI professionals right now, for physical security reasons, given the extreme anti-AI sentiment from nearly everyone outside the valley: https://jonready.com/blog/posts/everyone-in-seattle-hates-ai...

malcolmgreaves 5 hours ago

That’s not quite true. Take a look at all the billionaires destroying society. Being evil is the surest way to get rich. In fact, it’s the only way to amass that level of capital: there’s no ethical billionaire.

mikek 4 hours ago

This feels like a wild overgeneralization. People can become rich without resorting to evil methods, especially now with global markets and software. Case in point: Minecraft was wildly successful, and now Notch is a billionaire.

hxugufjfjf 4 hours ago

Eeeeh, not the best example, maybe?

orneryostrich 4 hours ago

Pre-wealth, Notch was friendly, kind, and downright jolly! Even as he started to accumulate wealth, he was donating huge sums of money to various indie games. Whenever a Humble Bundle dropped, he would top the leaderboard for the amount he paid. Things took a major turn for the worse after the acquisition, once he left Mojang. That's when he ran out of purpose and turned to drugs and conspiracy theories.

dcre 5 hours ago

LLMs can tell you exactly how to acquire the materials and how to manufacture the agent. They might even come up with novel formulations that rely on substances that are easier to get. There may be information about this stuff online, but LLMs are much better than random idiots at adapting that information to their actual situation.

On top of LLMs reducing the cost and difficulty, the other reason biological and chemical weapons are such a worry is their asymmetric character: they are much, much easier and cheaper to produce and deploy than they are to defend against.

somesortofthing 4 hours ago

They contain broad overviews (throw some disease-causing bacteria at a sort of rainbow arrangement of increasingly effective antibiotics and you'll usually get something that's at least very deadly, even if it doesn't have pandemic potential), but executing in a real lab takes a ton of trial and error to figure out the details. The issue is that the details ~all exist somewhere in the training dataset already, discovered and documented over the course of unrelated, benign biology research. The ability to quickly and accurately search over that corpus translates to large speedups in the physical development process.

Aboutplants 4 hours ago

It’s marketing. Fear is one of the most effective marketing tools. That, and it serves the purpose of attracting government attention.

Nicook 2 hours ago

Probably also a bit of liability. After all, it's been trained on a dataset that includes a long-running joke of trying to trick people on the internet into unknowingly creating chlorine gas.

rgbrenner 6 hours ago

In the same way that all coding docs are available publicly.

CodingJeebus 6 hours ago

WAG but I wonder if a hijacked LLM could also assist with figuring out how to obtain required materials, not just provide the recipe.