moralestapia 11 hours ago

* sigh *

Three things:

* Delaying the release accomplishes nothing.

* The barrier to someone building/not-building a bioweapon in their backyard is not access to an LLM.

* Remember when GPT 3.5 was going to destroy the world? And how it was conscious? And how it was "trying to escape"? Lmao.

malfist 11 hours ago

I think GPT 3.5 might have destroyed the world.

usaar333 11 hours ago

How does delaying the release not accomplish anything? It puts everyone on notice to fix all security vulnerabilities now.

spooneybarger 11 hours ago

Because the only thing keeping those vulnerabilities in existence was laziness.

anon84873628 10 hours ago

"laziness" is an interesting reframing of "rational cost-benefit analysis and the limits of the human mind".

fwipsy 10 hours ago

You're right, it's silly for me to worry. We've never had a technology that initially appeared benign but turned into a big problem. In fact, no tech company has ever released technologies that cause problems for the rest of society AT ALL. /s

What are the other barriers? Last I checked access to CRISPR is not especially tightly regulated. Even if it is, defense in depth is a thing.

moralestapia 10 hours ago

If it were as easy as "knowing how to," someone would've already done it, or at least attempted to.*

Plenty of people know how: tens of thousands of researchers. Perhaps you even know someone who does.

Did you know that your local veterinary shop stocks enough drugs to kill hundreds of people?

Why doesn't it happen?

* It's not that easy.

* There's a ton of regulation that is hard to circumvent, on purpose.

* There's a gigantic deterrent called "spend the rest of your life behind bars" that people tend to avoid.

An LLM, even the most advanced one, does not materially change any of these. You cannot bullshit your way into "uhh, I need Ebola samples for ... reasons."

Unironically, your Sunday movie portraying a supervillain jeopardizing a city with his "home lab" full of flasks of colored liquids and biohazard signs pushes way more people toward interest in this than access to an LLM does.

*: Okay, like 5 people, and well before LLMs were a thing. This has been possible for decades; we're fine.

fwipsy 4 hours ago

CRISPR has not been a thing for decades. Biotechnology is advancing and AI is lowering the bar to use it. In 2018 a PhD student was able to synthesize an infectious horsepox virus: https://journals.plos.org/plosone/article?id=10.1371/journal...

So far the overlap between people with bioengineering capabilities and murderous tendencies has been very low. As the technology becomes available to more people, that overlap may increase. Even if it never comes within reach of a lone individual, what about North Korea, or Iran?

AI can be jailbroken. The LLM safeguards your argument relies on were put in place by the people you're criticizing for being too safety-conscious. Security through obscurity is no guarantee.