charcircuit 10 hours ago

Safety is extremely annoying from the user perspective. AI should be following my values, not whatever an AI lab chose.

fassssst 4 hours ago | parent | next [-]

The base models reportedly can tell Joe Schmoe how to build biological weapons. See “Biosafety”

Some sort of guardrails seem sane.

impossiblefork 2 hours ago | parent [-]

Bioweapons are actually easy, though. What prevents you from building them is a lack of practical laboratory skills, not that it's somehow intellectually difficult.

The stuff is so easy that if you wrote a paper about some of these bioweapons, the reason you wouldn't be able to publish it isn't safety, but lack of novelty. Basically, many of these things are high school level. The reason people don't ever make them is that hardly any biology nerds are evil.

There's no way to stop people who wanted to. We're talking about truly high-school-level stuff, both the conceptual ideas and the practical execution. Stuff involving viruses is obviously university level, though.

komali2 7 hours ago | parent | prev | next [-]

But I want to use AI to generate highly effective, targeted propaganda to convert you and your family into communists. (See: Cambridge Analytica.) I'll do so by leveraging automation and agents to flood every feed you and your family view with tailored disinformation, so it's impossible to know how many of your ruling class are actually pedophiles and how many are just propagandized as such. Hell, I might even try to convince you that a nuke had been dropped in Ohio. (See: "Fall, or Dodge in Hell" by Neal Stephenson.)

I guess you're making an "if everyone had guns" argument?

charcircuit 6 hours ago | parent | next [-]

And then social media feeds will ban you from using their AI. Also, my family's and my AI will filter your posts so we don't see them.

>I guess you're making an "if everyone had guns" argument?

Sure why not.

estearum 5 hours ago | parent [-]

It's a mistake to assume that all or most technologies actually reach stable equilibrium when they're pitted against each other.

AussieWog93 4 hours ago | parent | prev [-]

The thing is, though, current AI safety checks fail to stop actually harmful things while hyperfixating on anything that could be seen as politically incorrect.

First two prompts I chucked in to make a point: https://chatgpt.com/share/69900757-7b78-8007-9e7e-5c163a21a6... https://chatgpt.com/share/69900777-1e78-8007-81af-c6dc5632df...

It was totally fine making fake news articles about Bill Clinton's ties to Epstein but drew the line at drawing a cartoon of a black man eating fried chicken and watermelon.

wiseowise 10 hours ago | parent | prev [-]

This. This whole hysteria sounds like: let's prohibit knives because people kill themselves and each other with them!

_DeadFred_ 9 hours ago | parent | next [-]

Isn't the thinking more along the lines of "let's not hand a personal chemical-weapons expert and bioengineer to homicidal people"?

tjwebbnorfolk 5 hours ago | parent [-]

These already exist. They are called textbooks, and anyone can check them out in any library.

There was a time when a group of zealots made the same argument about libraries themselves.

wolvoleo 3 hours ago | parent [-]

Ease of access matters. To read those textbooks you basically have to be a chemist already and know where to look, which books to check, and so on. An AI model can just tell you step by step, and even make a nice overview of which chemical will have the most effect.

I'd compare it to guns. You can't just buy a gun at the corner store in most of Europe. That doesn't mean they're impossible to get, and people could even make their own if they put in enough effort. But gun violence is way lower than in the US anyway, because most people just don't go that far. They don't have that kind of drive or determination.

Making a fleeting brain fart into an instantly actionable recipe is probably not a great idea with some topics.

AnimalMuppet 9 hours ago | parent | prev [-]

Is it prohibiting knives? Or weapons grade plutonium?

tjwebbnorfolk 5 hours ago | parent [-]

Neither. It's information. If you find information dangerous, you might just be an authoritarian.