jedharris 6 days ago

> it's possible for simple normal trains of thought to latch your brain into a very undesirable state.

This seems very incorrect, or at least drastically underspecified. These trains of thought are "normal" (i.e. common and unremarkable) so why don't they "latch your brain into a very undesirable state" lots of the time?

I don't think Scott or anyone up to speed on modern neuroscience would deny the coupling of mental state and brain chemistry--in fact I think it would be more accurate to say both of them are aspects of the dynamics of the brain.

But this doesn't imply that "simple normal trains of thought" can latch our brain dynamics into bad states -- i.e., in dynamics language, move us into an undesirable attractor. That would require a very problematic fragility in our normal self-regulation of brain dynamics.

AstralStorm 6 days ago | parent [-]

See, the key here is that the AI provides a very enticing social partner.

Think of it as a version of making your drugged friend believe various random things. It works better if you're not a stranger and have an engaging or alarming style.

LLMs are trained to produce pleasant responses tailored to the user in order to maximize positive feedback. (A more general version of engagement.) It stands to reason they would be effective at convincing someone.