perlgeek 4 days ago

Just using an LLM as is for therapy, maybe with an extra prompt, is a terrible idea.

On the other hand, I could imagine some narrower uses where an LLM could help.

For example, in Cognitive Behavioral Therapy, there are methods that are pretty prescriptive, like identifying cognitive distortions in negative thoughts. It's not hard to imagine an app where you enter a negative thought and practice finding distortions in it yourself, and a specifically trained LLM helps you find more distortions, or offers clearer/more convincing versions of the thoughts that you entered.
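
Roughly, I'm picturing something like this (a toy sketch against the OpenAI chat API; the model name, distortion list, and prompt are placeholders I made up, not a vetted therapeutic design):

    # Hypothetical sketch: ask a model to label cognitive distortions in a
    # user-entered thought, restricted to a fixed list. Placeholders throughout.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    DISTORTIONS = [
        "all-or-nothing thinking", "overgeneralization", "catastrophizing",
        "mind reading", "personalization", "should statements",
    ]

    def find_distortions(thought: str) -> str:
        prompt = (
            "The user is doing a CBT exercise, not asking for advice. "
            f"From this fixed list only ({', '.join(DISTORTIONS)}), name any "
            "cognitive distortions present in the thought below and quote the "
            "words that show each one. Do not diagnose or give advice.\n\n"
            f"Thought: {thought}"
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model choice
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    print(find_distortions("I messed up one slide, so the whole talk was a disaster."))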

I don't have a WaPo subscription, so I cannot tell which of these two very different things has been banned.

delecti 4 days ago | parent | next [-]

LLMs would be just as terrible at that use case as at any other kind of therapy. They don't have logic, and can't distinguish a logical thought from an illogical one. They tend to be overly agreeable, so they might just reinforce existing negative thoughts.

It would still need a therapist to set you on the right track for independent work, and it has huge disadvantages compared to the current state of the art: a paper worksheet that you fill out with a pen.

tejohnso 4 days ago | parent [-]

They don't "have" logic just like they don't "have" charisma? I'm not sure what you mean. LLMs can simulate having both. ChatGPT can tell me that my assertion is a non sequitur - my conclusion doesn't logically follow from the premise.

ceejayoz 4 days ago | parent | prev [-]

Psychopaths can simulate empathy, but lack it.

AlecSchueler 4 days ago | parent [-]

Psychopaths also tend to eat lunch, but what's your point?

ceejayoz 4 days ago | parent [-]

The point is simulating something isn't the same as having something.

AlecSchueler 4 days ago | parent [-]

Well yes, that's a tautology. But is a simulation demonstrably less effective?

ceejayoz 4 days ago | parent [-]

> But is a simulation demonstrably less effective?

Yes?

If you go looking to psychopaths and LLMs for empathy, you're touching a hot stove. At some point, you're going to get burned.

wizzwizz4 4 days ago | parent | prev [-]

> and a specifically trained LLM

Expert system. You want an expert system. For example: a database mapping "what patients write" to "what patients need to hear", a fuzzy search tool with a properly chosen threshold, and a conversational interface that repeats the matched entry back to you, paraphrased, and, if you say "yes", provides the advice (rough sketch below).

We've had the tech to do this for years. Maybe nobody had the idea, maybe they tried it and it didn't work, but training an LLM to even approach competence at this task would be way more effort than just making an expert system, and wouldn't work as well.
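
Something like this, minus the actual clinical content (a toy sketch using Python's difflib for the fuzzy match; the entries, threshold, and wording are made-up placeholders):

    # Minimal expert-system sketch: a hand-written mapping from common patient
    # statements to vetted responses, fuzzy-matched with a threshold.
    import difflib

    RESPONSES = {
        "i always ruin everything": "One mistake doesn't erase everything that went well.",
        "nobody wants me around": "Feeling unwanted isn't proof that you are.",
        "i will never get better": "Recovery isn't linear; a bad week isn't the whole story.",
    }

    THRESHOLD = 0.6  # placeholder; would need careful tuning in practice

    def respond(user_text: str) -> str | None:
        matches = difflib.get_close_matches(
            user_text.lower(), RESPONSES.keys(), n=1, cutoff=THRESHOLD
        )
        if not matches:
            return None  # no confident match: say nothing rather than guess
        match = matches[0]
        # Conversational step: repeat the match target back, advise only on "yes".
        if input(f'Did you mean something like "{match}"? (yes/no) ').strip() == "yes":
            return RESPONSES[match]
        return None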