borski 3 hours ago

I never said therapists were only for those in crisis; that is a misreading of my argument entirely.

An LLM cannot parse the complexity of your situation. Period. It is literally incapable of doing that, because it does not have any idea what it is like to be human.

Therapy is not an objective science; it is, in many ways, subjective, and the therapeutic relationship is by far the most important part.

I am not saying LLMs are not useful for helping people parse their emotions or understand themselves better. But that is not therapy, in the same way that using an app built for CBT is not, in and of itself, therapy. It is one tool in a therapist’s toolbox, and will not be the right tool for all patients.

That doesn’t mean it isn’t helpful.

But an LLM is not a therapist. The fact that you can trivially convince it of things that are absolutely untrue is one simple example of why not.

vanviegen 38 minutes ago

As you said earlier, therapists are (thoroughly) trained in how best to handle situations. Just 'being human' (and thus empathizing) may not be as big a part of the job as you seem to believe.

Training LLMs we can do.

Though it might be important for the patient to believe that the therapist is empathizing, which may give AI therapy an inherent disadvantage (depending on the patient's view of AI).