csense 4 days ago

Consider the following:

- A therapist may disregard professional ethics and gossip about you

- A therapist may get you involuntarily committed

- A therapist may be forced to disclose the contents of therapy sessions by court order

- Certain diagnoses may destroy your life / career (e.g. airline pilots aren't allowed to fly if they have certain mental illnesses)

Some individuals might choose to say "Thanks, but no thanks" to therapy after considering these risks.

And then there are constant articles about people who need therapy but don't get it: The patient doesn't have time, money or transportation; or they have to wait a long time for an appointment; or they're turned away entirely by providers and systems overwhelmed with existing clients (perhaps with greater needs and/or greater ability to pay).

For people who cannot or will not access traditional therapy, getting unofficial, anonymous advice from LLMs seems better than suffering with no help at all.

(Question for those in the know: Can you get therapy anonymously? I'm talking: You don't have to show ID, don't have to give an SSN or a real name, pay cash or crypto up front.)

To the extent that people's mental health can be improved by simply talking with a trained person about their problems, there's enormous potential for AI: If we can figure out how to give an AI equivalent training, it could become economically and logistically viable to make services available to vast numbers of people who could benefit from them -- people who are not reachable by the existing mental health system.

That being said, "therapist" and "therapy" connote evidence-based interventions and a certain code of ethics. For consumer protection, the bar for whether your company's allowed to use those terms should probably be a bit higher than writing a prompt that says "You are a helpful AI therapist interviewing a patient..." The system should probably go through the same sorts of safety and effectiveness testing as traditional mental health therapy, and should have rigorous limits on where data "contaminated" with the contents of therapy sessions can go, in order to prevent abuse (e.g. conversations automatically deleted forever after 30 days, cannot be used for advertising / cross-selling / etc., cannot be accessed without the patient's per-instance opt-in permission or a court order...)

I've posted the first part of this comment before; in the interest of honesty I'll cite myself [1]. Apologies to the mods if this mild self-plagiarism is against the rules.

[1] https://news.ycombinator.com/item?id=44484207#44505789

skeezyboy 4 days ago

AI just summarizes text; it's not like speaking to a person.