mensetmanusman 4 days ago:
What if it works a third as well as a therapist but is 20 times cheaper? What word should we use for that?
inetknght 4 days ago:
> What if it works a third as well as a therapist but is 20 times cheaper?

When there are studies that show it, perhaps we can have that conversation. Until then, I'd call it "wrong".

Moreover, there's a lot more that needs to be asked before you can ask for a one-word summary that disregards all nuance:

- Can the patient use the AI therapist on their own devices, without any business looking at the data, and without a network connection? Keep in mind that many patients won't have access to the internet.

- Is the data collected by the AI therapist usable in court? Keep in mind that therapists often must disclose to the patient what sort of information would be usable, and what information the therapist themselves must report. Also keep in mind that AIs have, thus far, been generally unable to competently avoid giving dangerous or deadly advice.

- Will the AI therapist know when to suggest the patient talk to a human therapist? Therapists can have conflicts of interest (among other problems) or be unable to help the patient, and can tell the patient to find a new therapist and/or refer the patient to a specific therapist.

- Does the AI therapist refer people to business-preferred therapists? Imagine an insurance company providing an AI therapist that only recommends in-network therapists instead of considering any licensed therapist (regardless of insurance network) appropriate for the kind of therapy; that would be a blatant conflict of interest.

That's just off the top of my head, but there are no doubt plenty of other, even bigger, issues to consider for AI therapy.
zaptheimpaler 4 days ago:
This is the key question IMO, and one good answer is in this recent video about a case of ChatGPT helping someone poison themselves [1]. A trained therapist will probably not tell a patient to take "a small hit of meth to get through this week". A doctor may be unhelpful or wrong, but they will not instruct you to replace salt with NaBr and poison yourself. "A third as well as a therapist" might be true on average, but the suitability of this thing cannot be reduced to averages. Trained humans don't make insane mistakes like that, and they know when they are out of their depth and need to consult someone else.
_se 4 days ago:
"A really fucking bad idea"? It's not one word, but it is the most apt description.
6gvONxR4sf7o 4 days ago:
Something like this is only really worth approaching if there's an analog to losing your license. If a therapist screws up badly enough once, I'm assuming they can lose their license for good. If people want to replace them with AI, then screwing up badly enough should similarly lose that AI the ability to practice for good. I can already imagine the companies behind these things saying "no, we've learned, we won't do it again, please give us our license back" just like a human would. But I can't imagine companies going for that. Everyone seems to want to scale the profits but not accept the consequences of the scaled risks, and increased risk is basically what working a third as well amounts to.
pawelmurias 4 days ago:
You could talk to a stone for even cheaper, with way better effects.
knuppar 4 days ago:
Generally speaking, and glossing over country-specific rules, all generally available health treatments have to demonstrate they won't cause catastrophic harm. This is a harness we simply can't put around LLMs today.
fzeroracer 4 days ago:
If it works 33% of the time for people and then drives people to psychosis the other 67% of the time, what word would you use for that?
BurningFrog 4 days ago:
Last I heard, most therapy doesn't work that well.
thrown-0825 4 days ago:
Just self-diagnose on TikTok, it's 100x cheaper.
randall 4 days ago:
i've had a huge amount of trauma in my life and i find myself using chat gpt as kind of a cheater coach thing where i know i'm feeling a certain way, i know it's irrational, and i don't really need to reflect on why it's happening or how i can fix it, and i think for that it's perfect. a lot of people use therapists as sounding boards, which actually isn't the best use of therapy imo.
Denatonium 4 days ago:
Whiskey
filoeleven 3 days ago:
Big if.
pengaru 4 days ago:
[flagged]
pessimizer 4 days ago:
Based on the Dodo Bird Conjecture [*], I don't even think there's a reason to believe that AI would do any worse than human therapists. It might even be better, because the distressed person might hold less back from a soulless machine than they would from a flesh and blood person. Not that this is rational, because everything they tell an AI therapist can be logged, saved forever, and combed through.

I think that ultimately the word we should use for this is "lobbying." If AI can't be considered therapy, that means that a bunch of therapists, no more effective than Sunday school teachers and working from extremely dubious frameworks [**], will not have to compete with it for insurance dollars or government cash. Since that cash is a fixed demand (or really a falling one), the result is that far fewer people will get any mental illness treatment at all.

In Chicago, virtually all of the city mental health services were closed by Rahm Emanuel. I watched a man move into the doorway of an abandoned building across from the local mental health center within weeks after it had been closed down and leased to a "tech incubator." I wondered if he had been a patient there. Eventually, after a few months, he was gone.

So if I could ask this question again, I'd ask: "What if it works 80%-120% as well as a therapist but is 100 or 1000 times cheaper?" My tentative answer would be that it would be suppressed by lobbyists employed by some private equity rollup that has already, or will soon have, turned 80% of therapists into even lower-paid gig workers.

The place you would expect this to happen first is Illinois, because it is famously one of the most corruptly governed states in the country. [***] Our current governor, absolutely terrible but at the same time the best we've had in a long while, tried to buy Obama's Senate seat from a former Illinois governor turned goofy national cultural figure and Trump ass-kisser in a ploy to stay out of prison (which ultimately delivered).

You can probably listen to the recordings now, unless they've been suppressed. I had a recording somewhere years ago, because I worked in a state agency under Blagojevich and followed everything in realtime (including pulling his name off of the state websites I managed the moment he was impeached; we were all gathered around the television in a conference room).

edit: I feel like I have to add that this comment was written by me, not AI. Maybe I'm flattering myself to think anybody would make the mistake.

-----

[*] Westra, H. A. (2022). The implications of the Dodo bird verdict for training in psychotherapy: prioritizing process observation. Psychotherapy Research, 33(4), 527–529. https://doi.org/10.1080/10503307.2022.2141588

[**] At least Freud is almost completely dead, although his legacy blackens world culture.

[***] Probably the horrific next step is that the rollup lays off all the therapists and replaces them with an AI it owns, after lobbying against the thing that it previously lobbied for. Maybe they sell themselves to OpenAI or Anthropic or whoever, and let them handle that phase.