| ▲ | lukev 4 days ago |
| Good. It's difficult to imagine a worse use case for LLMs. |
|
| ▲ | dmix 4 days ago | parent | next [-] |
| Most therapists barely say anything by design; they just know when to ask questions or lead you somewhere. So having one that talks back at every statement doesn't fit the method. It's more like a "friend you dump on" simulator.
| |
| ▲ | 999900000999 4 days ago | parent [-] | | [flagged] | | |
| ▲ | dannersy 4 days ago | parent | next [-] | | So, therapy is useless (as a concept) because America's healthcare system is dogshit? That statement doesn't make any sense. Water is still necessary for the body whether I can acquire it or not. Therapy, or any healthcare in general, is still useful whether or not you can afford it. | |
| ▲ | thrown-0825 4 days ago | parent | prev [-] | | Most people shop around for therapists that align with their values anyways. They are really paying $800 / month to have their feelings validated and receive a diagnosis that absolves them from taking ownership over their emotions and actions. | | |
| ▲ | AlecSchueler 4 days ago | parent | next [-] | | Source for this? Either way it's still demonstrably the most effective treatment for many issues. Sometimes being heard is good enough. | | |
| ▲ | 999900000999 4 days ago | parent | next [-] | | Yeah, but the number one stressor for the vast majority of people is money in one way or another. If you have a spare $9,600 a year to be heard, you're doing very well. Remember, we're talking about a country where people skip insulin. Back during my second eviction I had a friend listen to me whine on the phone for hours. That's a debt I can never repay; I definitely didn't have health insurance or a spare $800 a month back then. Or to flip this around, $800 a month would be a fantastic treatment for most stressed-out lower-income people. I really hate how therapy is promoted as some kind of miracle, when: A) It's completely inaccessible to those who need it most. B) It can actually make things significantly worse. C) You probably just need to do less of whatever you're doing. But if you slow down you might get fired, and if you get fired you won't be able to afford $800 a month! | | |
| ▲ | AlecSchueler 4 days ago | parent [-] | | > Remember, we're talking about a country where people skip insulin. Ah, I'm in the Netherlands. I didn't realise we were only talking about the US. I know the story is about Illinois but I thought the critique of therapy was intended to be broader. It goes without saying that basic necessities like food and housing come first for health, mental and otherwise, I'm sorry that they're so uncertain in America. | | |
| ▲ | 999900000999 4 days ago | parent [-] | | Oh no, I'm only speaking about the American context. I guess you're in a magical utopia where people don't skip essential medicines because they can't come up with the co-pay even when they have insurance. Or where the health insurance company won't outright refuse to cover what your doctor prescribes, so you need to materialize a spare $1,000. Too sick to work? Time to cut off your Medicaid because you failed the work requirements. Even if you find a therapist that works, they can move out of your insurance network, or you switch to a new job that offers different insurance your therapist can't accept. I know during my second eviction I didn't have $800 a month. So what use is it? Do only upper-middle-class people have problems worthy of consideration? | | |
| ▲ | AlecSchueler 4 days ago | parent [-] | | Your response feels quite snarky but I understand you're speaking from a place of emotion after your own difficult experiences. Here in The Netherlands people with less money can get access to therapy with assistance from the state. I've had to do it myself and it cost me around 300 euros per year to see 3 different providers for 3 kinds of therapy; the rest of the costs were covered by the state. I wouldn't call it a magical utopia as it works via a system of mutual social support, not magic, but it does seem relatively utopic in comparison to what you describe. | | |
| ▲ | 999900000999 4 days ago | parent [-] | | It's OK. My first reply was flagged for pointing out that an unaffordable treatment has no real use. I forgot this is a forum where $800 a month is a trivial amount of money. No snark intended. I've been to Europe a few times; as far as I'm concerned The Netherlands, Belgium, and the UK are essentially utopias. Not having to play the health insurance game, significantly lower crime rates, actual worker rights. No place is perfect, but try being poor in America. Nothing is closer to hell.
|
|
|
| |
| ▲ | thrown-0825 4 days ago | parent | prev [-] | | https://www.researchgate.net/publication/257958560_A_Model_f... It suggests that shared values may predict more positive outcomes, and that therapists should develop ethical sensitivity regarding value conflict. Many patients are encouraged to shop around for therapists and typically wind up with someone they are comfortable with whose value system aligns with theirs. AKA a private echo chamber financially incentivized to cultivate recurring revenue via emotionally dependent patients.
| |
| ▲ | squishington 4 days ago | parent | prev [-] | | Please don't spread misinformation like this. It can stop people from seeking professional help. When people seek therapy, they are taking ownership over their emotions and actions, because they want to change their internal state in a healthy way (as opposed to escaping negative feelings with substance abuse, for example). Earlier this year I would suffer flight responses in public due to the effects of PTSD. I was able to significantly mitigate this (nearly gone) by seeing a therapist who practises EMDR. And sometimes people do need their feelings validated, which is an important part of healing from abuse. It's about rebuilding trust. |
|
|
|
|
| ▲ | hinkley 4 days ago | parent | prev | next [-] |
| Especially given the other conversation that happened this morning. The more you tell an AI not to obsess about a thing, the more it obsesses about it. So trying to make a model that will never tell people to self-harm is futile. Though maybe we're just doing it wrong, and the self-filtering should be external filtering: one model to censor results that don't fit, and one to generate results with lighter self-censorship.
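| Roughly the shape I mean, as a toy sketch: the generator and the keyword "censor" below are made-up stand-ins for two separate models, not any vendor's actual API.

      # Toy two-stage pipeline: generate with light self-censorship, then let a
      # separate filter veto. Both functions are illustrative stand-ins.

      def generate_reply(prompt: str) -> str:
          # Stand-in for a generator model with minimal built-in filtering.
          return f"A frank answer to: {prompt}"

      def flagged_by_censor(text: str) -> bool:
          # Stand-in for a second, independent moderation model.
          blocked = ("self-harm", "suicide")
          return any(term in text.lower() for term in blocked)

      def respond(prompt: str, max_tries: int = 3) -> str:
          for _ in range(max_tries):
              draft = generate_reply(prompt)
              if not flagged_by_censor(draft):
                  return draft
          return "I can't help with that, but a crisis line or a person can."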
|
| ▲ | tim333 4 days ago | parent | prev | next [-] |
| There's an advantage to something like an LLM in that you can be more scientific as to whether it's effective or not, and if one gets good results you can reproduce the model. With humans there's too much variability to tell very much. |
|
| ▲ | create-username 4 days ago | parent | prev | next [-] |
| Yes, there is: AI-assisted homemade neurosurgery.
| |
| ▲ | kirubakaran 4 days ago | parent | next [-] | | If Travis Kalanick can do vibe research at the bleeding edge of quantum physics[1], I don't see why one can't do vibe brain surgery. It isn't really rocket science, is it? [2] [1] https://futurism.com/former-ceo-uber-ai [2] If you need /s here to be sure, perhaps it's time for some introspection | |
| ▲ | Tetraslam 4 days ago | parent | prev [-] | | :( but what if i wanna fine-tune my brain weights | | |
|
|
| ▲ | waynesonfire 4 days ago | parent | prev | next [-] |
| You're ignorant. Why wait until a person is so broken they need clinical therapy? Sometimes just an ear or an opportunity to write is sufficient. LLMs are to therapy what vaping is to quitting nicotine--extremely helpful to 80+% of people. Confession in a church setting I'd consider similar to talking to an LLM. Are you anti-that too? We're talking about people who just need a tool to help them process what is going on in their life at some basic level, nothing more than acknowledging their experience. And frankly, it's not even clear to me that a human therapist is any better. Yeah, maybe the guard-rails are in place, but I'm not convinced that crossing them would result in serious societal consequences. Let people explore their minds and experience--at the end of the day, I suspect they'd be healthier for it.
| |
| ▲ | mattgreenrocks 4 days ago | parent | next [-] | | > And frankly, it's not even clear to me that a human therapist is any better. A big point of therapy is helping the patient better ascertain reality and deal with it. Hopefully, the patient learns how to reckon with their mind better and deceive themselves less. But this requires an entity that actually exists in the world and can bear witness. LLMs, frankly, don’t deal with reality. I’ll concede that LLMs can give people what they think therapy is about: lying on a couch unpacking what’s in their head. But this is not at all the same as actual therapeutic modalities. That requires another person that knows what they’re doing and can act as an outside observer with an interest in bettering the patient. | |
| ▲ | jrflowers 4 days ago | parent | prev [-] | | > Sometimes just an ear or an opportunity to write is sufficient. People were able to write about their feelings and experiences before the invention of a chatbot that tells you everything you wrote is true. You could do that in Notepad or on a piece of paper, and it was free.
|
|
| ▲ | erikig 4 days ago | parent | prev | next [-] |
| AI ≠ LLMs |
| |
| ▲ | lukev 4 days ago | parent [-] | | What other form of "AI" would be remotely capable of even emulating therapy, at this juncture? | | |
| ▲ | mrbungie 4 days ago | parent [-] | | I promise you that by next year AI will be there, just believe me bro. /s. |
|
|
|
| ▲ | jacobsenscott 4 days ago | parent | prev | next [-] |
| It's already happening, a lot. I don't think anyone is claiming an LLM is a therapist, but people use ChatGPT for therapy every day. As far as I know no LLM company is taking any steps to prevent this - but they could, and should be forced to. It must be a goldmine of personal information. I can't imagine some therapists, especially remote-only ones, aren't already just acting as a human interface to ChatGPT as well.
| |
| ▲ | thinkingtoilet 4 days ago | parent | next [-] | | Lots of people are claiming LLMs are therapists. People are claiming LLMs are lawyers, doctors, developers, etc... The main problem is, as usual, influencers need something new to create their next "OMG AI JUST BROKE X INDUSTRY" video and people eat that shit up for breakfast, lunch, and dinner. I have spoken to people who think they are having very deep conversations with LLMs. The CEO of my company, an otherwise intelligent person, has gone all in on the AI hype train and is now saying things like we don't need lawyers because AI knows more than a lawyer. It's all very sad and many of the people who know better are actively taking advantage of the people who don't. | |
| ▲ | dingnuts 4 days ago | parent | prev | next [-] | | > I can't imagine some therapists, especially remote only, aren't already just acting as a human interface to chatgtp as well. Are you joking? Any medical professional caught doing this should lose their license. I would be incensed if I was a patient in this situation, and would litigate. What you're describing is literal malpractice. | | |
| ▲ | xboxnolifes 4 days ago | parent | next [-] | | Software engineers are so accustomed to the idea that skirting your professional responsibilities ends with a slap on the wrist, not with losing your ability to practice your profession entirely. | |
| ▲ | jacobsenscott 4 days ago | parent | next [-] | | You overestimate the vigilance of other professions. A doctor or lawyer needs to screw up really badly, many times, before anything happens. They all go to the same country clubs. |
| ▲ | dazed_confused 4 days ago | parent | prev [-] | | Yeah in other professions negligence can lead to jail... |
| |
| ▲ | tim333 4 days ago | parent | prev | next [-] | | In many places talk therapy isn't really considered a medical profession. Where I am "Counseling and psychotherapy are not protected titles in the United Kingdom" which kind of means anyone can do it as long as you don't make false claims about qualifications. | |
| ▲ | lupire 4 days ago | parent | prev | next [-] | | The only part that looks like malpractice is sharing patient info in a non-HIPAA-compliant way. Using an assistive tool for advice is not malpractice. The licensed professional is simply accountable for their curation choices. |
| ▲ | jacobsenscott 4 days ago | parent | prev [-] | | I'm not joking. Malpractice happens all the time. You being incensed is not the deterrent you think it is. |
| |
| ▲ | larodi 4 days ago | parent | prev [-] | | Of course they do, and everyone does; it's just like in this song https://www.youtube.com/watch?v=u1xrNaTO1bI and given that the price of proper therapy is skyrocketing.
|
|
| ▲ | perlgeek 4 days ago | parent | prev [-] |
| Just using an LLM as-is for therapy, maybe with an extra prompt, is a terrible idea. On the other hand, I could imagine some narrower uses where an LLM could help. For example, in Cognitive Behavioral Therapy there are methods that are pretty prescriptive, like identifying cognitive distortions in negative thoughts. It's not too hard to imagine an app where you enter a negative thought on your own and practice finding distortions in it, and a specifically trained LLM helps you find more distortions, or offers clearer/more convincing versions of thoughts that you entered yourself. I don't have a WaPo subscription, so I cannot tell which of these two very different things has been banned.
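| A rough sketch of the narrow shape I'm imagining: the distortion list is standard CBT vocabulary, but the ask_llm helper and the prompt are made up for illustration, not an existing product or API.

      # The user does the exercise first; the model is only asked to point out
      # distortions they missed, from a fixed list. ask_llm is a placeholder.

      DISTORTIONS = [
          "all-or-nothing thinking", "overgeneralization", "catastrophizing",
          "mind reading", "labeling", "should statements",
      ]

      def ask_llm(prompt: str) -> str:
          raise NotImplementedError("call the specifically trained model here")

      def distortion_exercise(thought: str, user_found: list[str]) -> str:
          prompt = (
              f"Negative thought: {thought}\n"
              f"Distortions the user already found: {', '.join(user_found) or 'none'}\n"
              f"From this list only ({', '.join(DISTORTIONS)}), name any they missed, "
              "then suggest one more balanced rewording. Do not diagnose."
          )
          return ask_llm(prompt)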
| |
| ▲ | delecti 4 days ago | parent | next [-] | | LLMs would be just as terrible at that use case as at any other kind of therapy. They don't have logic, and can't distinguish a logical thought from an illogical one. They tend to be overly agreeable, so they might just reinforce existing negative thoughts. It would still need a therapist to set you on the right track for independent work, and it has huge disadvantages compared to the current state of the art: a paper worksheet that you fill out with a pen. | |
| ▲ | tejohnso 4 days ago | parent [-] | | They don't "have" logic just like they don't "have" charisma? I'm not sure what you mean. LLMs can simulate having both. ChatGPT can tell me that my assertion is a non sequitur - my conclusion doesn't logically follow from the premise. | | |
| ▲ | 4 days ago | parent | next [-] | | [deleted] | |
| ▲ | ceejayoz 4 days ago | parent | prev [-] | | Psychopaths can simulate empathy, but lack it. | | |
| ▲ | AlecSchueler 4 days ago | parent [-] | | Psychopaths also tend to eat lunch, but what's your point? | | |
| ▲ | ceejayoz 4 days ago | parent [-] | | The point is simulating something isn't the same as having something. | | |
| ▲ | AlecSchueler 4 days ago | parent [-] | | Well yes, that's a tautology. But is a simulation demonstrably less effective? | | |
| ▲ | ceejayoz 4 days ago | parent [-] | | > But is a simulation demonstrably less effective? Yes? If you go looking to psychopaths and LLMs for empathy, you're touching a hot stove. At some point, you're going to get burned. |
|
|
|
|
|
| |
| ▲ | wizzwizz4 4 days ago | parent | prev [-] | | > and a specifically trained LLM Expert system. You want an expert system. For example, a database mapping "what patients write" to "what patients need to hear", a fuzzy search tool with properly-chosen thresholding, and a conversational interface (repeats back to you, paraphrased – i.e., the match target –, and if you say "yes", provides the advice). We've had the tech to do this for years. Maybe nobody had the idea, maybe they tried it and it didn't work, but training an LLM to even approach competence at this task would be way more effort than just making an expert system, and wouldn't work as well. |
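| Something like this, say: stdlib fuzzy matching against a hand-written table, where the entries and the 0.6 cutoff are placeholders rather than clinical content. The point is that all the "intelligence" lives in the curated database, not in a model.

      # Fuzzy-match what the user writes against a curated database, repeat the
      # match back for confirmation, and only then give the stored advice.
      from difflib import SequenceMatcher

      DATABASE = {
          "i feel like nothing i do is ever good enough":
              "Write down one thing that went well today, however small.",
          "i can't stop worrying about things i can't control":
              "Note the worry and set a fixed time later to revisit it.",
      }
      THRESHOLD = 0.6  # the "properly-chosen" part is the hard bit

      def best_match(text: str):
          score, key = max(
              (SequenceMatcher(None, text.lower(), k).ratio(), k) for k in DATABASE
          )
          return (key, DATABASE[key]) if score >= THRESHOLD else (None, None)

      def converse() -> None:
          key, advice = best_match(input("You: "))
          if key is None:
              print("I don't have anything for that; a person would help more here.")
          elif input(f"It sounds like: '{key}' - right? ").lower().startswith("y"):
              print(advice)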
|