perlgeek 4 days ago
Just using an LLM as-is for therapy, maybe with an extra prompt, is a terrible idea. On the other hand, I could imagine some narrower uses where an LLM could help. For example, Cognitive Behavioral Therapy includes methods that are pretty prescriptive, like identifying cognitive distortions in negative thoughts. It's not too hard to imagine an app where you enter a negative thought on your own and practice finding distortions in it, and a specifically trained LLM helps you find more distortions, or offers clearer/more convincing versions of the thoughts you entered yourself. I don't have a WaPo subscription, so I can't tell which of these two very different things has been banned.
delecti 4 days ago
LLMs would be just as terrible at that use case as at any other kind of therapy. They don't have logic, and can't distinguish a logical thought from an illogical one. They tend to be overly agreeable, so they might just reinforce existing negative thoughts. It would still take a therapist to set you on the right track for independent work, and it would have huge disadvantages compared to the current state of the art: a paper worksheet that you fill out with a pen.
wizzwizz4 4 days ago
> and a specifically trained LLM

Expert system. You want an expert system. For example: a database mapping "what patients write" to "what patients need to hear", a fuzzy search tool with properly chosen thresholding, and a conversational interface (it repeats your input back to you, paraphrased – i.e., the match target – and if you say "yes", provides the advice). We've had the tech to do this for years. Maybe nobody had the idea, maybe someone tried it and it didn't work, but training an LLM to even approach competence at this task would take far more effort than just building an expert system, and it wouldn't work as well.
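A minimal sketch of what that expert system could look like, using Python's stdlib difflib for the fuzzy matching. The example entries, the threshold value, and the `respond` function are all illustrative assumptions, not clinical content:

```python
import difflib

# Hypothetical database mapping "what patients write" to "what patients need to hear".
RESPONSES = {
    "i always mess everything up": "One mistake doesn't define everything you do.",
    "nobody cares about me": "Feeling unseen is painful, but it isn't proof that no one cares.",
    "i will never get better": "Progress is rarely linear; a setback isn't a verdict.",
}

# A properly chosen threshold would be tuned on real data; 0.6 is a placeholder.
THRESHOLD = 0.6

def respond(patient_text: str):
    """Fuzzy-match the input against known patterns; return advice or None."""
    matches = difflib.get_close_matches(
        patient_text.lower(), RESPONSES.keys(), n=1, cutoff=THRESHOLD
    )
    if not matches:
        return None  # below threshold: don't guess, defer to a human
    # A real conversational interface would first echo the match target back
    # ("It sounds like you're saying: ...") and only proceed on a "yes".
    return RESPONSES[matches[0]]
```

The key design point, as the comment argues, is that everything the system can say is authored and reviewed by a person in advance; the only inference happening at runtime is the fuzzy match, and below the threshold the system declines rather than improvising.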