delecti | 4 days ago
LLMs would be just as terrible at that use case as at any other kind of therapy. They don't have logic and can't tell a logical thought from an illogical one. They tend to be overly agreeable, so they might just reinforce existing negative thoughts. You would still need a therapist to set you on the right track for independent work, and an LLM has huge disadvantages compared to the current state of the art: a paper worksheet that you fill out with a pen.
tejohnso | 4 days ago | parent
They don't "have" logic just like they don't "have" charisma? I'm not sure what you mean. LLMs can simulate having both. ChatGPT can tell me that my assertion is a non sequitur - my conclusion doesn't logically follow from the premise. | ||||||||||||||||||||||||||||||||||||||||||||||||||