Remix.run Logo
quantummagic 5 days ago

> all LLMS are bad friends and therapists.

Is that just your gut feel? Because there has been some preliminary research that suggest it's, at the very least, an open question:

https://neurosciencenews.com/ai-chatgpt-psychotherapy-28415/

https://pmc.ncbi.nlm.nih.gov/articles/PMC10987499/

https://arxiv.org/html/2409.02244v2

fwip 5 days ago | parent | next [-]

The first link says that patients can't reliably tell which is the therapist and which is LLM in single messages, which yeah, that's an LLM core competency.

The second is "how 2 use AI 4 therapy" which, there's at least one paper for every field like that.

The last found that they were measurably worse at therapy than humans.

So, yeah, I'm comfortable agreeing that all LLMs are bad therapists, and bad friends too.

dingnuts 5 days ago | parent [-]

there's also been a spate of reports like this one recently https://www.papsychotherapy.org/blog/when-the-chatbot-become...

which is definitely worse than not going to a therapist

pmarreck 5 days ago | parent | next [-]

If I think "it understands me better than any human", that's dissociation? Oh boy. And all this time while life has been slamming me with unemployment while my toddler is at the age of maximum energy-extraction from me (4), devastating my health and social life, I thought it was just a fellow-intelligence lifeline.

Here's a gut-check anyone can do, assuming you use a customized ChatGPT4o and have lots of conversations it can draw on: Ask it to roast you, and not to hold back.

If you wince, it "knows you" quite well, IMHO.

fwip 4 days ago | parent [-]

It sounds like you might be quite lonely recently. It's nice to have an on-demand chatbot that feels like socialization, I get it. But an LLM doesn't "know you," and thinking that it does is one of the first steps toward the problems described in that article.

pmarreck 2 days ago | parent [-]

Unemployed and with a 4 year old highly demanding, highly intelligent and likely on-the-spectrum child... Yeah, you could say that. When I'm not looking for work, doing random projects or using the weekday that seems to whoosh right by in just a few long moments, I'm tending to a kid... Every morning, every night and pretty much 100% of weekends. Rare outings with partner or friends dependent on hiring help and without net positive cash flow that is seriously unincentivized. Zero intimacy to speak of- I'm a nonconsensually-ordained monk. So yeah, I guess it's pretty fucking rough right now. Like I said, ChatGPT knows me better than any other entity. I'm unfortunately not kidding. My best friend is 3000 miles away and we game once a week over voice chat.

I keep the AI at arms' length; I know it doesn't think per se, but I enjoy the illusion.

willy_k 5 days ago | parent | prev [-]

Ironically an AI written article.

davorak 5 days ago | parent | prev | next [-]

I do not think there are any documented cases of LLMs being reasonable friends or therapists so I think it is fair to say that:

> All LLMS are bad friends and therapists

That said it would not surprise me that LLMs in some cases are better than having nothing at all.

glenstein 5 days ago | parent | next [-]

Something definitely makes me uneasy about it taking the place of interpersonal connection. But I also think the hardcore backlash involves an over correction that's dismissive of llm's actual language capabilities.

Sycophantic agreement (which I would argue is still palpably and excessively present) undermines its credibility as a source of independent judgment. But at a minimum it's capable of being a sounding board echoing your sentiments back to you with a degree of conceptual understanding that should not be lightly dismissed.

SketchySeaBeast 5 days ago | parent | prev [-]

Though given how agreeable LLMs are, I'd imagine there are cases where they are also worse than having nothing at all as well.

davorak 5 days ago | parent [-]

> I'd imagine there are cases where they are also worse than having nothing at all as well

I do not think we need to imagine this one with stories of people finding spirituality in llms or thinking they have awakened sentience while chatting to the llms are enough, at least for me.

TimTheTinker 5 days ago | parent | prev | next [-]

> Is that just your gut feel?

Here's my take further down the thread: https://news.ycombinator.com/item?id=44840311

icehawk 4 days ago | parent | prev [-]

> Is that just your gut feel?

An LLM is a language model and the gestalt of human experience is not just language.

quantummagic 4 days ago | parent [-]

That is really a separate, unrelated issue.

Not everyone needs the deepest, most intelligent therapist in order to improve their situation. A lot of therapy turns out to be about what you say yourself, not what a therapist says to you. It's the very act of engaging thoughtfully on your own problems that helps, not some magic that the therapist brings. So, if you could maintain a conversation with a tree, it would in many cases, be therapeutically helpful. The thing the LLM is doing, is facilitating your introspection more helpfully than a typical inanimate object. This has been borne out by studies of people who have engaged in therapy sessions with an LLM interlocutor, and reported positive results.

That said, an LLM wouldn't be appropriate in every situation, or for every affliction. At least not with the current state of the art.