spwa4 12 hours ago

> The first call that shifted his thinking was with a 67-year-old woman living out of her car, managing PTSD and congestive heart failure. She spoke with Flora for over an hour. "It was both incredible and depressing," Batlivala told me. "Flora was probably the only 'person' she'd talked to in weeks about her situation." Now, hourlong conversations with Flora are routine. "That's the companionship piece," he said. "And it turns out that is truly an intervention."

People don't seem to realize both that this is coming and that before long people will be defending AI "persons" for exactly this reason (OpenAI is already complaining about users doing this). Nobody's going to deliver this level of care using humans. It's not going to happen.

A lot of people needing care are deeply isolated, and many of them will feel that AI changes that.

vjvjvjvjghv 8 hours ago | parent | next [-]

I feel the same about caretaking. Having an AI talk to people with dementia will be a godsend for families. Before he died, my dad had the same thought every 5 minutes and it slowly drove my mom crazy. A super patient AI would have helped a lot and freed up the rest of the family for other tasks.

One step further would be robots that take people to the bathroom, clean them and other stuff. Having this done by humans is either extremely expensive or it will not be done properly.

Some people are horrified by the loss of human touch but for most old people human touch is a luxury they can't afford.

saltcured 7 hours ago | parent [-]

I don't think it will be helpful when it is slopped together and doesn't have a real mental model to keep the dementia patient on a healthy track.

Look at all the "AI psychosis" problems with people going into a conversation loop that amplifies their worst thought patterns. Now consider the same where the person in this loop is already having delusions and other cognitive decline. It seems to me that it could spiral in the wrong direction quite easily.

It's quite difficult for human caretakers to navigate this space too. That is part of why it is so exhausting. You're constantly trying to make judgement calls and implicitly predict the unreliable response of the dementia sufferer.

I think there is a large uncanny valley between having some facsimile of human interaction in a short session and having some kind of trustworthy caretaker that can consistently respond in a way that promotes health and safety. I think it involves a lot of subjunctive interpretation and reasoning to navigate all the mixed up layers of fact, fantasy, and simply aphasic expression that come from dementia.

esseph 9 hours ago | parent | prev | next [-]

Every psychologist and therapist I have talked to about using LLMs in place of personal interaction (just discussion about this topic) has said roughly the same thing:

Any attempt to use LLMs as a substitute for personal interaction is playing an incredibly dangerous game that will probably make them a lot of money, while hurting a lot of people.

spwa4 8 hours ago | parent [-]

You might want to reread who the patient was. Because human care at that level is obviously not going to happen, no matter how bad the AI is ...

Oh, and taking sycophancy out of a model is easy. Just fine-tune it so it doesn't (have to) agree with everything. Plus every new model has less of it, or at least masks it better.

polynomial 7 hours ago | parent | prev [-]

A 67 year old woman living out of her car? JFC.