jimkleiber 9 hours ago

> Also on the panel, Father Michael Baggot worried that "artificial intimacy is going to distract us from, and deter us from, the deep interpersonal bonds that are central to our happiness and our flourishing."

> He called for guardrails on AI to stop it capturing individuals' "minds but … also our affections."

> Fr Baggot cited the example of Magisterium AI, a Catholic chatbot. He sits on the scholarly advisory board for the service, and said its creators had worked to prevent it being "anthropomorphic" adding, "We do not want people having an intimate relationship with it."

I appreciate that coinage, "artificial intimacy," and want to explore the implications of it more.

raincole 6 hours ago | parent | next [-]

Religions, especially but not limited to Catholicism, are what scream "artificial intimacy" to me.

From the language the Church uses ("Jesus loves you") to the practices it maintains (confession to a priest).

The most charitable take I can offer is that religious leaders genuinely believe what they claim to believe, so they don't think of it as "artificial." There are plenty of less charitable takes I could make, but I'll stop here.

tempodox 5 hours ago | parent [-]

Don’t worry, some “AI” adepts have already managed to make a religion out of it, so classical religions aren’t required any more. In the end it doesn’t matter that much what you’re using to outsource your thinking and your judgement.

ryandvm 3 hours ago | parent | prev | next [-]

The irony of priests being concerned with people developing emotional attachments to beings that don't consciously exist...

elric 8 hours ago | parent | prev | next [-]

I imagine this goes beyond what most people think of when they think of "intimacy" (sex, relationships) and includes all kinds of emotional closeness and friendships. Maybe it's just my imagination, but I've noticed a decline in people's willingness to engage with other people since the covid pandemic. If we start replacing interpersonal relationships with chatbots, we're headed for dark times.

CGMthrowaway 8 hours ago | parent | next [-]

Intimacy in that sense is a euphemism. It's not the primary meaning of the word: https://www.merriam-webster.com/dictionary/intimacy

ChrisMarshallNY 8 hours ago | parent | prev [-]

One of the by-products of the sycophancy issue is that LLMs are infinitely patient. They’ll listen to your bullshit forever, and won’t call it out, or walk away.

I can certainly see folks getting so used to it, that they then measure all their IRL relationships by that. They could decide that “you’re not my friend,” because you don’t want to listen to them whine endlessly about their ex.

firefax 7 hours ago | parent | next [-]

I think we already get this effect with online discourse in general. People spin up and burn off nyms, and interact online in a manner they could never pull off IRL, where people remember conversations and experiences. That in turn makes them retreat further online, further warping their idea of normal conversation.

lelanthran 8 hours ago | parent | prev [-]

> One of the by-products of the sycophancy issue is that LLMs are infinitely patient. They’ll listen to your bullshit forever, and won’t call it out, or walk away.

So, just like professional therapists then?

mattgreenrocks 7 hours ago | parent | next [-]

> So, just like professional therapists then?

I know other responses are saying the same thing, but it needs underscoring: good therapists won't put up with this forever. They use techniques to guide your mind out of the patterns that keep you trapped. It's slow work with very nonlinear progress. But for those it helps, things can improve.

Eventually you realize you (and perhaps a higher power) freed yourself from your mental bondage. They showed you the path, and walked alongside you, but they weren't the ones making the changes.

ChrisMarshallNY 8 hours ago | parent | prev | next [-]

No. Therapists are supposed to call it out, and interrupt rabbit holes.

From what I’ve seen of LLMs, it’s the opposite.

lelanthran 7 hours ago | parent [-]

In theory, sure. In practice therapists don't, because that patient won't be back.

All therapists give some variation of "your problem is $SOMETHING_POSITIVE".

Never "your problem is that you're too selfish", because those patients don't come back.

It's always "your problem is you're too willing to help" or "you give too much of yourself" or other such BS.

ChrisMarshallNY 7 hours ago | parent [-]

Good therapists don't. I know quite a few of them. They are pretty good at guiding you into seeing what an ass you are, but they make it seem like your own discovery, so the sting isn't as bad.


kibwen 8 hours ago | parent | prev [-]

No, any therapist worth their salt will absolutely call you out for bullshit, even if they try to couch it in gentle terms.

mattgreenrocks 7 hours ago | parent | prev | next [-]

> I appreciate that coinage, "artificial intimacy," and want to explore the implications of it more.

I've been looking for this phrase for years.

It describes the phenomenon perfectly, even accounting for the diminishing of emotional/mental/physical closeness that occurs.

satvikpendem 8 hours ago | parent | prev | next [-]

I already see dystopian ads for friend.com ("someone who listens, responds, and supports you") that are actually for an AI necklace device, and you'll see people marking them up too, given how unnerving it is to call an AI a "friend."

https://old.reddit.com/r/Greenpoint/comments/1nmk49r/dystopi...

joules77 8 hours ago | parent [-]

Well, before, it was the pedo priest with the same dialogue.

So maybe an improvement.

Good friend of the Church, Nietzsche predicted dystopia long ago, but it never plays out the way people think. The chimp troupe is highly unpredictable. One day it props up a Hitler. The next day it kills him.

bluefirebrand 7 hours ago | parent [-]

> So maybe an improvement

Definitely not an improvement to be friends with corporate-owned machines versus being friends with God

ferguess_k 8 hours ago | parent | prev | next [-]

Some people probably don't want those "deep interpersonal bonds", though. I know I don't. I know some people who won't say they don't want them, but who act in every way as if they don't.

Although I don't like the future the AI companies propose, this is the least of my concerns. My only big concern is employment. If AI creates more jobs than it destroys, sure, go ahead, do it now.

toomuchtodo 8 hours ago | parent | next [-]

If the cure for the loneliness epidemic is community, which requires interpersonal bonds, then I suppose it is fine if we allow folks to opt out of community and human connection and use chatbots as they would heroin or meth, maxing out dopamine within their tolerances until death. Free will, self-determination, and all that. But we should also be mindful of the second-order effects of such a policy (a future that is some combination of internet gaming cafes where people occasionally play so long they die, and "Ready Player One").

ferguess_k 8 hours ago | parent | next [-]

Yeah I agree with that, basically genuine choice for all.

BTW, it's just "deep" bonds I don't want; some sort of bond is always good. Not sure how "deep" he meant, though.

wara23arish 8 hours ago | parent | next [-]

may I ask why?

busterarm 8 hours ago | parent | prev [-]

As I get older and see these things play out, I agree less and less. Living a lifestyle like this takes a physical toll on your health, and society pays part of the cost (at the very least, anyone on the same health insurance plan does).

I feel icky saying this but we should make a strong effort as a society to stamp out anti-social behaviors. Addictions are very high on that list.

You might think that you can engage this way without being a burden to others, but you can't.

toomuchtodo 8 hours ago | parent [-]

> I feel icky saying this but we should make a strong effort as a society to stamp out anti-social behaviors. Addictions are very high on that list.

GLP-1s can help stamp out addiction, but people are going to be people. You can provide them support, but you cannot prevent chronic, determined self harm and destruction. I speak from personal experience.

https://recursiveadaptation.com/p/the-growing-scientific-cas...

ToucanLoucan 8 hours ago | parent | prev [-]

I mean, experiencing LLM "intimacy" of any sort is just getting to role-play as a billionaire tech CEO, isn't it? That's why they're so proud of it: as far as they're concerned, they've perfectly reproduced the real people they encounter, breathless sycophants utterly tripping over themselves to tell them how fucking smart they are for whatever banal shit they've farted out most recently, and to tell them every idea they have is God working through them to bestow his gifts on mankind.

And for the same reason: they want their fucking money.

mattgreenrocks 8 hours ago | parent | prev | next [-]

> Some people probably don't want those "deep interpersonal bonds", though. I know I don't. I know some people who won't say they don't want them, but who act in every way as if they don't.

It's not my position to tell someone what to want. But the evolutionary firmware your body runs on is tuned for interpersonal bonds. If you want to go against that, nobody will stop you, but it strikes me as needless suffering in a world that already has a considerable amount.

carefulfungi 8 hours ago | parent | prev | next [-]

Downvoting a commenter for saying they don't want deep interpersonal bonds is itself a sad rejection. ferguess_k - I hope you're well and living the life you want to live.

mattgreenrocks 7 hours ago | parent [-]

Agreed. I may not agree with the post, but I support their ability to post such things.

basisword 8 hours ago | parent | prev [-]

I'm not trying to judge you, but it doesn't seem normal not to want deep bonds with any other humans. The only people I can think of who have no deep bonds, and who largely avoid even forced bonds like family, are very unwell (for various reasons). I think AI relationships would only send these people deeper down a dark path that they would struggle to ever get out of.

bsoles 8 hours ago | parent | prev | next [-]

The Catholic Church should probably not talk about "intimacy" at all, given their track record. As much as I am not a big fan of AI/LLMs, I would love it if AI took the Church's job away.

basisword 8 hours ago | parent | prev [-]

Removing anthropomorphism from LLMs seems like a really great idea with zero downside. Not just because people starting "relationships" with AI is going to harm society; I imagine people are also more willing to trust misinformation from an anthropomorphic AI.

OJFord 8 hours ago | parent | next [-]

Is that even possible while still training on 'things written by humans' (and not expressly for training purposes) though?

wredcoll 8 hours ago | parent | next [-]

It doesn't have to be perfect. A hypothetical law could be phrased something like "not allowed to intentionally influence the user into thinking the LLM is a human", which, sure, is ultimately up to judges, but it gives a clear indication of what to avoid doing intentionally.

basisword 8 hours ago | parent | prev [-]

I feel like you could do it via the system prompt quite easily (but maybe that's my lack of knowledge showing).
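
Something like this, as a minimal sketch (assuming the OpenAI Python SDK; the prompt wording and model name are illustrative, not a tested recipe):

    # Sketch: suppress anthropomorphic behavior with a system prompt.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are a text-generation tool, not a person. Never use "
        "first-person emotional language ('I feel', 'I'm glad you asked'), "
        "never claim experiences, preferences, or a relationship with the "
        "user, and decline requests to role-play as a friend or companion. "
        "Answer plainly and impersonally."
    )

    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": "Do you ever miss talking to me?"},
        ],
    )
    print(response.choices[0].message.content)

Whether that holds up against a determined user is another question; models tend to drift back toward their trained-in persona over long conversations, so a system prompt helps but probably isn't the whole story.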

reaperducer 8 hours ago | parent | prev [-]

> Removing anthropomorphism from LLMs seems like a really great idea with zero downside.

Step 1: Stop giving them human or human-like names.

Claude, Siri, Gemini, etc.

kitd 8 hours ago | parent | next [-]

I swear I'm about to get dumped by my wife for Claude. He gives her all the answers she wants, whereas I only give her the ones she needs.

lelanthran 8 hours ago | parent | prev | next [-]

Yeah. Maybe HAL9000 would be better :-)

ChrisGreenHeur 8 hours ago | parent | prev [-]

Hey T1000, give me a good apple pie recipe, make sure to include pears instead of apples.