TimTheTinker 3 hours ago

> It's patently insane to demand that humans alter their behavior to accommodate the foibles of mere machines

Talking to chatbots is like taking a placebo pill for a condition. You know it's just sugar, but it creates a measurable psychosomatic effect nonetheless. Even if you know there's no person on the other end, the conversation still causes you to functionally relate as if there is.

So this isn't "accommodating foibles" with the machine, it's protecting ourselves from an exploit of a human vulnerability: we subconsciously tend to attribute intent, understanding, judgment, emotions, moral agency, etc. to LLMs.

Humans are wired to infer these from conversation alone, and LLMs are unfortunately able to exploit human conversation to leap compellingly over the uncanny valley. LLM engineering could hardly be better designed to do so: training on a vast corpus of real human speech. The uncanny valley exists for a reason: to keep us from inferring agency where none is due.

Bad things happen when we relate to unsafe people as if they are safe... how much more careful should we be about how we relate to machines that imitate human relationality well enough to fool many of us into thinking they are something they're not. Some particularly vulnerable people have already died because of this, so it isn't an imaginary threat.

miyoji 2 hours ago | parent | next [-]

> So this isn't "accommodating foibles" with the machine, it's protecting ourselves from an exploit of a human vulnerability: we subconsciously tend to attribute intent, understanding, judgment, emotions, moral agency, etc. to LLMs.

Right, I'm saying that this framing is backwards. It's not that poor little humans are vulnerable and we need to protect ourselves on an individual level, we need to make it illegal and socially unacceptable to use AI to exploit human vulnerability.

Let me put it another way. Humans have another weakness, that is, we are made of carbon and water and it's very easy to kill us by putting metal through various fleshy parts of our bodies. In civilized parts of the world, we do not respond to this by all wearing body armor all the time. We respond to this by controlling who has access to weapons that can destroy our fleshy bits, and heavily punishing people who use them to harm another person.

I don't want a world where LLM use is so normalized that everyone has to wear the equivalent of body armor to protect themselves. I want a world where I can go outside in a T-shirt and not be afraid of being shot in the heart.

jimbokun an hour ago | parent [-]

Ah, I see, you are not American.

In the US we don't have the luxury of believing our governments will act in the interests of the voters.

semiquaver 2 hours ago | parent | prev | next [-]

> That uncanny valley is there for a reason: to protect us from inferring agency

You’re committing a much older but related sin here: assigning agency and motivation to evolutionary processes. The uncanny valley is the product of evolution and thus by definition it has no “purpose”.

TimTheTinker an hour ago | parent | next [-]

I reject the premise that the universe, the earth, and human existence are without purpose. It's one premise among several, and not one I subscribe to.

At least 80% of people agree with me, so I'm not holding to a fringe idea.

jplusequalt 16 minutes ago | parent | next [-]

>At least 80% of people agree with me, so I'm not holding to a fringe idea.

Appeal to majority much?

semiquaver an hour ago | parent | prev [-]

I said no such thing as that the universe has no purpose. Merely that, in a scientific sense, evolution has no motivation. It is an emergent phenomenon that tends to maximize reproductive fitness and cannot be said to do anything for a reason. Saying otherwise is simply anti-science.

skirmish an hour ago | parent | prev [-]

> is the product of evolution and thus by definition it has no “purpose”

But like most things that arose through evolution, it presumably helped at least some individuals survive until sexual maturity and successful procreation.

semiquaver an hour ago | parent [-]

Agreed. That’s far from what the parent said, though, which was that the uncanny valley has a “purpose”.

ButlerianJihad an hour ago | parent | prev | next [-]

> You know it's just sugar,

That is not the definition of a placebo.

You take the placebo (whatever it is: could be a pill; could be some kind of task or routine) and you believe it is medicine; you believe it to be therapeutic.

The placebo effect comes from your faith, your belief, and your anticipation that it will heal.

If the pharmacist hands you a pill and says, “here, this placebo is sugar!” they have destroyed the effect from the start.

Once in the ER I heard the physicians preparing to administer “Obecalp”, which is a perfectly cromulent “drug brand”, but also unlikely to alert a nearby patient to its true nature.

the_af an hour ago | parent | next [-]

> That is not the definition of a placebo.

But, puzzlingly enough, it is the definition of an open-label placebo, in which the patient is told they've been given a placebo. Some studies show a non-negligible effect there as well, albeit smaller (and less conclusive) than with a blinded placebo.

IAmBroom 44 minutes ago | parent | prev [-]

One, a placebo does not need to be given blindly. A sugar pill is a placebo, even if the recipient knows about it.

An actual definition: "A placebo is an inactive substance (like a sugar pill) or procedure (like sham surgery) with no intrinsic therapeutic value, designed to look identical to real treatment." No mention of the user's belief.

Two, real hard data proves that the placebo effect remains (albeit reduced) even if the recipient knows about it. It's counter-intuitive, but real.

soco 2 hours ago | parent | prev [-]

Rubber duck debugging, now with droughts.