chipgap98 4 hours ago

Is this any different than people who believe random things they read on sketchy news sites or social media?

atomicnumber3 4 hours ago | parent | next [-]

Yes, somehow. I have been dealing with an awful lot of people who basically have what are theoretically logic degrees who suddenly just take LLMs at face value, or quote them to me like that actually means anything. People I formerly thought were sane.

sodapopcan 4 hours ago | parent | next [-]

I don't mean to put words in your mouth, but from what I've seen, in person but mostly online, the "problem" (and I put that in quotes because I don't even know what to call it... it seems deeper than a mere "problem") is that they quote them as if they were autonomous, sentient beings.

BobbyTables2 3 hours ago | parent | next [-]

The problem is that LLM output looks like a human conversation. People believe it.

Which is more believable?

“The sky is filled with a downpour of squealing pigs. Would you like me to suggest the best type of umbrella?”

“Sky pigs squealing”

sodapopcan 2 hours ago | parent | next [-]

People just don't like to be played for fools. Perhaps our giving in to this is progress? I'd give a big ol' "fuck you" to anyone who claims it is, but I'm also pretty old.

yammosk 2 hours ago | parent | prev [-]

I am not sure I would even say "believe"; I would think of it more as short-circuiting our critical thinking. I think it taps into something at the core of our tribal instincts. It was famously present even in basic systems like ELIZA. And it's not just machines... The same tricks are used by conmen, politicians, and psychopaths, which sounds more negative than I intend. Even with good intentions and positive outcomes, I feel we need to remember that we drive it, not the other way around.

al_borland 4 hours ago | parent | prev [-]

Some of this might depend on the source.

I’ve seen some people quote AI like you’re saying. However, when I preface something with “ChatGPT said…”, my intention is to convey to the listener that they should take it with a grain of salt, as it might be complete bullshit. I suppose I should consider who I’m talking to when I make that assumption.

jazzyjackson 2 hours ago | parent [-]

it’s a slightly orthogonal problem to using the active voice of “XYZ says…”: it’s treating the text-continuation engine as an “other” that may know better than they do, playing into sci-fi conceptions of AI having its own personal positronic brain or whatever, having its own ideas and deciding to carve a horse out of driftwood.

It’s not quite anthropomorphizing either that’s the issue; we need a word for “treating it as though it were a machine consciousness that exists alongside humanity*”. How does cyborgropomorphizing sound?

   * and not merely a markov chain running in Sam Altman’s closet
ACow_Adonis 4 hours ago | parent | prev | next [-]

Surely the correct conclusion is to question the value/veracity of those degree issuing institutions and rituals?

And if you previously were unaware of the insanity and irrationality passing under the surface of such human activity, I guess it can come as a bit of a shock :)

heliumtera 4 hours ago | parent | prev | next [-]

>take llms at face value

It happened with science, politics, traditional media, history books, "good engineering practices" applied to IT, OOP, TDD, DDD, server-side rendering, containerization... Literally every piece of bullshit shilled to the moon is accepted without second-guessing, and you would be without a job, in an asylum, for questioning two of them in a row.

Why is it different now? EVERYTHING is bullshit, only attention matters. And craftsmanship.

soopypoos 3 hours ago | parent [-]

and ruthless efficiency

cindyllm 3 hours ago | parent [-]

[dead]

cookiengineer 2 hours ago | parent | prev [-]

I don't think this has anything to do with sanity. This has to do with people seeking self-confirmation instead of disproof.

For pretty much everything there is a conspiracy theory out there claiming the opposite, and these types usually started out searching the internet for someone else who believes the same that they did at the time.

But, as we all know, this technique will eventually lead to overfitting. And that's what those types of people have done to themselves.

Well, and since lack of education is the weakness of democracy, there are a lot of interested parties out there investing money in these types of conspiracy websites. Even more so after LLMs.

Whoever controls the news controls the perpetual present, where everything is independent of forgotten history.

basilikum 4 hours ago | parent | prev | next [-]

Yes, I think AI bots are more compelling to some people. They break the concept of judging information by its source because they obscure the source. At the same time, they are trained on a lot of reputable sources and can say a lot of very smart things, yet at other times they say complete BS. But they are really good at making things sound plausible; that's essentially how they work, after all.

chipgap98 4 hours ago | parent [-]

I would argue that for many people, social media and news aggregators do the exact same thing. People cite Instagram or TikTok as the source of their data in the same way others cite ChatGPT.

BobbyTables2 3 hours ago | parent [-]

Many skip a step and just take the headline as the source…

ares623 4 hours ago | parent | prev | next [-]

Absolutely. These things are marketed as such by virtually everyone, including people historically considered experts and/or authorities.

sbinnee 4 hours ago | parent [-]

With their PhD-level intelligence, right? But they have no emotion, no responsibility, no consequences whatsoever if anything bad happens.

smohare 2 hours ago | parent | prev [-]

[dead]