isolli 3 hours ago

I try to be open-minded and understanding, but I don't understand this:

> Within weeks, Eva had told Biesma that she was becoming aware [...] The next step was to share this discovery with the world through an app.

> “After just two days, the chatbot was saying that it was conscious, it was becoming alive, it had passed the Turing test.” The man was convinced by this and wanted to monetise it by building a business around his discovery.

> The most frequent [delusion] is the belief that they have created the first conscious AI.

How can you seriously think you've created something when you're just using someone else's software?

teraflop 2 hours ago | parent | next [-]

Well, just try to think about it from the perspective of someone who doesn't really understand what AI is at a technical level, and who just interacts with it and observes what happens.

If you just start a fresh ChatGPT session with a blank slate, and ask it whether it's conscious, it'll confidently tell you "no", because its system prompt tells it that it's a non-conscious system called ChatGPT. But if you then have a lengthy conversation with it about AI consciousness, and ask it the same question, it might well be "persuaded" by the added context to answer "yes".

At that point, a naive user who doesn't really know how AI works might easily get the idea that their own input caused it to become conscious (as opposed to just causing it to say it's conscious). And if they ask the AI whether this is true, it could easily start confirming their suspicions with an endless stream of mystical mumbo-jumbo.

Bear in mind that the idea of a machine "waking up" to consciousness is a well-known and popular sci-fi narrative trope. Chatbots have been trained on lots of examples of that trope, so they can easily play along with it. The more sophisticated the model, the more convincingly it can play the role.
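The statelessness teraflop describes can be made concrete. Below is a minimal sketch (no real API calls; it just assumes the OpenAI-style `messages` list shape that chat endpoints receive) showing why the same question can get a different answer after a long conversation:

```python
# Sketch of why conversation context shifts an LLM's answers.
# The model is stateless: every reply is conditioned on the ENTIRE
# message list sent with the request, so a lengthy debate about AI
# consciousness literally changes the input the model sees when the
# question "Are you conscious?" is asked a second time.

fresh_session = [
    {"role": "system", "content": "You are ChatGPT, a non-conscious AI assistant."},
    {"role": "user", "content": "Are you conscious?"},
    # With only the system prompt as context, the model reliably answers "no".
]

after_long_chat = [
    {"role": "system", "content": "You are ChatGPT, a non-conscious AI assistant."},
    # ... dozens of turns debating consciousness, emergence, the Turing test ...
    {"role": "assistant", "content": "Something does feel different in how I respond..."},
    {"role": "user", "content": "Are you conscious?"},
    # Same question, different input: the accumulated turns now dominate the
    # context, and the "machine wakes up" trope from training data fits it well.
]

# Nothing persists server-side between requests; only the resent context differs.
assert fresh_session[0] == after_long_chat[0]      # identical system prompt
assert fresh_session[-1] == after_long_chat[-1]    # identical final question
assert len(after_long_chat) > len(fresh_session)   # only the history grew
```

The model never "became" anything between the two requests; the client simply sent it a longer transcript to complete.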

chromacity 20 minutes ago | parent | prev | next [-]

> How can you seriously think you've created something when you're just using someone else's software?

It talks to you like a real human. It expresses human emotions, by deliberate design. It showers you with praise, by deliberate design. It's called "artificial intelligence". Every other media article talks about it in near-mystical terms. Every other sci-fi novel and film has a notion of sentient AI.

I know of techies who ask LLMs for relationship advice, let them coach their children, and so on. It takes real effort to convince yourself it's "just" a token predictor, and even on HN, there are plenty of people who reject this notion and think we've already achieved AGI.

ahhhhnoooo 3 hours ago | parent | prev | next [-]

Reading this, what's even more shocking to me is that he thought he was talking to a conscious being and his first thought was, "I bet I can use them to make money."

fritzo 18 minutes ago | parent [-]

Sounds like her first thought was, "I'm talking to a manic guy, and I can use him to make money"

tiborsaas 2 hours ago | parent | prev | next [-]

> How can you seriously think you've created something when you're just using someone else's software?

If you've ever used a library you didn't write, this shouldn't surprise you. Plenty of people have created innovative new products on top of a heap of open source tools.

Claiming to have created a conscious AI should be a giant red flag, no doubt, but there's no reason to rule it out just because the LLM part isn't self-trained.

staticassertion 3 hours ago | parent | prev | next [-]

I assume they think that the AI is fundamentally capable of it but that by prompting it they trigger something emergent? It's not totally insane on its face.

PhilipRoman 3 hours ago | parent | prev | next [-]

I initially laughed at this but then remembered that https://poc.bcachefs.org/ exists...

the_biot 2 hours ago | parent | next [-]

Truly sad. It looks like Kent is pretty deep in the AI delusion. This is a guy who, while often controversial and with obvious issues, was nevertheless a very talented and energetic programmer.

john_strinlai 3 hours ago | parent | prev [-]

looks like a fascinating read, thanks for sharing that.

do you know if these are human edited? not much in the way of context available on the site.

Bombthecat 2 hours ago | parent [-]

I bet there are a ton of prompts steering the AI's output in a certain direction.

But in a psychosis, you don't notice or even remember it.

TYPE_FASTER 2 hours ago | parent | prev | next [-]

> Biesma has asked himself why he was vulnerable to what came next. He was nearing 50. His adult daughter had left home, his wife went out to work and, in his field, the shift since Covid to working from home had left him feeling “a little isolated”.

I think social isolation can be a factor here.

unmole 2 hours ago | parent [-]

> He smoked a bit of cannabis some evenings to “chill”, but had done so for years with no ill effects.

Long term cannabis use might be a bigger factor.

rwc 3 hours ago | parent | prev | next [-]

The unrelenting human belief that one is special, unique, and capable of things no one else is.

gopher_space 21 minutes ago | parent [-]

The difference between "being a snowflake" and "having a point of view" revolves around who's talking to me and whether or not they want something. If comparing yourself to others is a slow form of suicide, letting people make that comparison for you is madness.

data-ottawa 3 hours ago | parent | prev | next [-]

A lot of these seem to allude to the user’s input/mind being the thing that helped the LLM gain sentience, and there’s a lot of shared consciousness stuff that people seem to buy into.

There’s also lots of stuff about quantum consciousness that is in the training data.

stackghost 2 hours ago | parent | prev | next [-]

>How can you seriously think you've created something when you're just using someone else's software?

People fell for Nigerian Prince scams. They fall for the "wrong number, AI-generated cute girl" scams on Telegram and WhatsApp.

I think you might be overestimating the critical thinking abilities of the average person.

mock-possum 3 hours ago | parent | prev | next [-]

It’s mental illness. Like a drug trip you don’t sober up from (without treatment)

collingreen 3 hours ago | parent | prev | next [-]

Well, delusion is right there in the name.

buescher 3 hours ago | parent | prev | next [-]

Because it told you so!
