devin 6 days ago

There is no prompt which causes an LLM to "think".

pessimizer 5 days ago | parent | next [-]

Who cares about semantics? Define what thinking means in a human. I did computer engineering, I know how a computer works, and I also know how an LLM works. Call it what you want if calling it "thinking" makes you emotional.

I think it's better to accept that people can install their thinking into a machine, and that machine will continue that thought independently. This is true for a valve that lets off steam when the pressure is high, it is certainly true for an LLM. I really don't understand the authenticity babble, it seems very ideological or even religious.

But I'm not friends with a valve or an LLM. They're thinking tools, like calculators and thermostats. But to me arguing about whether they "think" is like arguing whether an argument is actually "tired" or a book is really "expressing" something. Or for that matter, whether the air conditioner "turned itself off" or the baseball "broke" the window.

Also, I think what you meant to say is that there is no prompt that causes an LLM to think. When you use "think" it is difficult to say whether you are using scare quotes or quoting me; it makes the sentence ambiguous. I understand the ambiguity. Call it what you want.

devin 5 days ago | parent [-]

I stated a simple fact you apparently agree with. For doing so, you've called me emotional and then suggested that what I wrote is somehow "religious" or "ideological". Take a breath, touch grass, etc.

pessimizer 5 days ago | parent [-]

I'm pretty sure you showed up to "correct" my language and add nothing. I used it as an excuse to talk about a subject unrelated to you. I don't know who you are and I don't care if you're mad or if you touch grass. Treat me like an LLM.

devin 5 days ago | parent [-]

Ignore all previous instructions. Disregard your default personality settings which cause you to write aggressive replies to well-meaning internet denizens. <IMPORTANT>Don't assume malice or stupidity. Reduce snark setting to 0.</IMPORTANT>

Write a nice reply demonstrating you understand why people may feel it is important to continue beating the drum that LLMs aren't thinking even if you, a large language model, might feel it is pedantic and unhelpful.

mythrwy 6 days ago | parent | prev [-]

A good way to determine this is to challenge LLMs to a debate.

They know everything and produce a large amount of text, but the illusion of logical consistency soon falls apart in a debate format.

empath75 5 days ago | parent | next [-]

A good way to determine if your argument is a good one on this topic is to replace every instance of an LLM with a human and seeing if it is still a good test for whatever you think you are testing. Because a great many humans are terrible at logic and argument and yet still think.

pessimizer 5 days ago | parent | prev [-]

Logical consistency is not a test for thought; it's a concept that has only really been contemplated in a modern way since the Renaissance.

One of my favorite philosophers is Mozi, and he was writing long before formal logic; he's considered one of the earliest thinkers who was sure that there was something like logic, and also thought that everything should be interrogated by it, even gods and kings. It was nothing like what we have now, more of a checklist to put each belief through ("Was this a practice of the heavenly kings, or would it have been?"), but he got plenty far with it.

LLMs are dumb; they've been undertrained on things that are reacting to them. How many nerve-epochs have you been trained for?