devin 6 days ago
There is no prompt which causes an LLM to "think".
pessimizer 5 days ago | parent
Who cares about semantics? Define what thinking means in a human. I did computer engineering; I know how a computer works, and I also know how an LLM works. Call it what you want, if calling it "thinking" makes you emotional. I think it's better to accept that people can install their thinking into a machine, and that the machine will continue that thought independently. This is true for a valve that lets off steam when the pressure gets too high, so it is certainly true for an LLM.

I really don't understand the authenticity babble; it seems ideological, or even religious. I'm not friends with a valve or an LLM. They're thinking tools, like calculators and thermostats. To me, arguing about whether they "think" is like arguing about whether an argument is really "tired" or a book is really "expressing" something. Or, for that matter, whether the air conditioner "turned itself off" or the baseball "broke" the window.

Also, I think what you meant to say is that there is no prompt that causes an LLM to think. When you put "think" in quotation marks, it is hard to tell whether you are using scare quotes or quoting me, which makes the sentence ambiguous. I understand the ambiguity. Call it what you want.
mythrwy 6 days ago | parent
A good way to test this is to challenge an LLM to a debate. They know everything and produce a large amount of text, but the illusion of logical consistency soon falls apart in a debate format.