og_kalu 3 days ago

You cannot say, 'we know it's not thinking because we wrote the code' when the inference 'code' we wrote amounts to, 'Hey, just do whatever you figured out during training okay'.

'Power over your computer', all that is orthogonal to the point. A human brain without a functioning body would still be thinking.

almosthere 3 days ago | parent [-]

Well, a model by itself with data that emits a bunch of human-written words is literally no different than what JIRA does when it reads a database table and shits it out to a screen, except maybe a lot more GPU usage.

I grant you that, yes, the data in the model is a LOT more cool, but some team could, by hand, given billions of years (well, probably at least an octillion years), reproduce that model and save it to a disk. Again, no different than data stored in JIRA at that point.

So basically if you have that stance you'd have to agree that when we FIRST invented computers, we created intelligence that is "thinking".

og_kalu 3 days ago | parent | next [-]

>Well, a model by itself with data that emits a bunch of human-written words is literally no different than what JIRA does when it reads a database table and shits it out to a screen, except maybe a lot more GPU usage.

Obviously, it is different or else we would just use JIRA and a database to replace GPT. Models very obviously do NOT store training data in the weights in the way you are imagining.
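
As a rough back-of-envelope sketch (assumed, order-of-magnitude numbers only, not figures for any particular model), the sizes alone make verbatim storage implausible:

    # Assumed, illustrative numbers only.
    params = 100e9            # ~100B parameters
    bytes_per_param = 2       # 16-bit weights
    training_tokens = 10e12   # ~10T tokens of training text
    bytes_per_token = 4       # rough size of a token as plain text

    weight_bytes = params * bytes_per_param          # ~0.2 TB of weights
    data_bytes = training_tokens * bytes_per_token   # ~40 TB of text

    print(data_bytes / weight_bytes)  # ~200x: the weights are far too small
                                      # to hold a verbatim copy of the data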

>So basically if you have that stance you'd have to agree that when we FIRST invented computers, we created intelligence that is "thinking".

Thinking is, by all appearances, substrate independent. The moment we created computers, we created another substrate that could, in the future, think.

almosthere 3 days ago | parent [-]

But LLMs are effectively a very complex if/else if tree:

if the user types "hi", respond with "hi" or "bye" or "..."; you get the point. It's basically storing the most probable following words (tokens) given the current point and its history.
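
A toy Python sketch of that picture (made-up table, purely illustrative; a real model computes the distribution with a neural network rather than storing one per history):

    import random

    # Hand-written "if/else" next-token table: history -> weighted next tokens.
    NEXT_TOKEN_TABLE = {
        ("hi",): {"hi": 0.5, "bye": 0.3, "...": 0.2},
    }

    def next_token(history):
        # Sample the next token given the conversation so far.
        dist = NEXT_TOKEN_TABLE.get(tuple(history), {"...": 1.0})
        tokens, weights = zip(*dist.items())
        return random.choices(tokens, weights=weights)[0]

    print(next_token(["hi"]))  # e.g. "hi" or "bye"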

That's not a brain and it's not thinking. It's similar to JIRA because it's stored information and there are if statements (admins can do this, users can do that).

Yes, it is more complex, but it's nowhere near the complexity of the human or bird brain, which does not use clocks, does not have "Turing machines inside", or any of the other complete junk other people posted in this thread.

The information in Jira is just less complex, but it's in the same vein as the data in an LLM, which is just 10^100 times more complex. Just because something is complex does not mean it thinks.

iainmerrick 2 days ago | parent [-]

This is a pretty tired argument that I don't think really goes anywhere useful or illuminates anything (if I'm following you correctly, it sounds like the good old Chinese Room, where "a few slips of paper" can't possibly be conscious).

>Yes, it is more complex, but it's nowhere near the complexity of the human or bird brain, which does not use clocks, does not have "Turing machines inside", or any of the other complete junk other people posted in this thread.

>The information in Jira is just less complex, but it's in the same vein as the data in an LLM, which is just 10^100 times more complex. Just because something is complex does not mean it thinks.

So, what is the missing element that would satisfy you? It's "nowhere near the complexity of the human or bird brain", so I guess it needs to be more complex, but at the same time "just because something is complex does not mean it thinks".

Does it need to be struck by lightning or something so it gets infused with the living essence?

almosthere 2 days ago | parent [-]

Well, at the moment it needs to be born. Nothing else has agency on this planet. So yes, the bar is HIGH. Just because you have a computer that can count beans FAST does not mean that counting a trillion beans was an important feat. The invention of LLMs led to a lot of very useful software developments. But an LLM is just a large data file that's read in a special way. It has no agency; it does not just start thinking on its own unless it is programmatically fed data. It has to be triggered to do something.

If you want the best comparison, it's closer to a plant: it reacts ONLY to external stimuli (sunlight, water, etc.), but it does not think. (And I'm not comparing it to a plant so you can say, "SEE, you said it's alive!") It's just a comparison.

MrScruff 2 days ago | parent | prev [-]

You're getting to the heart of the problem here. At what point in evolutionary history does "thinking" exist in biological machines? Is a jumping spider "thinking"? What about consciousness?