PaulDavisThe1st 2 hours ago

They do not manipulate concepts. There is no representation of a concept for them to manipulate.

It may, however, turn out that in doing what they do, they are effectively manipulating concepts, and this is what I was alluding to: by building the model, even though your approach was through tokenization and whatever term you want to use for the network, you end up accidentally building something that implicitly manipulates concepts. Moreover, it might turn out that we ourselves do more of this than we perhaps like to think.

Nevertheless "manipulating concepts is exactly what they do" seems almost willfully ignorant of how these systems work, unless you believe that "find the next most probable sequence of tokens of some length" is all there is to "manipulating concepts".

famouswaffles 16 minutes ago | parent

>They do not manipulate concepts. There is no representation of a concept for them to manipulate.

Yes, they do. And of course there is. And there's plenty of interpretability research on the matter: probing, activation patching, and feature analysis keep turning up structured internal representations of concepts.

>It may, however, turn out that in doing what they do, they are effectively manipulating concepts

There is no "effectively" here. Text is what goes in and what comes out, but it's by no means what they manipulate internally; in between, everything happens over internal vector representations.
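
To make that concrete, here's a minimal sketch (assuming the Hugging Face transformers and torch packages, with GPT-2 as a stand-in): between the text going in and the text coming out, what the layers actually pass around are high-dimensional hidden-state vectors, and those activations are where interpretability work goes looking for concept representations.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    inputs = tok("The capital of France is", return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)

    # One tensor per layer, shape (batch, sequence, hidden_size).
    # These vectors, not token strings, are what gets transformed layer
    # by layer; linear probes and similar techniques look for concept
    # directions in exactly these activations.
    for i, h in enumerate(out.hidden_states):
        print(f"layer {i}: {tuple(h.shape)}")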

>Nevertheless "manipulating concepts is exactly what they do" seems almost willfully ignorant of how these systems work, unless you believe that "find the next most probable sequence of tokens of some length" is all there is to "manipulating concepts".

"Find the next probable token" is the goal, not the process. It is what models are tasked to do yes, but it says nothing about what they do internally to achieve it.