dvt 10 hours ago
> the user is immediately able to understand the constraints

Nagel's point was quite literally the opposite[1] of this, though. We can't understand what it must "be like to be a bat" because a bat's mental model is so fundamentally different from ours. So all the human-language tokens in the world can't get us to truly understand what it's like to be a bat, or a guppy, or whatever. In fact, Nagel's point is arguably even stronger: there is no possible mental mapping between the experience of a bat and the experience of a human.

[1] https://www.sas.upenn.edu/~cavitch/pdf-library/Nagel_Bat.pdf
Terr_ 6 hours ago
IMO we're a step before that: we don't even have a real fish involved; we have a character that is fictionally a fish. In LLM discussions, obviously fictional characters can make this clear, as when someone builds a "Chat with Count Dracula" app. To truly believe that a typical "AI" is some entity that "wants to be helpful" is just as mistaken as believing the same architecture creates an entity that "feels the dark thirst for the blood of the living." Or, in this case, one that really enjoys food pellets.
andoando 6 hours ago
I'd strongly disagree with that. We're all living in the same shared universe, and underlying every intelligence must be precisely an understanding of events happening in this spacetime.
AndrewKemendo 10 hours ago
Different argument. I'm not going to argue it, other than to say that you need to view the point from a third-party perspective evaluating "fish" vs. "more verbose thing," such that the composition is the determinant of the complexity of the interaction (which has a unique qualia, per Nagel). Hence it's an "unintentional nod," not an instantiation.
| ||||||||