jhanschoo a day ago

Let's make this more concrete than talking about "understanding knowledge". Oftentimes I want to know something that cannot feasibly be arrived at by reasoning alone, only empirically. Remaining within the language domain, LLMs get so much more useful when they can search the web for news, or search your codebase to learn how it is organized. Similarly, you need a robot that can interact with the world and reason from newly collected empirical data in order to answer these empirical questions, if the work hasn't already been done.
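
As a rough sketch of what "search the web / search the codebase" looks like mechanically, here is a minimal tool-use loop in Python. Everything in it (search_web, search_codebase, call_model, the message format) is a hypothetical stand-in, not any particular provider's API:

    # Minimal sketch of an LLM tool-use loop. All names are hypothetical
    # stand-ins, not a real provider's API.

    def search_web(query: str) -> str:
        # Stand-in for a real web-search backend.
        return f"(top results for {query!r})"

    def search_codebase(query: str) -> str:
        # Stand-in for e.g. grep or embedding search over a repo.
        return f"(files matching {query!r})"

    TOOLS = {"search_web": search_web, "search_codebase": search_codebase}

    def call_model(messages: list[dict]) -> dict:
        # Stand-in for the LLM call. A real model decides whether to
        # answer directly or request a tool; this fake does one round-trip.
        if any(m["role"] == "tool" for m in messages):
            return {"content": "answer grounded in " + messages[-1]["content"]}
        return {"tool": "search_web", "args": {"query": messages[0]["content"]}}

    def answer(question: str) -> str:
        messages = [{"role": "user", "content": question}]
        for _ in range(5):  # bound the number of tool round-trips
            reply = call_model(messages)
            if "tool" not in reply:
                return reply["content"]  # model answered directly
            result = TOOLS[reply["tool"]](**reply["args"])
            messages.append({"role": "tool", "content": result})
        return "stopped after too many tool calls"

    print(answer("what happened in the news today?"))

The point of the loop is the empirical step: the answer ends up grounded in freshly fetched data rather than in whatever the model memorized at training time.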

skydhash a day ago

> LLMs get so much more useful when they can search the web for news, or your codebase to know how it is organized

But their usefulness is only surface-deep. The news that matters to you is always deeply contextual; it's not only what's labelled breaking news or happening near you. The same is true of code organization: the reasons behind it are more about human nature (how we think and learn) than machine optimization (the compiler usually doesn't care).

awesome_dude a day ago

I know the attributes of an apple; I know the attributes of a pear.

As does a computer.

But only I can bite into one and know, without any doubt, what it is and how it feels emotionally.

scrubs a day ago

You have half a point. "Without any doubt" is merely the tip of a huge, undefined iceberg.

I say half because eating is multimodal and consequential. The LLM can read the menu, but it didn't eat the meal. Even humans are bounded: feeling, licking, smelling, or even eating the menu still is not eating the meal.

There is an insuperable gap in the analogy: a gap between the concept and the sensory data that grounds it.

Back to the first point: what one knows through that sensory data is not clear at present, and may not even be possible with LLMs.

awesome_dude a day ago

I'm thinking more, also, of how I feel about the taste.

zaphirplane a day ago

We've segued into consciousness and individuality.