bayindirh 7 days ago

From my perspective, the fundamental problem arises from the assumption that all of the brain's functions are self-contained; however, there are feedback loops in the body which support the functions of the brain.

The simplest one is fight/flight/freeze. The brain starts the process by registering fear and hormones get released, but the next step is triggered by the nerve feedback coming from the body. If you are on beta-blockers and the body can't mount a panic response, the initial trigger fizzles and you return to your pre-panic state.

An LLM doesn't model a complete body. It just models language, which is only a very small part of what the brain handles, so assuming that modelling language, or even the whole brain, will answer all the questions we have is a flawed approach.

Recent research shows the body is a much more complicated and interconnected system than we learned in school 30 years ago.

mft_ 7 days ago | parent [-]

Sure, your points about the body aren't wrong, but (as you say) LLMs are only modelling a small subset of a brain's functions at the moment: applied knowledge, language/communication, and recently interpretation of visual data. There's no need or opportunity for an LLM (as they currently exist) to do anything further. Moreover, just because additional inputs exist in the human body (the gut-brain axis, for example), it doesn't mean they are especially (or at all) relevant for knowledge/language work.

TheOtherHobbes 7 days ago | parent [-]

The point is that knowledge/language work can't work reliably unless it's grounded in something outside of itself. Without that grounding you don't get an oracle; you get a superficially convincing but fundamentally unreliable idiot savant with no stable sense of self, other, or the real world.

The fundamental foundation of science and engineering is reliability.

If you start saying reliability doesn't matter, you're not doing science and engineering any more.

mft_ 7 days ago | parent [-]

I'm really struggling to understand what you're trying to communicate here; I'm even wondering if you're an LLM set up to troll, due to the weird language and confusing non-sequiturs.

> The point is that knowledge/language can't work reliably unless it's grounded in something outside of itself.

Just, what? Knowledge is facts, somehow held within a system allowing recall and usage of those facts. Knowledge doesn't have a 'self', and I'm totally not understanding how pure knowledge as a concept or medium needs "grounding"?

Being charitable, it sounds more like you're trying to describe "wisdom" - which might be considered a combination of knowledge, lived experience, and good judgement? Yes, that is valuable in applying knowledge more usefully, but it has nothing to do with the other bodily systems which interact with the brain, which is where you started?

> The fundamental foundation of science and engineering is reliability.

> If you start saying reliability doesn't matter, you're not doing science and engineering any more.

No-one mentioned reliability - not you in your original post, nor me in my reply. We were discussing whether the various (unconscious) systems which link to the brain in the human body (like the gut-brain axis) might influence its knowledge/language/interpretation abilities.