ornornor 4 days ago

What I mean is that the current generation of LLMs doesn't understand how concepts relate to one another, which is why they're so bad at maths, for instance.

Markov chains can’t deduce anything logically. I can.

astrange 4 days ago | parent | next [-]

> What I mean is that the current generation of LLMs doesn't understand how concepts relate to one another.

They must be able to do this implicitly; otherwise, why are their answers related to the questions you ask them instead of being completely off-topic?

https://phillipi.github.io/prh/

A consequence of this is that you can steal a black-box model by sampling enough answers from its API, because from enough samples you can reconstruct the original model's output distribution.
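
A minimal sketch of that extraction idea in Python; here query_api is a hypothetical stand-in for a black-box completion endpoint that returns one sampled token per call:

    from collections import Counter

    def estimate_next_token_distribution(query_api, prompt, n_samples=1000):
        # Monte Carlo estimate of the black-box model's next-token
        # distribution: sample repeatedly and normalize the counts.
        # query_api is hypothetical, not a real library call.
        counts = Counter(query_api(prompt, max_tokens=1) for _ in range(n_samples))
        total = sum(counts.values())
        return {token: count / total for token, count in counts.items()}

    # usage (hypothetical): dist = estimate_next_token_distribution(query_api, "The sky is")

Repeating this over enough prompts approximates the conditional distributions that define the model's behavior, which is the sense in which the distribution can be reconstructed from samples alone.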

oasisaimlessly 4 days ago | parent | prev | next [-]

The definition of 'Markov chain' is very wide. If you adhere to a materialist worldview, you are a Markov chain. [Or maybe the universe viewed as a whole is a Markov chain.]
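
To make that concrete, here is a toy Python sketch (the transition probabilities are made up) showing that an autoregressive LM satisfies the Markov property once you treat the entire context window as the state:

    import random

    def markov_step(state, transition):
        # One Markov step: the next state depends only on the
        # current state, never on earlier history.
        next_states, weights = zip(*transition(state).items())
        return random.choices(next_states, weights=weights)[0]

    def lm_transition(state):
        # Hypothetical next-token probabilities given the full context;
        # appending a token to the context yields the new state.
        probs = {"a": 0.5, "b": 0.5}
        return {state + (token,): p for token, p in probs.items()}

    state = ("<s>",)
    for _ in range(5):
        state = markov_step(state, lm_transition)
    print(state)  # e.g. ('<s>', 'a', 'b', 'a', 'a', 'b')

The "very wide" point is exactly this: nothing in the definition restricts how large or structured the state can be.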

anticrymactic 4 days ago | parent | prev | next [-]

> Which is why they're so bad at maths, for instance.

I don't think LLMs are currently intelligent. But please show a GPT-5 chat where it gets a math problem wrong that most "intelligent" people would get right.

sindercal 4 days ago | parent | prev [-]

You and Chomsky are probably the last two people on earth who believe that.

coldtea 4 days ago | parent | next [-]

It wouldn't matter if they are both right. Social truth is not reality, and scientific consensus is not reality either; it's just a good proxy for "is this true?", and it has been shown to be wrong many times, at least by a later consensus, if not by objective experiments.

red75prime 4 days ago | parent | prev [-]

Nah. There are whole communities that maintain a baseless but utterly confident dismissive stance. Look at /r/programming, for example.