ornornor 4 days ago:
What I mean is that the current generation of LLMs don't understand how concepts relate to one another, which is why they're so bad at maths, for instance. Markov chains can't deduce anything logically. I can.
astrange 4 days ago:
> What I mean is that the current generation of LLMs don't understand how concepts relate to one another.

They must be able to do this implicitly; otherwise, why are their answers related to the questions you ask them instead of being completely off-topic? https://phillipi.github.io/prh/

A consequence of this is that you can steal a black-box model by sampling enough answers from its API, because you can reconstruct the original model's distribution.
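To make the "steal a black-box model" point concrete, here is a minimal sketch of extraction by sampling: query the API repeatedly and treat the empirical answer frequencies as an estimate of the model's output distribution. Everything here is assumed for illustration; query_black_box and its hidden distribution are stand-ins, not a real endpoint.

    # Minimal sketch of model extraction by sampling a black-box API.
    # The "API" below is a stand-in random sampler, not a real service.
    import random
    from collections import Counter

    def query_black_box(prompt: str) -> str:
        """Stand-in for the API call; a real extractor would hit the model's endpoint."""
        hidden = {"Paris": 0.90, "Lyon": 0.07, "Marseille": 0.03}
        tokens, weights = zip(*hidden.items())
        return random.choices(tokens, weights=weights, k=1)[0]

    def estimate_distribution(prompt: str, n_samples: int = 2000) -> dict:
        """Sample the black box repeatedly and normalize the counts."""
        counts = Counter(query_black_box(prompt) for _ in range(n_samples))
        return {tok: c / n_samples for tok, c in counts.items()}

    if __name__ == "__main__":
        est = estimate_distribution("The capital of France is")
        print(est)  # approaches the hidden distribution as n_samples grows

With enough prompts and samples, the reconstructed distributions can then be used to train a student model that imitates the original.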
oasisaimlessly 4 days ago:
The definition of 'Markov chain' is very broad. If you adhere to a materialist worldview, you are a Markov chain. [Or maybe the universe viewed as a whole is a Markov chain.]
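To illustrate how broad that definition is: an autoregressive model with a bounded context window is a Markov chain whose state is the window itself, since the next-token distribution depends only on that state. A toy sketch, where the transition table and window size are invented for the example:

    # Toy illustration: a bounded-context autoregressive model is a Markov
    # chain over windows. The transition table is made up, not a real model.
    import random

    CONTEXT_LEN = 4  # assumed window size for the toy example

    def next_token(state: tuple) -> str:
        """Stand-in transition kernel: P(next token | current state)."""
        table = {
            "the": (["cat", "mat"], [0.6, 0.4]),
            "cat": (["sat", "purred"], [0.7, 0.3]),
            "sat": (["on", "."], [0.8, 0.2]),
            "on": (["the"], [1.0]),
        }
        tokens, weights = table.get(state[-1], (["."], [1.0]))
        return random.choices(tokens, weights=weights, k=1)[0]

    def step(state: tuple) -> tuple:
        """One Markov transition: emit a token, keep only the last CONTEXT_LEN."""
        return (state + (next_token(state),))[-CONTEXT_LEN:]

    if __name__ == "__main__":
        state = ("the", "cat")
        for _ in range(8):
            state = step(state)
            print(" ".join(state))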
anticrymactic 4 days ago:
> Which is why they're so bad at maths for instance.

I don't think LLMs are currently intelligent. But please show a GPT-5 chat where it gets a math problem wrong that most "intelligent" people would get right.
sindercal 4 days ago:
You and Chomsky are probably the last two people on Earth to believe that.