theptip a day ago

> LLMs seem to understand language therefore they've trained a model of the world.

This isn’t the claim, obviously. LLMs seem to understand a lot more than just language. If you’ve worked with one for hundreds of hours, actually exercising frontier capabilities, I don’t see how you could think otherwise.

ekjhgkejhgk 10 hours ago | parent [-]

> This isn’t the claim, obviously.

This is precisely the claim that leads a lot of people to believe that all you need to reach AGI is more compute.

theptip 9 hours ago | parent | prev [-]

What I mean here is that this is certainly not what Dwarkesh would claim. It’s a ludicrous strawman position.

Dwarkesh is AGI-pilled and would base his assumption of a world model on much more impressive feats than mere language understanding.

baobun 6 hours ago | parent [-]

Watching the video, it seems that Dwarkesh doesn't really have a clue what he's confidently talking about, yet runs fast with his personal half-baked ideas, to the point where it gets both confusing and cringe when Karpathy apparently manages to make sense of it and yes-ands the word salad. Karpathy is supposedly there to clear up misunderstandings, yet he lets all the nonsense Dwarkesh puts before him slide.

"ludicrous" sure but I wouldn't be so certain about "strawman" or that Dwarkesh has a consistent view.