AndrewKemendo 8 hours ago

> I've seen an interesting behavior in India. If I ask someone on the street for directions, they will always give me an answer, even if they don't know. If they don't know, they'll make something up.

Isn’t this the precise failure pattern that everybody shits on LLMs for?

chrisjj 5 hours ago | parent | next [-]

Only on the surface. The difference is that the LLM doesn't know it doesn't know. An LLM provides the best solution it has, regardless of whether that solution is in any way fit for purpose.

DominikPeters 4 hours ago | parent [-]

If you inspect the Chain of Thought summaries, the LLM often knows full well what it is doing.

chrisjj 3 hours ago | parent [-]

That's not knowing. That's just parroting in smaller chunks.

koliber 7 hours ago | parent | prev | next [-]

Yes.

fakedang 7 hours ago | parent [-]

Hence proved

AGI = A Guy/Gal in India

soco 6 hours ago | parent | next [-]

Ah so that's what Anthropic's Amodei meant when saying AGI was attained - they actually reached that guy/gal.

direwolf20 6 hours ago | parent [-]

Perhaps they meant detained.

Nevermark 5 hours ago | parent [-]

Hopefully, retained. But, a tained for sure.

bicepjai 3 hours ago | parent | prev [-]

This is really funny.

melvinmelih 6 hours ago | parent | prev [-]

--

AndrewKemendo 6 hours ago | parent [-]

Almost like…technology embeds the latent behaviors of the data that produced it!

Imagine that

Someone should really write a paper on that (hint: it’s the entire basis of information theory)
