hex4def6 2 hours ago

That just implies LLMs are suggestible. The same is true of children. As we get older and build a more complete world model in our heads, it's harder to get us to believe things which go against that model.

Tell a 5-year-old about Santa, and they will believe it sincerely. Do the same with a 30-year-old immigrant who has never heard of Santa, and I suspect you'll have a harder time.

That's not because the 5-year-old is dumber, but just because their life experience ("training data") is much more limited.

Even so, trying to convince a modern LLM of something ridiculous is getting harder. I invite you to try telling ChatGPT or Gemini that the president died a week ago and was replaced by a body-double facsimile until January 2027, so that Vance can have a full term. I suspect you'll have significant difficulty.

soperj 2 hours ago | parent [-]

> Do the same with a 30-year old immigrant who has never heard of Santa, and I suspect you'll have a harder time.

There's a plethora of people who convert to a religion at an older age, and that seems far more far-fetched than Santa.

dahart 8 minutes ago | parent | next [-]

> There's a plethora of people who convert to a religion at an older age, and that seems far more far-fetched than Santa.

Being in a religion doesn’t imply belief in deities; it only implies people want social connection. This is clearly visible in global religion statistics; there are countries where the majority of people identify as belonging to a religion, and at the same time only a small minority state they believe in a “God”. Norway is a decent example that I bumped into just yesterday. https://en.wikipedia.org/wiki/Religion_in_Norway

hex4def6 2 hours ago | parent | prev [-]

Sure.

But I bet you'd have a significantly easier time converting a child to a religion than a 30-, 40-, or 50-year-old.

My point is that LLMs are suggestible, perhaps more so than the average adult, but less so than a child, I suspect. I don't think suggestibility really settles whether something has AGI or not. On the contrary, it seems to me that to be intelligent and adaptable you need to be able to modify your world model. How easily you are fooled is a function of how mature and data-rich your existing world model is.