hex4def6 2 hours ago

Sure.

But I bet you'd have a significantly easier time converting a child to a religion than a 30/40/50-yr-old.

My point is that LLMs are suggestible, perhaps more so than the average adult, but less so than a child, I suspect. I don't think suggestibility really settles whether something is AGI or not. On the contrary, it seems to me that being intelligent and adaptable requires being able to modify your world model. How easily you are fooled is a function of how mature and data-rich your existing world model is.