jmcgough 4 days ago

Humans assume that being able to produce meaningful language is indicative of intelligence, because until LLMs, the only way to produce it was through human intelligence.

notahacker 4 days ago | parent | next [-]

Yep. Although the average human also considered proficiency in mathematics to be indicative of intelligence until we invented the pocket calculator, so maybe we're just not smart enough to define what intelligence is.

creata 3 days ago | parent [-]

Sorry if I'm being pedantic, but I think you mean arithmetic, not mathematics in general.

Izkata 3 days ago | parent | prev [-]

Not really, we saw this decades ago: https://en.wikipedia.org/w/index.php?title=ELIZA_effect

creata 3 days ago | parent [-]

I don't think I'm falling for the ELIZA effect.* I just feel like if you have a small enough model that can accurately handle a wide enough range of tasks, and is resistant to a wide enough range of perturbations to the input, it's simpler to assume it's doing some sort of meaningful simplification inside there. I didn't call it intelligence.

* But I guess that's what someone who's falling for the ELIZA effect would say.