ben_w 16 hours ago
I broadly agree with your point, but would also draw attention to something I've observed:

> LLMs will NEVER become true AGI. But do they need to? No, of course not!

Everyone disagrees about the meaning of each of the three letters of the initialism "AGI", disagrees about the compound whole, and often argues it means something different from the simple meaning of those words taken separately. Even on this website, "AGI" means anything from "InstructGPT" (the precursor to ChatGPT) to "Biblical God", or, even worse than "God" given this is a tech forum, "can solve provably impossible tasks such as the halting problem".
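
(As an aside, "provably impossible" is doing real work there. Here's a minimal sketch of Turing's diagonalization argument in Python; `halts` is a hypothetical oracle that, per the argument itself, cannot actually be implemented:)

    # Suppose a perfect halting oracle existed. `halts` is
    # hypothetical; the argument below shows no such total,
    # correct function can be written.
    def halts(program, argument) -> bool:
        """Return True iff program(argument) eventually halts."""
        raise NotImplementedError  # no general implementation exists

    def paradox(program):
        # Do the opposite of whatever the oracle predicts about
        # running `program` on itself.
        if halts(program, program):
            while True:  # oracle said "halts", so loop forever
                pass
        return  # oracle said "loops", so halt immediately

    # Feeding paradox to itself is a contradiction either way:
    # if halts(paradox, paradox) is True, paradox(paradox) loops;
    # if False, it halts. So no correct `halts` can exist.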
OtomotO 14 hours ago
Well, I go by the definition I was brought up with and am not interested in redefining words all the time. A true AGI is basically Skynet or the Basilisk ;-)