shkkmo 5 days ago
> I've never seen someone state, as fact, that LLMs are AGI before now.

Then you probably haven't been paying attention: https://deepmind.google/research/publications/66938/

Many LLMs are AIs that weren't designed or trained to solve a narrow problem scope. They can complete a wide range of tasks with varying levels of proficiency. That makes them artificial general intelligence, or AGI.

You are confused because lots of people use "AGI" as shorthand for "human-level" AGI that isn't limited to a narrow problem scope. It's not wrong to use the term this way, but it is ambiguous and vague. Even the term "human level" is poorly defined, and if I wanted to use "human-level AGI" in any discussion of what qualifies, I'd need to specify how I was defining it.
parineum 5 days ago
I'm not confused at all. Your own personal definitions just further my point that tech people have a much different classification system than the general populace, and that the need for those excessive classifications exists because ambitious CEOs keep using the term incorrectly in order to increase share prices.

It's actually very funny to me that you state these definitions so authoritatively despite the terms not having any sort of rigor attached to either their definition or usage.