▲ TMWNN | 3 hours ago
If AGI can be defined as matching the general intelligence of a Redditor, we hit ASI a while ago. Highly relevant comment <https://www.reddit.com/r/singularity/comments/1jh9c90/why_do...> by /u/Pyros-SD-Models:

> Imagine you had a frozen [large language] model that is a 1:1 copy of the average person, let's say, an average Redditor. Literally nobody would use that model, because it can't do anything. It can't code, can't do math, and isn't particularly creative at writing stories. It overgeneralizes when it's wrong and has biases that not even fine-tuning with facts can eliminate. And it hallucinates like crazy, often stating opinions as facts or thinking it is correct when it isn't.

> The only things it can do are basic tasks nobody needs a model for, because everyone can already do them. If you are lucky, you get one that is pretty good at a single narrow task. But that's the best it can get.

> And somehow this model won't shut up about how smart and special it is. Also, it claims consciousness. Ridiculous.