caditinpiscinam 3 hours ago

Generative AI is the average of all human knowledge

oceansky 2 hours ago | parent | next [-]

More like: the output of genAI is probably the average of the knowledge in all the human training data.

XargonEnder an hour ago | parent [-]

I like this version much better because most people don't write books and AI is much better at writing than the average person, probably even a few standard deviations above the average.

lobofta 2 hours ago | parent | prev | next [-]

Why does it need to be the average? It seems to me that it models the manifold of human knowledge. We often query for the average because that is often good enough and gives quick results, but nothing fundamentally prevents us from sending AI into the deep end of under-explored territory, perhaps to come back with something new. It is ultimately the exploration vs. exploitation trade-off.
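The exploration vs. exploitation trade-off mentioned above can be sketched with a toy epsilon-greedy chooser (a hypothetical illustration; the function and parameter names are made up, not from any real sampling API):

```python
import random

def epsilon_greedy(options, scores, epsilon=0.1):
    """Mostly pick the best-known option (exploitation), but with
    probability epsilon pick a random one (exploration)."""
    if random.random() < epsilon:
        # Explore: wander off into under-explored territory.
        return random.choice(options)
    # Exploit: query for the known-good "average" answer.
    return max(options, key=lambda o: scores[o])
```

LLM sampling temperature plays an analogous role: low temperature concentrates output near the model's "average" answer, while higher temperature trades reliability for novelty.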

cbg0 2 hours ago | parent | prev | next [-]

Not quite. Much of the data going into these models has already been curated; otherwise you would get a tremendous number of wrong answers to even the most basic questions.

Cthulhu_ an hour ago | parent [-]

It still produces wrong answers regardless though, not because of the training data but because of just... intrinsics. The question is what an acceptable error rate is, how severe those errors are, and whether a human would make comparable errors.

But this debate has parallels with self-driving cars; even if the numbers say that self-driving cars are not perfect but safer than human drivers, anything but perfection will be considered broken or outright illegal.

sabas123 2 hours ago | parent | prev | next [-]

It is the average of subsets of humans, which can be vastly better than the average of the entire population.

For example, it can easily do moderately advanced calculus, which is way better than the average human.

andsoitis 2 hours ago | parent | prev | next [-]

While true in some sense, it does have more knowledge than the average person.

customguy 2 hours ago | parent | next [-]

I wonder how much knowledge can be decoupled from experience, if at all.

If I read thousands of books that explain the details of another civilization in another galaxy, very thoroughly and consistently, but it just happens to be all made up - did I gain knowledge? More importantly, does what I have in my brain now flip from being fiction to being knowledge if that civilization flipped from not existing to existing? How so, if nothing in my brain, or in how I live out the rest of my life, changes in the least, and not a single atom in this galaxy changes (let's ignore that gravity has infinite reach and all that, for the sake of argument)?

If yes, how? What in your definition of knowledge makes that possible?

Cthulhu_ an hour ago | parent | next [-]

It's an interesting analogy you're making because... this is the lived reality of a lot of people that are interested in fictional worldbuilding / stories. And it flips to being real in the film Galaxy Quest.

dist-epoch an hour ago | parent | prev [-]

> If I read thousands of books that explain the details of another civilization in another galaxy, very thoroughly and consistently, but it just happens to be all made up - did I gain knowledge?

Sounds a lot like math: made-up entities that fit together very thoroughly and consistently.

oceansky 2 hours ago | parent | prev | next [-]

It also does not have access to any knowledge that isn't public, isn't written down, or simply isn't in its training data.

alberto467 2 hours ago | parent [-]

Isn’t the same true for a human?

IsTom 2 hours ago | parent | next [-]

Besides "secret" knowledge like the know-how at jobs, there are things like unwritten social etiquette (especially as it varies from place to place) or interfacing with the physical world – reading about chopping tomatoes is different from the experience acquired by actually chopping tomatoes.

oceansky 2 hours ago | parent | prev [-]

It isn't. I constantly have access to non-public information, like the lives of my peers and corporate secrets. Is it useful or essential or even desirable for LLM products? Hardly, but it exists.

Edit: as for "not in the training data", yes, humans generally can't know what they can't know.

qsera 2 hours ago | parent | prev [-]

With AI, everyone will be average in no time!

Internet started it, hopefully LLMs will finish it.

throwatdem12311 2 hours ago | parent | next [-]

“Think of how stupid the average person is, and realize half of them are stupider than that.” — George Carlin

Now with LLMs lowering the average through cognitive offloading and skill atrophy, prepare for it to get a whole lot worse.

Cthulhu_ an hour ago | parent | prev [-]

For some that'll be an upgrade.

esafak an hour ago | parent | prev [-]

That is not true. AI can synthesize information, which is the essence of intelligence. And since it knows more than anyone, it is more intelligent too. What it lacks is the ability to create information.