matt123456789 a day ago

What's different is nearly everything that goes on inside. Human brains aren't a big pile of linear algebra with some softmaxes sprinkled in trained to parrot the Internet. LLMs are.
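For concreteness, the "pile of linear algebra with some softmaxes" is roughly a transformer attention block. A minimal toy sketch in plain NumPy, with made-up dimensions and random weights, just to show the shape of the computation:

    # Toy single-head attention: linear algebra everywhere, one softmax in the middle.
    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    rng = np.random.default_rng(0)
    seq_len, d_model = 4, 8                      # 4 tokens, 8-dim embeddings (toy sizes)
    X = rng.normal(size=(seq_len, d_model))      # token embeddings
    Wq = rng.normal(size=(d_model, d_model))     # learned projections (random here)
    Wk = rng.normal(size=(d_model, d_model))
    Wv = rng.normal(size=(d_model, d_model))

    Q, K, V = X @ Wq, X @ Wk, X @ Wv             # matrix multiplies
    scores = Q @ K.T / np.sqrt(d_model)          # more matrix multiplies
    weights = softmax(scores)                    # the sprinkled-in softmax
    out = weights @ V                            # weighted mix of value vectors
    print(out.shape)                             # (4, 8)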

TuringTourist a day ago | parent | next [-]

I cannot fathom how you have obtained the information to be as sure as you are about this.

mensetmanusman a day ago | parent | next [-]

Where is the imagination plane in linear algebra? People forget that the concept of information cannot be derived from physics/chemistry/etc.

matt123456789 10 hours ago | parent | prev [-]

You can't fathom reading?

csallen a day ago | parent | prev | next [-]

What's the difference between parroting the internet vs parroting all the people in your culture and time period?

amlib a day ago | parent | next [-]

Even with a ginormous amount of data, generative AIs still produce largely inconsistent results for the same or similar tasks. That might be fine for creative purposes, like generating a funny image or sparking ideas for a fictional story, but it has extremely deleterious effects in serious use cases, unless you want to be that idiot writing formal corporate email with an LLM that ends up full of inaccuracies while the original intent gets lost in a soup of buzzwords.

Humans, with their tiny amount of data and "special sauce", can produce much more consistent results, even if the answer they give is objectively wrong. They can also tell you when they don't know about a topic, rather than lying compulsively (unless that person has a compulsive-lying disorder...).

lordnacho 17 hours ago | parent [-]

Isn't this just a matter of time to fix? A slightly smarter architecture might reduce the memory/data needs; we'll see.

matt123456789 10 hours ago | parent | prev [-]

Interesting philosophical question, but entirely beside the point that I am making, because you and I didn't have to do either one before having this discussion.

jml78 a day ago | parent | prev | next [-]

It kinda is.

More and more studies are showing via brain scans that we don’t have free will. Our subconscious makes the decision before our “conscious” brain makes the choice. We think we have free will, but the decision to do something was made before we “make” the choice.

We are just products of what we have experienced. What we have been trained on.

sally_glance a day ago | parent | prev | next [-]

Different inside, yes, but aren't human brains even worse in a way? You may think you have the perfect altruistic leader/expert at any given moment, and the next thing you know they do a 180 because of some random psychosis, illness, corruption, or even just relationships (romantic or nostalgic, for example).

djeastm a day ago | parent | prev | next [-]

We know incredibly little about exactly what our brains are, so I wouldn't be so quick to dismiss it.

quotemstr a day ago | parent | prev | next [-]

> Human brains aren't a big pile of linear algebra with some softmaxes sprinkled in trained to parrot the Internet.

Maybe yours isn't, but mine certainly is. Intelligence is an emergent property of systems that get good at prediction.

matt123456789 10 hours ago | parent [-]

Please tell me you're actually an AI so that I can record this as the pwn of the century.

ekianjo a day ago | parent | prev [-]

If you believe that, then how do you explain that brainwashing actually works?