▲ CamperBob2 2 hours ago
> It's a very very very fancy next token predictor

Yes, and unless you are prepared to rebut the argument with evidence of the supernatural, that's all there is, period. That's all we are. I'm so tired of the thought-terminating "stochastic parrot" argument.
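(For readers unfamiliar with the phrase: "next token predictor" literally means a model that, given a sequence, outputs the most likely next element. A real LLM does this with a learned transformer over a huge vocabulary; the toy bigram counter below is a purely illustrative sketch of the same idea, with all names hypothetical.)

```python
from collections import Counter, defaultdict

def train_bigram(tokens):
    """Count which token follows each token in the training sequence."""
    model = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        model[cur][nxt] += 1
    return model

def predict_next(model, token):
    """Predict the next token: the most frequent observed successor."""
    followers = model.get(token)
    if not followers:
        return None  # token never seen, or never seen with a successor
    return followers.most_common(1)[0][0]

# Tiny illustrative corpus; an LLM's "corpus" is trillions of tokens.
corpus = "the cat sat on the mat".split()
model = train_bigram(corpus)
print(predict_next(model, "on"))  # prints "the"
```

Whether human cognition reduces to something like this loop scaled up is exactly the point under dispute in this thread; the code only pins down what the phrase means.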
▲ godshatter 44 minutes ago | parent | next
Do LLMs even learn? The companies that build them train new models partly on the conversations older models have had with people, but do the models incorporate knowledge into their neural nets as they go along? Can an LLM decide, without prompting or API calls, to text someone, go read about something, or do anything at all other than wait for the next prompt? Do LLMs have any conceptual understanding of anything they output? Do they even have a mechanism for conceptual understanding?

LLMs are incredibly useful and I'm having a lot of fun working with them, but they are a long way from any kind of general intelligence, at least as far as I understand it.
▲ mort96 2 hours ago | parent | prev
I'm not sure why you think you know that the human brain works by predicting the next token. It's not about the supernatural: I believe artificial intelligence is possible, because I believe human intelligence is just a clever arrangement of matter performing computation, but I would never be presumptuous enough to claim to know exactly how that mechanism works. Human intelligence might be what's essentially a fancy next token predictor, or it might work in some completely different way; I don't know. Your claim is that human intelligence *is* a next token predictor, so the burden of proof is on you.