ffwd | 3 days ago
I think something that's missing from AI is the ability humans have to combine and think about ANY sequence of patterns for as long as we want. A simple example: say I think about the sequence "banana - car - dog - house". In my mind I can replace car with tree, then replace tree with rainbow, then replace rainbow with something else, and so on. I can sit and think about random nonsense for as long as I want and create these endless sequences of thoughts. Now, when we're reasoning about a practical problem or whatever, maybe we are doing pattern recognition via probability and so on, and for a lot of things pattern recognition alone works OK, for AI as well. But I'm not sure that pattern recognition and probability work for creating novel, interesting ideas all of the time. Humans can generate these endless sequences and stumble upon ideas that are good, whereas an AI can only see the patterns that are in its data. If it could create a pattern that is not in its data and then recognize that pattern as novel or interesting in some way, it would still lack the flexibility of humans I think, but it would be interesting nevertheless.
nrclark | 3 days ago | parent
One possible counter-argument: can you say for sure how your brain is creating those replacement words? When you replace tree with rainbow, does rainbow come to mind because of an unconscious neural mapping between both words and "forest"? It's entirely possible that our brains are complex pattern matchers, not all that different from an LLM.