smccabe0 | 7 hours ago
There's some empirical backing to this if you consider what LLMs are doing as part of the same regime: an LLM takes a token stream and inflates it into the N-dimensional space of its embedding. We take a string of words and apply it to our model of the world, our understanding, and our memories. I've had a lot of success understanding the math through that lens.
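A minimal sketch of the "inflation" step being described, using toy sizes (real LLMs have vocabularies of ~50k–100k tokens and embedding dimensions in the thousands; the table here is random, whereas a trained model's table is learned):

```python
import random

VOCAB_SIZE = 1000  # hypothetical toy vocabulary size
EMBED_DIM = 8      # hypothetical toy embedding dimension

random.seed(0)
# The embedding table: one N-dimensional vector per token id.
# In a trained model these vectors are learned, not random.
embedding_table = [[random.gauss(0, 1) for _ in range(EMBED_DIM)]
                   for _ in range(VOCAB_SIZE)]

def embed(token_ids):
    """Inflate a flat token stream into a list of N-dimensional vectors."""
    return [embedding_table[t] for t in token_ids]

tokens = [17, 42, 7]   # a toy token stream
vectors = embed(tokens)
print(len(vectors), len(vectors[0]))  # 3 vectors, each 8-dimensional
```

Each token id is just an index; the lookup is what turns the flat stream into points in a high-dimensional space where the model can do geometry on meaning.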
DanielVZ | 3 hours ago
Interesting. Then tokens would be the best encoding of thought we have so far for communication.