coldtea | 2 days ago
> Neural networks were inspired by the brain, but transformers?

It is totally plausible, but do we really think just in words? LLMs may be trained on words, but under the hood transformers are not just for words: they operate on high-dimensional structured sequences. To make an analogy, transformers are not working on words themselves but on sequences of vectors, where words just happen to be a handy training set we use. And we, too, might not think in words, but I'd bet we do think using multi-dimensional sequences/vectors.
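To make the point concrete, here is a minimal sketch (all names and numbers are illustrative, not from any real model): a transformer never sees the words at all, only the rows of an embedding matrix that the token ids select.

```python
import numpy as np

# Toy vocabulary and embedding table. Words only enter the model
# through this lookup; everything downstream sees just vectors.
vocab = {"the": 0, "cat": 1, "sat": 2}
d_model = 8  # embedding dimension (tiny, for illustration)

rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), d_model))

sentence = ["the", "cat", "sat"]
token_ids = [vocab[w] for w in sentence]

# This (seq_len, d_model) array is the transformer's actual input.
# It could equally come from image patches, audio frames, etc.
x = embedding[token_ids]
print(x.shape)  # (3, 8)
```

The same attention machinery runs unchanged whether those vectors came from text tokens, image patches, or audio frames, which is the sense in which words are just one handy training set.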