visarga 5 days ago

> LLMs reduce a very high-dimensional vector space into a very low-dimensional vector space.

What do you mean? The embedding size is held constant from the first layer to the last: the embedding lookup, the N transformer layers, and the final softmax all operate at the same dimension.
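To illustrate (a minimal PyTorch sketch with made-up, roughly GPT-2-sized numbers, not any particular model's code): the width d_model never changes between the embedding lookup and the logits that feed the softmax.

    import torch
    import torch.nn as nn

    d_model = 768        # hidden size, constant through the whole stack
    vocab_size = 50257
    n_layers = 12

    embed = nn.Embedding(vocab_size, d_model)                  # token id -> d_model vector
    block = nn.TransformerEncoderLayer(d_model, nhead=12, batch_first=True)
    stack = nn.TransformerEncoder(block, num_layers=n_layers)  # N layers, all width d_model
    lm_head = nn.Linear(d_model, vocab_size)                   # d_model -> vocab logits

    tokens = torch.randint(0, vocab_size, (1, 16))             # (batch, seq)
    x = embed(tokens)                                          # (1, 16, d_model)
    x = stack(x)                                               # still (1, 16, d_model)
    logits = lm_head(x)                                        # (1, 16, vocab_size)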

Maybe you mean that LoRA is "reducing a high-dimensional vector space into a lower-dimensional vector space"?
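If that's what you mean, here is a sketch of the LoRA idea (my own toy code, not the PEFT library's API): the reduction lives inside the rank-r bottleneck of the weight *update*, while the layer's input and output dimensions are untouched.

    import torch
    import torch.nn as nn

    class LoRALinear(nn.Module):
        # frozen base weight W plus a trainable low-rank update B @ A, rank r << d
        def __init__(self, d_in, d_out, r=8):
            super().__init__()
            self.base = nn.Linear(d_in, d_out, bias=False)
            self.base.weight.requires_grad = False              # pretrained weights stay frozen
            self.A = nn.Parameter(torch.randn(r, d_in) * 0.01)  # projects d_in down to r
            self.B = nn.Parameter(torch.zeros(d_out, r))        # projects r back up to d_out

        def forward(self, x):
            # input and output stay d_in/d_out; only the update passes through r dims
            return self.base(x) + x @ self.A.T @ self.B.T

    layer = LoRALinear(768, 768, r=8)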

k__ 5 days ago

I mean LLMs reduce the "vector space" that describes reality into a vector space with fewer dimensions (e.g., 300 in the article I was replying to).
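Roughly what I have in mind, as a toy sketch (sizes made up except the 300): already at the embedding step, a huge discrete description space is collapsed into 300 dense dimensions, and everything downstream has to make do with that.

    import torch
    import torch.nn as nn

    vocab_size = 50000   # each token as a point in a ~50k-dim one-hot space
    embed_dim = 300      # the 300 dimensions from the article upthread

    embed = nn.Embedding(vocab_size, embed_dim)   # 50000-dim one-hot -> dense 300-dim vector
    vec = embed(torch.tensor([42]))               # shape (1, 300)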