miki123211 3 days ago

You can understand 95+% of current LLM / neural network tech if you know what matrices are (at the "2D array" level, not the deeper lin alg intuition level), know how to multiply them, and have an intuitive sense of why a matrix is a mapping between latent spaces and how it can be treated as a list of vectors. Very basic matrix / tensor calculus comes in useful, but that's not really part of lin alg.
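For instance, here's a minimal numpy sketch of that view (the shapes and names are made up for illustration):

    import numpy as np

    # A weight matrix is just a 2D array: this one maps a
    # 4-dimensional input space to a 3-dimensional output space.
    W = np.random.randn(3, 4)
    x = np.random.randn(4)   # a point in the input space
    y = W @ x                # its image in the output space

    # Equivalently, W is a list of 3 row vectors, and each output
    # coordinate is a dot product of the input with one of them.
    y_alt = np.array([row @ x for row in W])
    assert np.allclose(y, y_alt)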

There are places where things like eigenvectors / eigenvalues or svd come into play, but those are pretty rare and not part of modern architectures (tbh, I still don't really have a good intuition for them).
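(For what it's worth, SVD is easy to poke at numerically; a quick numpy check of the one fact it guarantees, that any matrix factors into rotation-like pieces and per-axis stretches:)

    import numpy as np

    A = np.random.randn(5, 3)
    # U has orthonormal columns and Vt is orthogonal ("pure
    # rotation/reflection" factors); S holds the per-axis
    # stretch amounts (singular values).
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    assert np.allclose(A, U @ np.diag(S) @ Vt)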

devmor 3 days ago | parent | next

I was about to respond with a similar comment. The majority of the underlying systems are the same and can be understood if you know a decent amount of vector math. That last 3-5% can get pretty mystical, though.

Honestly, where stuff gets the most confusing to me is when the authors of the newer generations of AI papers invent new terms for existing concepts, and then new terms for combining two of those concepts, then new terms for combining two of those combined concepts and removing one... etc.

Some of this redefinition is definitely useful, but it turns into word salad very quickly, and I don't often feel like teaching myself a new glossary just to understand a paper whose concepts I probably won't use.

buildbot 3 days ago | parent

This happens so much! It's actually, imo, much more important to be able to let the math go and compare concepts rather than the exact algorithms. Semantic intuition is much more useful than concrete analysis.

Being really good at math does let you figure out whether two techniques are mathematically the same, but that's fairly rare (it happens, though!).

whimsicalism 3 days ago | parent | prev | next

> There are places where things like eigenvectors / eigenvalues or svd come into play, but those are pretty rare and not part of modern architectures (tbh, I still don't really have a good intuition for them)

This stuff is part of modern optimizers. You can often view a lot of them as doing something similar to what is called mirror descent or 'spectral descent.'
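For example (a minimal sketch, not any particular optimizer; real ones like Muon approximate the same thing with Newton-Schulz iterations rather than an exact SVD):

    import numpy as np

    def spectral_step(W, grad, lr=0.01):
        # Steepest descent under the spectral norm: keep only the
        # "directions" of the gradient (U @ Vt) and discard the
        # singular values, so every direction gets equal stretch.
        U, _, Vt = np.linalg.svd(grad, full_matrices=False)
        return W - lr * (U @ Vt)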

tomrod 2 days ago | parent | prev

Eigenvectors/eigenvalues: the directions a matrix stretches without rotating, and the amount of stretch along each.
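A quick numpy check of that picture (toy matrix chosen so the stretch directions are the coordinate axes):

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [0.0, 3.0]])   # stretches x by 2, y by 3
    vals, vecs = np.linalg.eig(A)
    v = vecs[:, 0]
    # A leaves the eigenvector's direction alone and just scales
    # it by the matching eigenvalue.
    assert np.allclose(A @ v, vals[0] * v)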