jcranmer 2 days ago
If I write a vector v = [1, 3, 2], what I am actually saying is that v is equal to 1 * e₁ + 3 * e₂ + 2 * e₃ for three vectors e₁, e₂, e₃ that I have decided on ahead of time and that form an orthonormal basis of the corresponding vector space. If I write a matrix, what I am doing is describing a transformation of one vector space into another, by describing how the basis vectors of the first vector space are represented as linear combinations of the basis vectors of the second vector space. Of course, the transformed vectors may not themselves be a basis of the latter vector space.

> The natural motivation of matrices is as representing systems of equations.

That viewpoint is useful for only a few things about matrices, primarily Gaussian elimination and related topics. Matrix multiplication--which is what the original poster was talking about, after all--is something that doesn't make sense if you're only looking at a matrix as a system of equations; you have to understand a matrix as a linear transformation for it to make sense, and that generally means you have to start talking about vector spaces.
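A minimal sketch of both points in Python/numpy (the 2x3 matrix A below is an illustrative choice of mine, not anything from the thread): the coordinates of v are the weights on the basis vectors, and the columns of a matrix record where those basis vectors land.

    import numpy as np

    # Standard orthonormal basis of R^3.
    e1, e2, e3 = np.eye(3)

    # Writing v = [1, 3, 2] means v = 1*e1 + 3*e2 + 2*e3.
    v = np.array([1, 3, 2])
    assert np.allclose(v, 1*e1 + 3*e2 + 2*e3)

    # An illustrative 2x3 matrix: a linear map from R^3 to R^2.
    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])

    # Column k of A is the image of the k-th basis vector of R^3,
    # expressed in the basis of R^2.
    for k, e in enumerate((e1, e2, e3)):
        assert np.allclose(A @ e, A[:, k])

    # Applying A to v is the same linear combination of those images.
    assert np.allclose(A @ v, 1*A[:, 0] + 3*A[:, 1] + 2*A[:, 2])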
nh23423fefe 2 days ago
"Doesn't make sense" is too strong, though. If you have a system Ax = y and a system By = z, then there exists a system (BA)x = z. This system BA is naturally seen as the composition of the two systems of equations, and the multiplication rule expresses how to construct the new system's coefficients relating x to z: the i-th equation of the composed system has coefficients that are the evaluations of the i-th equation of B over the coefficients of A. Concretely, C_ik = B_ij A_jk (summing over j): the coefficients express the dot product rule directly.
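A quick numerical check of this, again in Python/numpy (A, B, and x are made-up illustrative values): substituting Ax = y into By = z gives the system (BA)x = z, and each coefficient C_ik is row i of B dotted with column k of A.

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.integers(-3, 4, size=(3, 4))   # system Ax = y, with x in R^4, y in R^3
    B = rng.integers(-3, 4, size=(2, 3))   # system By = z, with z in R^2
    x = rng.integers(-3, 4, size=4)

    y = A @ x
    z = B @ y

    # The composed system (BA)x = z has coefficients C_ik = sum_j B_ij A_jk.
    C = B @ A
    assert np.allclose(C @ x, z)

    # Each entry of C is a dot product: row i of B with column k of A.
    C_manual = np.array([[np.dot(B[i, :], A[:, k]) for k in range(A.shape[1])]
                         for i in range(B.shape[0])])
    assert np.array_equal(C, C_manual)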