griffzhowl 2 days ago
There's no single best way to understand any of this, but the action of a matrix on the standard basis vectors is a totally reasonable place to start because of its simplicity; the action on any other vector can then be built out of that, since every vector is a linear combination of the basis vectors.
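A minimal sketch of that idea, assuming NumPy and an arbitrary example matrix (both are illustrative choices, not from the comment): the columns of A are exactly its images of the standard basis vectors, and A acting on any v is the corresponding linear combination of those columns.

import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])      # arbitrary example matrix
e = np.eye(2)                   # standard basis e_0, e_1 as columns

# Applying A to a basis vector just reads off a column of A.
assert np.allclose(A @ e[:, 0], A[:, 0])
assert np.allclose(A @ e[:, 1], A[:, 1])

# Any v is a linear combination of the basis vectors, so A @ v is the
# same linear combination of the columns A @ e_i.
v = np.array([5.0, -2.0])
assert np.allclose(A @ v, v[0] * (A @ e[:, 0]) + v[1] * (A @ e[:, 1]))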
nh23423fefe 12 hours ago | parent
I don't agree, because this seems circular. You can't even define a matrix as something that acts on vectors meaningfully until you have some machinery. You start with a set S and make it a vector space V over a field K. Then, by definition, linear combinations are closed in V (and since it isn't an algebra, nonlinear combinations aren't even defined). From that you can define spanning sets and linear independence to get bases, and from bases you can define coordinate vectors in K^n, isomorphic to V.

Then, given a linear function f : V -> W, by definition f(v) = f(v^i b_i) = v^i f(b_i). Only at this point can you even define a matrix meaningfully, as the tuple of coordinate vectors that are the images of the basis vectors. Then you still need to prove that function application of a linear map to a vector is the same as the new operation of multiplying a matrix by a coordinate vector. And to prove the multiplication rule (which is inherently coordinate-based) you are going to make the same argument I made in the sibling comment. But I could prove the rule directly by substitution, using only systems of linear equations as the starting point.
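A sketch of that construction, assuming NumPy and a hypothetical linear map f on R^3 chosen purely for illustration: the matrix is assembled column by column from the images f(b_i) of the standard basis, after which one can check that matrix-times-coordinate-vector agrees with function application, and that matrix multiplication agrees with composition.

import numpy as np

# A hypothetical linear map f : R^3 -> R^2, written without any matrix.
def f(v):
    x, y, z = v
    return np.array([2*x + y, y - 3*z])

basis = np.eye(3)   # rows are b_0, b_1, b_2

# The matrix of f: the tuple of coordinate vectors f(b_i), stacked as columns.
F = np.column_stack([f(b) for b in basis])

# Function application agrees with matrix-coordinate-vector multiplication,
# because f(v) = f(v^i b_i) = v^i f(b_i).
v = np.array([1.0, 4.0, -2.0])
assert np.allclose(f(v), F @ v)

# Composition of linear maps corresponds to multiplication of their matrices.
def g(w):           # g : R^2 -> R^2, also hypothetical
    a, b = w
    return np.array([a + b, 2*a])

G = np.column_stack([g(e) for e in np.eye(2)])
assert np.allclose(g(f(v)), (G @ F) @ v)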