dapper_bison17 2 days ago
> This "little book" seems to take a fairly standard approach, defining all the boring stuff and leading to Gaussian elimination. The other approach I've seen is to lead into it by talking about multilinear functions and then deriving the notion of bases and matrices at the end. Or to start from an application like rotation or Markov chains. Which books or "non-standard" resources would you recommend, then, that do a better job?
andrewla 2 days ago | parent
I have yet to encounter an approach that is not boring. You just have to power through it, and this approach seems as good as any. Once you get to eigenvalues (in my opinion) things start to pick up, and you see that linear spaces are actually interesting.

This approach sort of betrays itself when the very first section, about scalars, has this line:

> Vectors are often written vertically in column form, which emphasizes their role in matrix multiplication:

This is a big "what?" moment, because at that point we don't know why we should care about anything in that sentence. Just call it a convention; later on we can see its utility.
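For what it's worth, the utility being hinted at (which the book could just show later) is that with vectors written as columns, the product Ax reads off as a linear combination of the columns of A. A minimal 2x2 sketch:

```latex
A\mathbf{x}
= \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}
  \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
= x_1 \begin{pmatrix} a_{11} \\ a_{21} \end{pmatrix}
+ x_2 \begin{pmatrix} a_{12} \\ a_{22} \end{pmatrix}
```

Once you see that, the column convention stops feeling arbitrary, but there's no way to appreciate it in a section on scalars.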