The Little Book of Linear Algebra (github.com)
444 points by scapbi a day ago | 115 comments
tamnd 5 minutes ago | parent | next [-]

[Author here] Linear Algebra is the first book in a series I'm writing on math, computer science, and programming. You can find the index for the upcoming books here: https://github.com/the-litte-book-of/everything. If the style clicks with you or you're curious about certain topics, I’d love to hear your thoughts!

andrewla a day ago | parent | prev | next [-]

It's crazy that Linear Algebra is one of the deepest and most interesting areas of mathematics, with applications in almost every field of mathematics itself, plus practical applications in almost every quantitative field that uses math.

But it is SOOO boring to learn the basic mechanics. There's almost no way to sugar coat it either; you have to learn the basics of vectors and scalars and dot products and matrices and Gaussian elimination, all the while bored out of your skull, until you have the tools to really start to approach the interesting areas.

Even the "why does matrix multiplication look that way" is incredibly deep but practically impossible to motivate from other considerations. You just start with "well that's the way it is" and grind away until one day when you're looking at a chain of linear transformations you realize that everything clicks.

This "little book" seems to take a fairly standard approach, defining all the boring stuff and leading to Gaussian elimination. The other approach I've seen is to try to lead into it by talking about multi-linear functions and then deriving the notion of bases and matrices at the end. Or trying to start from an application like rotation or Markov chains.

It's funny because it's just a pedagogical nightmare to get students to care about any of this until one day two years later it all just makes sense.

srean a day ago | parent | next [-]

> Even the "why does matrix multiplication look that way" is incredibly deep but practically impossible to motivate from other considerations. You just start with "well that's the way it is" and grind away

In my experience it need not be like that at all.

One can start by defining and demonstrating linear transformations. Perhaps from graphics -- translation, rotation, reflection etc. Show the students that these follow the definition of a linear transformation. That rotating a sum is the same as summing the rotated vectors.

[One may also mention that all differentiable functions (from vector to vector) are locally linear.]

Then you define adding two linear transformations using vector addition. Next you can define scaling a linear transformation. The point being that these combinations can be expressed as linear transformations themselves. No need to represent the vectors as elements of R^d; geometric arrows and the parallelogram rule would suffice.

Finally, one demonstrates composition of linear transformations and the fact that the result itself is a linear transformation.

The beautiful reveal is that this addition and composition of linear transformations behave almost the same as addition and multiplication of real numbers.

The addition associates and commutes. The multiplication associates but doesn't necessarily commute. Most strikingly, the multiplication distributes over the addition. It's almost like the algebra of real numbers!

Now, when you impose a coordinate system or choose a basis, the students can discover the matrix multiplication rule for themselves over a couple of days of playing with it -- Look, rather than maintaining this long list of linear transformations, I can store it as a single linear transformation in the chosen basis.
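
For concreteness, here is a rough sketch in C of that last step (the matrices and the vector are just illustrative): apply a scaling and then a shear one at a time, then collapse the chain into a single matrix using the usual multiplication rule and apply that once. Both paths land on the same vector.

  #include <stdio.h>

  /* out = M v for a 2x2 matrix M. */
  static void apply(double m[2][2], double v[2], double out[2]) {
      out[0] = m[0][0] * v[0] + m[0][1] * v[1];
      out[1] = m[1][0] * v[0] + m[1][1] * v[1];
  }

  /* c = a b, i.e. the single matrix for "apply b first, then a". */
  static void compose(double a[2][2], double b[2][2], double c[2][2]) {
      for (int i = 0; i < 2; i++)
          for (int k = 0; k < 2; k++) {
              c[i][k] = 0.0;
              for (int j = 0; j < 2; j++)
                  c[i][k] += a[i][j] * b[j][k];
          }
  }

  int main(void) {
      double shear[2][2] = {{1, 1}, {0, 1}};  /* a shear */
      double scale[2][2] = {{2, 0}, {0, 3}};  /* a non-uniform scaling */
      double v[2] = {1, 2};

      /* Keep the "long list" of transformations and apply them one by one... */
      double tmp[2], one_by_one[2];
      apply(scale, v, tmp);
      apply(shear, tmp, one_by_one);

      /* ...or collapse the list into a single matrix and apply it once. */
      double both[2][2], collapsed[2];
      compose(shear, scale, both);
      apply(both, v, collapsed);

      printf("one by one:    (%g, %g)\n", one_by_one[0], one_by_one[1]);
      printf("single matrix: (%g, %g)\n", collapsed[0], collapsed[1]);
      return 0;
  }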

andrewla a day ago | parent | next [-]

> Perhaps from graphics -- translation, rotation, reflection

Maybe ... but the fact that you included translation in the list of linear operations seems like a big red flag. Translation feels very linear but it is emphatically not [1]. This is not intended to be a personal jab; just that the intuitions of linear algebra are not easy to internalize.

Adding linear transformations is similarly scary territory. You can multiply rotations to your heart's content but adding two rotations gives you a pretty funky object that does not have any obvious intuition in graphics.

[1] I wouldn't jump into projective or affine spaces until you have the linear algebra tools to deal with them in a sane way, so this strikes me as a bit scary to approach it this way.

srean 21 hours ago | parent | next [-]

Mea culpa about translation.

For a moment I was thinking in homogeneous coordinates - that's not the right thing to do in the introductory phase.

Thanks for catching the error and making an important point. I am letting my original comment stand unedited so that your point stands.

About rotations though, one need not let the cat out of the bag and explain what addition of rotation is *.

One simply defines addition of two linear operators as the addition of the vectors that each would have individually produced. This can be demonstrated geometrically with arrows, without fixing coordinates.

* In 2D it's a scaled rotation.

dreamcompiler 11 hours ago | parent | prev | next [-]

To me the fact that translation in n dimensions is nonlinear but it becomes linear if you embed your system in n+1 dimensions is one of the coolest results from linear algebra, and it's why you need 4x4 matrices to express the full set of transformations possible in 3-space.
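
A minimal sketch of that embedding in the 2D case -- a 3x3 matrix acting on (x, y, 1); the 4x4 version for 3-space is the same idea with one more row and column. Values are illustrative.

  #include <stdio.h>

  int main(void) {
      /* Translation by (5, -2), written as a 3x3 matrix acting on (x, y, 1). */
      double T[3][3] = {
          {1, 0,  5},
          {0, 1, -2},
          {0, 0,  1},
      };
      double p[3] = {1, 3, 1};  /* the 2D point (1, 3), embedded as (1, 3, 1) */
      double q[3] = {0, 0, 0};

      for (int i = 0; i < 3; i++)
          for (int j = 0; j < 3; j++)
              q[i] += T[i][j] * p[j];

      /* Prints (6, 1, 1): the point has been shifted by an ordinary matrix product. */
      printf("(%g, %g, %g)\n", q[0], q[1], q[2]);
      return 0;
  }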

ferfumarma 3 hours ago | parent | prev [-]

Can you elaborate on your point that translation is not linear? The OP agrees with you, so clearly your point is correct, but I personally just don't understand it. Isn't it true that translation is linear within the coordinate space of your model, even if the final distance traveled within a projected camera view is not?

edit to add: (I think your point relates only to the projection system, and not a pure, unprojected model; I just want to make sure I understand because it seems like an important point)

srean 3 hours ago | parent [-]

All linear operators map the origin to the origin. But translation applied to the origin will shift it. So translation cannot be linear.

Let's take another approach.

Take a point p that's the sum of vectors a and b, that is

p = a + b.

Now, if translation were a linear transformation, then translating p (say along the x-axis by 1 unit) would be equivalent to applying the same translation to a and b separately and then summing them. But the latter ends up translating by twice the amount. Or in other words

p + t ≠ (a + t) + (b + t) = p + 2t.

So translation is not a linear operator in this vector space.
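
A tiny numeric check of the same argument (the values are arbitrary):

  #include <stdio.h>

  int main(void) {
      /* p = a + b, and a translation t of 1 unit along the x-axis. */
      double a[2] = {1, 0}, b[2] = {0, 1}, t[2] = {1, 0};
      double p[2] = {a[0] + b[0], a[1] + b[1]};

      /* Translate p directly... */
      double direct[2] = {p[0] + t[0], p[1] + t[1]};
      /* ...versus translating a and b separately and then summing. */
      double summed[2] = {(a[0] + t[0]) + (b[0] + t[0]),
                          (a[1] + t[1]) + (b[1] + t[1])};

      printf("p + t         = (%g, %g)\n", direct[0], direct[1]);  /* (2, 1) */
      printf("(a+t) + (b+t) = (%g, %g)\n", summed[0], summed[1]);  /* (3, 1): t got added twice */
      return 0;
  }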

cosmic_cheese a day ago | parent | prev | next [-]

If anybody is aware of materials that teach linear algebra via graphics as suggested here, I would be interested to hear about them. As someone who learns best through practical application, I've found maths to be by far one of my greatest weak points, despite having written software for upwards of a decade. It’s limiting in some scenarios and pure imposter syndrome fuel.

viewtransform 17 hours ago | parent | next [-]

3Blue1Brown [Essence of linear algebra](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...)

itchingsphynx 15 hours ago | parent [-]

This series by Grant was very useful to review and learn and see the implications. Highly recommended.

te 21 hours ago | parent | prev [-]

codingthematrix.com

imtringued 5 hours ago | parent | prev [-]

>The beautiful reveal is that this addition and composition of linear transformations behave almost the same as addition and multiplication of real numbers.

This is only beautiful if you already understand monoids, magmas and abelian half groups (semigroups) and how they form groups. Also, we do not talk of linear transformations, we talk of group homomorphisms.

I don't know about anyone else, but I was taught linear algebra this way in the first semester and it felt like stumbling in a dark room and then having the lights turned on in the last week as if that was going to be payback for all the toe stubbing.

srean 3 hours ago | parent [-]

It can be beautiful with less.

All that needs to be demonstrated is that for real numbers + associates and commutes. That * associates and commutes. And most satisfyingly, these two operations interact through the distributive property.

Of course, it's more revealing and interesting if one has some exposure to groups and fields.

Do people encounter linear algebra in their course work before that?

For us it came after coordinate/analytical geometry where we had encountered parallelogram law. So while doing LA we had some vague awareness that there's a connection. This connection solidified later.

We also had an alternative curriculum where matrices were taught in 9th grade as a set of rules without any motivation whatsoever. "This is the rule for adding, this one's for multiplication, see you at the test"

tptacek a day ago | parent | prev | next [-]

I didn't think any part of linear algebra was boring. I was hooked from the moment I saw Ax=b => x = b/A. Gaussian elimination is a blast, like an actually-productive Sudoku puzzle, and once you have it down you can blaze through the first 2/3rds of an undergrad linear algebra course. I don't consciously try to gain automaticity with math subjects, but matrix-column multiplication I got pretty quickly and now I just have it.

I learned from Strang, for what it's worth, which is basically LU, spaces, QR, then spectral.

I am really bad at math, for what it's worth; this is just the one advanced math subject that intuitively clicked for me.

dpflan a day ago | parent | next [-]

MIT OCW is an amazing resource and a goldmine -- anyone can learn from Strang.

He also created a course on using Linear Algebra for machine learning:

> Linear algebra concepts are key for understanding and creating machine learning algorithms, especially as applied to deep learning and neural networks. This course reviews linear algebra with applications to probability and statistics and optimization–and above all a full explanation of deep learning.

- MIT OCW Course: Matrix Methods in Data Analysis, Signal Processing, and Machine Learning (https://ocw.mit.edu/courses/18-065-matrix-methods-in-data-an...)

- The text book website: Linear Algebra and Learning from Data (2019) https://math.mit.edu/~gs/learningfromdata/

- The Classic Linear Algebra Course: https://ocw.mit.edu/courses/18-06-linear-algebra-spring-2010...

tenacious_tuna 21 hours ago | parent | prev | next [-]

> I didn't think any part of linear algebra was boring.

My formal linear algebra course was boring as hell, to me. The ~4 lectures my security prof dedicated to explaining just enough to do some RSA was absolutely incredible. I would pay lots of money for a hands-on what-linalg-is-useful-for course with practical examples like that.

tptacek 21 hours ago | parent | next [-]

Try this. :)

https://kel.bz/post/lll/

(If you work through the prerequisites and use "understanding this post" as a sort of roadmap of what you actually need to know, this gets you about 2/3rds through undergraduate linear algebra, and you can skim through nullspaces --- all in the service of learning a generally useful tool for attacking cryptosystems).

srean 20 hours ago | parent [-]

Thanks for the notes. This is marvellous. I do not work on, or have interest in cryptography algorithms, but this is such an interesting read.

tptacek 20 hours ago | parent [-]

Kelby Ludwig is such a talented explainer, it upsets me.

sureglymop 17 hours ago | parent | prev [-]

There's a book called ILA (Interactive Linear Algebra) that I found really good: https://textbooks.math.gatech.edu/ila/

andrewla a day ago | parent | prev | next [-]

I haven't looked at Strang's approach.

The "x = b / A" is a bit of a gut-punch on first look because my mind immediately tells me all the ways that that does not work. It makes a some sense once I take a second to think about it, and I can see why it would make you want to jump in a little deeper, but matrices being non-commutative makes me cringe at the idea of a division operator which does not very very clearly spell out where it appears in the chain.

Ax = b is all well and good, but AxA^-1 = bA^-1 is not meaningful; the application/composition order is very important.

jdshaffer 18 hours ago | parent | prev [-]

Didn't know about Gaussian elimination until today. Really cool, and really useful! Thanks for sharing!

hiAndrewQuinn 11 hours ago | parent | prev | next [-]

Linear algebra was such a tedious thing to learn I skipped over it to abstract algebra and doubled back once I had some kind of minimally interesting framework to work with it against. Normally I think this is a foolish way to do things, but sometimes things are just so dull you have to take the hard route to power through at all.

Sharlin a day ago | parent | prev | next [-]

For anyone who’s interested in graphics programming and/or is a visual learner/thinker, there’s an incredibly motivating and rewarding way to learn the basics of linear algebra. (And affine algebra, which tends to be handwaved away, unfortunately. I’m writing a MSc thesis about this and related topics.)

andrewla a day ago | parent | next [-]

To a degree I think this is true, but it requires (at least in my experience) that you have an intrinsic grasp of trigonometry for it to make sense. If you have some complex function analysis and e^(iθ) then you can skirt the problem for a bit, but if you're like me and have to break out soh-cah-toa whenever you break down a triangle then this method ends up being pretty tedious too.

Sharlin a day ago | parent [-]

I’m not sure what you mean. Beyond rotation matrices, there’s really only trig involved in graphics if you actively want it.

andrewla a day ago | parent [-]

Maybe I was making unwarranted assumptions about the nature of your way to learn linear algebra. The approaches that I've seen invariably have to produce a sample matrix, and rotation is really the best example. The rotation matrix is going to have sines and cosines, and understanding what that means is not trivial; and even now if you asked me to write a rotation matrix I would have to work it out from scratch. Easy enough to do mechanically but I have no intuitions here even now.

Sharlin 21 hours ago | parent | next [-]

Rotation matrices are somewhat mysterious to the uninitiated, but so is matrix multiplication until it "clicks". Whether it ever clicks is a function of the quality of the learning resource (I certainly do not recommend trying to learn linalg via 3D graphics by just dabbling without a good graphics-oriented textbook or tutorial – that usually doesn’t end well).

Anyway, I believe that it's perfectly possible to explain rotation matrices so that it "clicks" with a high probability, as long as you understand the basic fact that (cos a, sin a) is the point that you get when you rotate the point (1, 0) by angle a counter-clockwise about the origin (that's basically their definition!) Involving triangles at all is fully optional.
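
A rough sketch of that idea in code (illustrative values): build the matrix whose columns are the rotated images of (1, 0) and (0, 1), then apply it to some point.

  #include <stdio.h>
  #include <math.h>

  int main(void) {
      const double pi = acos(-1.0);
      double ang = pi / 2;  /* a quarter turn counter-clockwise */

      /* Columns = where (1, 0) and (0, 1) land after the rotation. */
      double R[2][2] = {
          {cos(ang), -sin(ang)},
          {sin(ang),  cos(ang)},
      };

      double p[2] = {3, 1};
      double q[2] = {
          R[0][0] * p[0] + R[0][1] * p[1],
          R[1][0] * p[0] + R[1][1] * p[1],
      };

      /* (3, 1) rotated a quarter turn is (-1, 3), up to rounding. */
      printf("(%f, %f)\n", q[0], q[1]);
      return 0;
  }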

srean 21 hours ago | parent | prev [-]

In 2D there's an alternative. One can rotate purely synthetically, by that I mean with compass and straight edge. This avoids getting into transcendentals.

Of course I am not suggesting building synthetic graphics engines :) but the synthetic approach is sufficient to show that the operation is linear.

greymalik 21 hours ago | parent | prev | next [-]

> there’s an incredibly motivating and rewarding way to learn the basics of linear algebra

What is it?

tptacek 20 hours ago | parent [-]

They mean graphics programming; learning graphics programming will give you intuitions for a lot of linear algebra stuff.

bmacho 21 hours ago | parent | prev | next [-]

There is no such thing as affine algebra: https://en.wikipedia.org/wiki/Affine_algebra

Sharlin 21 hours ago | parent [-]

There are affine spaces, and there is an algebra of the elements of affine spaces. That is, rules that describe how the elements can be manipulated. There are affine transforms, affine combinations, affine bases, and so on, all of them analogous to the corresponding concepts in linear algebra.

(The term "algebra" can also refer to a particular type of algebraic structure in math, but that’s not what I meant.)

cassepipe a day ago | parent | prev [-]

...

What is this incredible motivating way ? Please do tell

bmacho 21 hours ago | parent | next [-]

Linear algebra has motivations and applications everywhere, since its main defining concepts, 'addition' and 'transformations that preserve sums', are everywhere. So a linear algebra course is a huge pile of disjointed facts. It's not the kind of material that can have a single motivation behind it.

But the good news is that if you are only interested in for example geometry, game theory, systems of linear equations, polynomials, statistics, etc, then you can skip 80% of the content of linear algebra books. You don't have to read them, understand them, memorize them. You'll interact with a tiny part of linear algebra anyway, and you don't have to do that upfront.

Sharlin a day ago | parent | prev [-]

Well, graphics programming itself. Learning while doing, preferably from some good resource written with graphics in mind. 2D is fine for the basics, 3D is more challenging and potentially confusing but also more rewarding.

joshmarlow 20 hours ago | parent | prev | next [-]

The older I get the more convinced I am that "math is not hard; teaching math is hard".

rramadass 13 hours ago | parent [-]

This is far truer than most people realize.

Because there is so much to teach/learn, "Modern Mathematics" syllabi have devolved into giving students merely an exposure to all possible mathematical tools in an abstract manner, disjointedly, with no unifying framework and no motivating examples to explain the need for such mathematics. Most teachers are parrots and have no understanding/insight that they can convey to students, and so the system perpetuates itself in a downward spiral.

The way to properly teach/learn mathematics is to follow V. I. Arnold's advice, i.e. On Teaching Mathematics - https://dsweb.siam.org/The-Magazine/All-Issues/vi-arnold-on-... Ground all teaching in actual physical phenomena (in the sense of existence with a purpose) and then show the invention/derivation of abstract mathematics to explain such phenomena. Everything is "Applied Mathematics"; there is no "Pure Mathematics", which is just another name for "Abstract Mathematics", i.e. generalizing methods of application to different and larger classes of problems.

jonahx 10 hours ago | parent | prev | next [-]

> Even the "why does matrix multiplication look that way" is incredibly deep but practically impossible to motivate from other considerations.

Short, simple answer to that question by Michael Penn: https://www.youtube.com/watch?v=cc1ivDlZ71U

Another interesting treatment by Math the World: https://www.youtube.com/watch?v=1_2WXH4ar5Q&t=4s

There's no impenetrable mystery here. Probably just bad teaching you experienced.

jcranmer 21 hours ago | parent | prev | next [-]

> Even the "why does matrix multiplication look that way" is incredibly deep but practically impossible to motivate from other considerations.

It's only difficult if you are wedded to a description of matrices and vectors as seas of numbers that you grind your way through without trying to instill a fuller understanding of what those numbers actually mean. The definition makes a lot more sense when you see a matrix as a description of how to convert one set of basis vectors to another set of basis vectors, and for that, you first need to understand how vectors are described in terms of basis vectors.

nh23423fefe 19 hours ago | parent [-]

I don't agree with this. Matrices don't convert sets of basis vectors to sets of basis vectors. What would you say about singular matrices, for example?

The natural motivation of matrices is as representing systems of equations.

jcranmer 18 hours ago | parent | next [-]

If I write a vector v = [1, 3, 2], what I am actually saying is that v is equal to 1 * e₁ + 3 * e₂ + 2 * e₃ for three vectors I have decided on ahead of time that form an orthonormal basis of the corresponding vector space.

If I write a matrix, say, this:

  [[1  2]
   [3  4]
   [5  6]]
What I am doing is describing a transformation of one vector space into another, by describing how the basis vectors of the first vector space are represented as a linear combination of the basis vectors of the second vector space. Of course, the transformed vectors may not necessarily be a basis of the latter vector space.
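
A quick sketch with that same example matrix: feeding it the standard basis vectors simply reads off its columns.

  #include <stdio.h>

  /* Apply the 3x2 matrix m to a 2-vector v. */
  static void apply(double m[3][2], double v[2], double out[3]) {
      for (int i = 0; i < 3; i++)
          out[i] = m[i][0] * v[0] + m[i][1] * v[1];
  }

  int main(void) {
      double m[3][2] = {{1, 2}, {3, 4}, {5, 6}};
      double e1[2] = {1, 0}, e2[2] = {0, 1};
      double img1[3], img2[3];

      apply(m, e1, img1);  /* (1, 3, 5): the first column */
      apply(m, e2, img2);  /* (2, 4, 6): the second column */

      printf("m e1 = (%g, %g, %g)\n", img1[0], img1[1], img1[2]);
      printf("m e2 = (%g, %g, %g)\n", img2[0], img2[1], img2[2]);
      return 0;
  }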

> The natural motivation of matrices is as representing systems of equations.

That is very useful for only a few things about matrices, primarily Gaussian elimination and related topics. Matrix multiplication--which is what the original poster was talking about, after all--is something that doesn't make sense if you're only looking at it as a system of equations; you have to understand a matrix as a linear transformation to have it make sense, and that generally means you have to start talking about vector spaces.

nh23423fefe 12 minutes ago | parent [-]

"Doesn't make sense" is too strong, though.

If you have a system Ax = y and a system By = z, there exists a system (BA)x = z.

This system BA is naturally seen as the composition of both systems of equations.

And the multiplication rule expresses the way to construct the new system's coefficients over x constrained by z.

The i-th equation of C has coefficients given by evaluating the i-th equation of B on the columns of A's coefficients:

C_ik = B_ij A_jk (sum over j implied)
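
A small sketch of that composition (coefficients are made up): substitute one system into the other, or build C = BA directly with the multiplication rule; the results agree.

  #include <stdio.h>

  int main(void) {
      /* Two systems: A x = y and B y = z (2x2, made-up coefficients). */
      double A[2][2] = {{1, 2}, {3, 4}};
      double B[2][2] = {{0, 1}, {1, 1}};
      double x[2] = {1, 1};

      /* Substitute: first y = A x, then z = B y. */
      double y[2] = {A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]};
      double z[2] = {B[0][0]*y[0] + B[0][1]*y[1], B[1][0]*y[0] + B[1][1]*y[1]};

      /* Or build the combined system's coefficients directly: C_ik = sum_j B_ij A_jk. */
      double C[2][2];
      for (int i = 0; i < 2; i++)
          for (int k = 0; k < 2; k++) {
              C[i][k] = 0;
              for (int j = 0; j < 2; j++)
                  C[i][k] += B[i][j] * A[j][k];
          }
      double z2[2] = {C[0][0]*x[0] + C[0][1]*x[1], C[1][0]*x[0] + C[1][1]*x[1]};

      printf("substituted:  (%g, %g)\n", z[0], z[1]);
      printf("via C = B A:  (%g, %g)\n", z2[0], z2[1]);
      return 0;
  }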

griffzhowl 18 hours ago | parent | prev [-]

There's no single best way to understand any of this, but the action of a matrix on the standard basis vectors is a totally reasonable place to start because of its simplicity, and then the action on any vector can be built out of that because they're linear combinations of basis vectors.

ViscountPenguin 14 hours ago | parent | prev | next [-]

I found the university level presentation of "Vector spaces -> Linear functions -> Matrices are isomorphic to linear functions" much more motivating than the rote mechanics I was taught in highschool, but it's hard to see if I would've had that appreciation without being taught the shitty way first.

jameshart 20 hours ago | parent | prev | next [-]

What I find amazing is, given how important linear algebra is to actual practical applications, high school math still goes so deep on calculus at the expense of really covering even basic vectors and matrices.

Where vectors do come up it’s usually only Cartesian vectors for mechanics, and only basic addition, scalar multiplication and component decomposition are talked about - even dot products are likely ignored.

bee_rider 14 hours ago | parent | next [-]

I think it was a brilliant and evil trick by the linear algebra folks.

Start the path at calculus. Naturally, this will lead to differential equations. Trick the engineers into defining everything in terms of differential equations.

The engineers will get really annoyed, because solving differential equations is impossible.

Then, the mathematicians swoop in with the idea of discretizing everything and using linear algebra to step through it instead. Suddenly they can justify all the million-by-million matrices they wanted and everybody thinks they are heroes. Engineers will build the giant vector processing machines that they want.

ViscountPenguin 14 hours ago | parent | prev | next [-]

That's very strange, where I live linear algebra was a significant portion of the highschool maths curriculum.

The actual presentation was terrible, I'll be lucky if I die before having to invert a matrix by hand again, but it was there.

JadeNB 19 hours ago | parent | prev [-]

I think that, to be frank, it's a combination of (1) a curriculum developed before it was clear how ubiquitous linear algebra would become, and (2) the fact that it's a lot easier to come up with a standardized assessment for algorithmic calculus than for linear algebra, precisely because linear algebra is both conceptual and proof-based in a way that has been squeezed out of algorithmic calculus.

(I use algorithmic calculus to describe the high-school subject, and distinguish it from what in American universities is usually called "analysis," where one finally has the chance to make the acquaintance of the conceptual and proof-based aspects squeezed out of algorithmic calculus.)

dapper_bison17 a day ago | parent | prev | next [-]

> This "little book" seems to take a fairly standard approach, defining all the boring stuff and leading to Gaussian elimination. The other approach I've seen is to try to lead into it by talking about multi-linear functions and then deriving the notion of bases and matrices at the end. Or trying to start from an application like rotation or Markov chains.

Which books or “non-standard” resources would you recommend then, that do a better job?

andrewla a day ago | parent [-]

I have yet to encounter an approach that is not boring. You just have to power through it. This approach seems as good as any.

Once you get to eigenvalues (in my opinion) things start to pick up in terms of seeing that linear spaces are actually interesting.

This approach sort of betrays itself when the very first section about scalars has this line:

> Vectors are often written vertically in column form, which emphasizes their role in matrix multiplication:

This is a big "what?" moment because we don't know why we should care about anything in that sentence. Just call it a convention and later on we can see its utility.

_mu 16 hours ago | parent [-]

Maybe we can petition Paul Lockhart to do a book on Linear Algebra, I would definitely buy it.

Chinjut a day ago | parent | prev | next [-]

Why do you say it's practically impossible to motivate matrix multiplication? The motivation is that this represents composition of linear functions, exactly as you follow up by mentioning.

It's a disservice to anyone to tell them "Well, that's the way it is" instead of telling them from the start "Look, these represent linear functions. And look, this is how they compose".

andrewla a day ago | parent | next [-]

Sure, that's a way to approach it. All you have to do is stay interested in "linear functions" long enough to get there. It's totally possible -- I got there, and so did many many many other people (arguably everyone who has applied mathematics to almost any problem has).

But when I was learning linear algebra all I could think was "who cares about linear functions? It's the simplest, dumbest kind of function. In fact, in one dimension it's just multiplication -- that's the only linear function and the class of scalar linear functions is completely specified by the factor that you multiply by". I stuck to it because that was what the course taught, and they wouldn't teach me multidimensional calculus without making me learn this stuff first, but it was months and years later when I suddenly found that linear functions were everywhere and I somehow magically had the tools and the knowledge to do stuff with them.

bananaflag 16 hours ago | parent | next [-]

Yeah, concepts can make a student reject them with passion.

I remember in a differential geometry course, when we reached "curves on surfaces", I thought "what stupidity! what are the odds a curve lies exactly on a surface?"

ndriscoll 17 hours ago | parent | prev | next [-]

Linear functions are the ones that we can actually wrap our heads around (maybe), and the big trick we have to understand nonlinear problems is to use calculus to be able to understand them in terms of linear ones again. Problems that can't be made linear tend to be exceptionally difficult, so basically any topic you want to learn is going to be calculus+linear algebra because everything else is too hard.

The real payoff though is after you do a deep dive and convince yourself there's plenty of theory and all of these interesting examples and then you learn about SVD or spectral theorems and that when you look at things correctly, you see they act independently in each dimension by... just multiplication by a single number. Unclear whether to be overwhelmed or underwhelmed by the revelation. Or perhaps a superposition.

JadeNB 19 hours ago | parent | prev [-]

> But when I was learning linear algebra all I could think was "who cares about linear functions? It's the simplest, dumbest kind of function. In fact, in one dimension it's just multiplication -- that's the only linear function and the class of scalar linear functions is completely specified by the factor that you multiply by".

This seems to make it good motivation for an intellectually curious student—"linear functions are the simplest, dumbest kind of function, and yet they still teach us this new and exotic kind of multiplication." That's not how I learned it (I was the kind of obedient student who was interested in a mathematical definition because I was told that I should be), but I can't imagine that I wouldn't have been intrigued by such a presentation!

cassepipe a day ago | parent | prev [-]

Agree. The fact that it's just linear functions is what made it click for me.

hinkley 9 hours ago | parent | prev | next [-]

There was a brief time when I understood how linear algebra was used to render 3d models but that's all gone now.

I wonder if these days that would be a better starting point.

srean 8 hours ago | parent [-]

That's just the sad truth of the maxim -- use it or lose it. It can be quite disheartening. The good part is that it becomes easier to pick it up next time around. Happened to me so many times.

127 11 hours ago | parent | prev | next [-]

I had the opposite experience when learning linear algebra as I was also doing 3D computer graphics at the time. It was super interesting and fun. I guess you just have to find an application for it.

Sleaker 16 hours ago | parent | prev | next [-]

I went through Khan Academy on linear algebra a long time ago because I wanted to learn how to write rendering logic; I was implementing things as I learned them, with immediate feedback for a lot of the material. Was probably the single most useful thing I learned then.

imtringued 5 hours ago | parent | prev | next [-]

Actually, the calculation of matrix multiplication is extremely easy to understand if you show it done with a row and column vector first. Once your students understand dot products, they will recognize matrix multiplication as the mere pairwise calculation of dot products.
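
A rough sketch of exactly that framing (sizes and values are illustrative): one dot-product routine, and matrix multiplication is just calling it for every (row, column) pair.

  #include <stdio.h>

  #define N 2

  /* Dot product of two length-N vectors. */
  static double dot(double a[N], double b[N]) {
      double s = 0;
      for (int j = 0; j < N; j++)
          s += a[j] * b[j];
      return s;
  }

  int main(void) {
      double A[N][N] = {{1, 2}, {3, 4}};
      double B[N][N] = {{5, 6}, {7, 8}};
      double C[N][N];

      for (int i = 0; i < N; i++)
          for (int k = 0; k < N; k++) {
              /* Pull out column k of B so it can be handed to dot(). */
              double col[N];
              for (int j = 0; j < N; j++)
                  col[j] = B[j][k];
              /* Entry (i, k) is the dot product of row i of A with column k of B. */
              C[i][k] = dot(A[i], col);
          }

      printf("%g %g\n%g %g\n", C[0][0], C[0][1], C[1][0], C[1][1]);  /* 19 22 / 43 50 */
      return 0;
  }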

BenFranklin100 19 hours ago | parent | prev | next [-]

Probability and statistics falls into that category too. It’s one of the more boring undergraduate math courses, but is mind-bogglingly useful in the real world.

(Basic probability / combinatorics is actually pretty cool, but both tend to be glossed over.)

winwang a day ago | parent | prev | next [-]

I hated precalc, but I loved proof-based linear algebra, my first university math course (linalg + multivariate calc).

The simplicity(/beauty) of matrix multiplication still irks me though, in the sense of "wow, seriously? when you work it out, it really looks that simple?"

msgodel a day ago | parent | prev [-]

I don't know, maybe it's because I read the book on the side in highschool when I was supposed to be doing something else but I really loved Linear Algebra. Once I understood what it was I used matrix operations for everything. Vector spaces are such a powerful abstraction.

photon_lines a day ago | parent | prev | next [-]

If anyone is interested in a more visual or intuitive overview, I made a mini-book on it as well a few years ago which you can find here: https://github.com/photonlines/Intuitive-Overview-of-Linear-...

jonahx 11 minutes ago | parent | next [-]

This looks very nice. Thanks for posting.

WillAdams a day ago | parent | prev [-]

Nice pairing the text with 3Blue1Brown's lectures on linear algebra!

https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2x...

phforms a day ago | parent [-]

As an autodidact who never learned this stuff at school/uni, his lectures are what made linear algebra really click for me. I can only recommend them to anyone who wants to get a visual intuition on the fundamentals of LA.

What also helped me as a visual learner was to program/setup tiny experiments in Processing[1] and GeoGebra Classic[2].

- [1] https://processing.org - [2] https://www.geogebra.org/classic

lll-o-lll 17 hours ago | parent [-]

Nitpick: No one is a visual learner, or more correctly everyone is. Multimodal is the way, so good teachers will express the same concepts in several ‘modes’ and that helps develop the intuition.

zkmon 12 hours ago | parent | prev | next [-]

In terms of understanding why something is like that, Linear Algebra belongs to Geometry more than Algebra. Every formula in Linear Algebra which ultimately is justified by "it's just that way" can be better justified by geometry. The algebraic formulas are like animals who lost their natural habitat and were put in a zoo, making people think that these animals evolved in the zoo itself.

To give an example: A simple multiplication of two numbers is better seen as rotating one of the numbers to be perpendicular to the other and then quantifying the area/volume spanned by them. This gives vector dot product.

While geometry might better address "why", algebra gets into the work of "how to do it". Mathematics in old times, like other branches of science, did not encourage "why". Instead, most stuff would say "This is how to do it, Now just do it". Algebra probably evolved to answer "how to do it" - the need to equip the field workers with techniques of calculating numbers, instead of answering their "why" questions. In this sense, Geometry is more fundamental providing the roots of concepts and connecting all equations to the real world of spatial dimensions. Physics adds time to this, addressing the change, involving human memory of the past, perceiving the change.

hinkley 9 hours ago | parent [-]

I remember how betrayed I felt when I got to calculus and realized all of those equations I learned in high school for physics were just introductory calculus. Like why did we have to learn all that the hard way?

ziedaniel1 an hour ago | parent | prev | next [-]

A little bit too concise IMO. E.g. it doesn't really explain where the normal equations come from and it mentions eigenspaces without ever defining them.

eftychis 7 hours ago | parent | prev | next [-]

My recommendation: Linear Algebra Done Right by Axler: https://linear.axler.net/

Starts from linear transformations and builds from there.

bobajeff 2 hours ago | parent | prev | next [-]

I love this. It's a great companion to all the resources online like Khan Academy, 3blue1brown etc. and mathisfun, Wolfram MathWorld and Google's Gemini.

Edit: Also the mathematics stackexchange.

csunoser a day ago | parent | prev | next [-]

At around 7.4 (orthonormal basis) and thereafter, the TeX rendering stops working on the GitHub README preview page.

Instead, it is replaced with a red error box saying: [ Unable to render expression. ]

I wonder if there is an artificial limit on the number of LaTeX expressions that can be rendered per page.

asplake a day ago | parent [-]

I switched to the epub at that point. Still, credit I think to github that the page renders as well as it does.

jonahx 10 hours ago | parent | prev | next [-]

For an unconventional but beautiful presentation of the subject, I highly recommend "Wild Linear Algebra" by the brilliant Norman Wildberger:

https://www.youtube.com/watch?v=yAb12PWrhV0&list=PLBQcPIGljH...

It starts with the axioms of being able to draw one line parallel to another, and a line through a point, and builds up everything from there. No labeled Cartesian axes. Just primitive Euclidean objects in an affine space.

barrenko a day ago | parent | prev | next [-]

Tried to pick a book to get into linear algebra recently; the experience was fairly hellish. First course this, second course that, done right, done wrong... I'd like to go the LADR4e route, but I don't have the proof chops yet...

griffzhowl 17 hours ago | parent | next [-]

I like Serge Lang's books for clarity of explanations. He has an Introduction to Linear Algebra which concisely covers the basics (265 pages in the main text), and grounds the matrix computations in the geometric interpretation.

Be aware that Lang has another book, called just "Linear Algebra", which is more theoretical.

cybrox a day ago | parent | prev | next [-]

I found the book "Linear Algebra" and accompanying lecture recordings by Jim Hefferon very approachable and solid.

It's free, including exercises and an (also free) solutions book.

atrettel 16 hours ago | parent | prev | next [-]

I had the same experience when I first learned linear algebra. I don't have any book recommendations, but I did want to say that for some topics, it is better to learn it by applying it than by using a book. Linear algebra was like that for me, but oddly enough I was able to learn tensor calculus from a book later (after doing a lot of problems).

Thanks to everyone recommending books too!

rramadass 17 hours ago | parent | prev | next [-]

You might want to checkout the book Practical Linear Algebra: A Geometry Toolbox by Dianne Hansford and Gerald Farin (its 1st edition was simply named The Geometry Toolbox: For Graphics and Modeling) to get an intuitive and visual introduction to Linear Algebra.

Pair it with Edgar Goodaire's Linear Algebra: Pure & Applied and you can transition nicely from intuitive geometric to pure mathematical approach. The author's writing style is quite accessible.

Add in Introduction to Applied Linear Algebra: Vectors, Matrices, and Least Squares by Stephen Boyd et al. and you are golden. Free book available at https://web.stanford.edu/~boyd/vmls/

GabriDaFirenze a day ago | parent | prev [-]

I've found the "No bullshit Guide to Linear Algebra" pretty good. Could be worth checking it out. It's the one resource that made things click the most for me.

dkga 19 hours ago | parent | prev | next [-]

By the way, 3Blue1Brown’s videos on linear algebra are nothing short of amazing. And I use linear algebra every day (I’m an economist).

vismit2000 4 hours ago | parent | prev | next [-]

Related: Immersive Linear Algebra - https://immersivemath.com/ila/

rossant 21 hours ago | parent | prev | next [-]

A linear algebra course without graphics? When I learnt it at school almost 25 years ago, the teacher made schematics all the time to explain the visual intuition behind each concept. I was totally confused when he introduced the abstract definition of a vector space with the addition and scalar multiplication. Then he drew some arrows. Then it all suddenly made sense.

jean_lannes 17 hours ago | parent | prev | next [-]

As someone who took a standard undergrad linear algebra course but never really used it in my work, what are some good ways to get acquainted with practical applications of linear algebra?

JBits 15 hours ago | parent | next [-]

A good use of linear algebra that I'm working with at the moment is the use of splines as a basis for real (vector) functions. After obtaining the matrix/vector representations you can solve for the spline coefficients (and then plot them).

Linear transforms (such as rotations and displacements) in GPU graphics.

Fourier series in signal processing.

JPEG compression.

Obtaining the best fit element in a vector space of curves given data or other constraints.

Understanding autodiff in JAX.

The mathematical definition of a tensor helps develop intuition for manipulating arrays/tensors in array libraries.

Transition matrices of a Markov chain.

PageRank.

defrost 16 hours ago | parent | prev [-]

There were some hints upstream: https://news.ycombinator.com/item?id=45107638

Machine learning, LLMs, RSA, etc.

It's generally useful for multivariate statistics. Picture 3D flies (insects) in 3D space, clustering about a narrow slanting plane of light from a window slit: they are points that can be projected onto "the plane of best fit" - nominally the slanting plane of light.

That right there is a geometric picture of fitting a line, a plane, a lower order manifold, to a higher order data set, the errors (distance from plane), etc. and something of what Singular Value Decomposition is about (used for image enhancement, sharpening fuzzy data, etc).

The real test of applications is what kind of work do you see yourself doing? - A quick back read suggests you're currently a CS student, so all unfocused potential for now (perhaps).

ddavis a day ago | parent | prev | next [-]

The organization and formatting of the single .tex file is such that one could almost read the source alone. Really nice. Also, I had no idea that GitHub did such a good job rendering the LaTeX math in markdown, it's imperfect but definitely good.

eliaskickbush 20 hours ago | parent | prev | next [-]

Highly recommend to anyone struggling with linear algebra to check out Linear Algebra Done Right, by Sheldon Axler. Do always keep in mind that some concepts are very verbose, but truly out of necessity. If you're talking about an N by N matrix, you're naturally going to have to distinguish N^2 different elements.

You can go very far without touching matrices, and actually find motivation on this abstract base before learning how it interops with matrices.

akww 27 minutes ago | parent [-]

A little surprising to me that this doesn’t come up more! Excellent text, and look, you can get the 4th edition (2024) for free as a pdf at http://axler.net.

defanor a day ago | parent | prev | next [-]

Always nice to see CC-licensed textbooks. This one looks fairly minimal, not including much of explanation, illustrations, or proofs; I think those are generally useful for the initial study, but this should still work as a cheat sheet, at least.

ivan_ah a day ago | parent | prev | next [-]

Wow very nice. Lots of content in here, with no lengthy explanations but useful point-form intuition.

The .epub has very clean math done in HTML (no images), which is a cool way to do things. I've never seen this before. I wonder what the author used to produce the .epub from the .tex?

ivan_ah a day ago | parent [-]

Update: using Sigil to look inside the .epub, I saw it was produced by `pandoc` and the math is rendered as MathML.

tekknolagi 14 hours ago | parent | prev | next [-]

In case the author didn't notice, the organization name appears to be missing an L - "litte"

michelpp a day ago | parent | prev | next [-]

Beautiful, a great intro to and reference to core concepts. Definitely keeping this one around for mental refresh!

snarfy a day ago | parent [-]

Yep, same here. It is a good refresher.

m3047 a day ago | parent | prev | next [-]

I'd encourage the author to put a pointer to the repo in the actual doc. Maybe I should send a PR...

mugamuga 11 hours ago | parent | prev | next [-]

Is there a version or a similar book that deals with Calculus?

tamnd 10 minutes ago | parent | next [-]

[Author here] We hear you! Here’s a similar book for Calculus: https://github.com/the-little-book-of/calculus

In this book, I cover Functions, Derivatives, Integrals, Multivariable Calculus, and Infinite Processes. In addition, I’ve included appendices with sketch proofs and applications to Physics, Probability and Statistics, and Computer Science.

griffzhowl 4 hours ago | parent | prev [-]

Serge Lang's "Short Calculus" is clear and concise, efficiently covering the basics in ~170 pages instead of hundreds of pages like some books. It then has a few more chapters going into a bit more depth but on topics that are also essential if you want to take things further

anthk 19 hours ago | parent | prev | next [-]

Klong/k looks ideal for getting some practice with linear algebra concepts:

http://t3x.org/klong/index.html

rramadass 5 hours ago | parent [-]

This was one of the reasons for me wanting to study the J programming language. I have a notion AI/ML programming might be better done using array languages.

Some books for studying Mathematics using J are listed here - https://code.jsoftware.com/wiki/Books

russellbeattie 19 hours ago | parent | prev | next [-]

Someone should convert all the examples into C code so it's more intelligible to programmers who are, let's admit, the main audience for something like this.

To the best of my knowledge: Scalars are variables. Vectors are arrays. Matrices are multi-dimensional arrays. Addition and multiplication are iteration with operators. Combinations are concatenation. The rest, like dot products or norms, are just specialized functions.

But it'd be nice to see it all coded up. It wouldn't be as concise, but it'd be readable.
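
A rough sketch of that mapping for the basic pieces (the names are made up and it's nowhere near a full conversion of the book's examples, but it shows the flavor):

  #include <stdio.h>

  #define DIM 3

  /* Vectors are arrays; scalars are plain doubles. */
  static void add(double a[DIM], double b[DIM], double out[DIM]) {
      for (int i = 0; i < DIM; i++)
          out[i] = a[i] + b[i];
  }

  static void scale(double s, double a[DIM], double out[DIM]) {
      for (int i = 0; i < DIM; i++)
          out[i] = s * a[i];
  }

  /* Dot product: one of those "specialized functions". */
  static double dot(double a[DIM], double b[DIM]) {
      double sum = 0;
      for (int i = 0; i < DIM; i++)
          sum += a[i] * b[i];
      return sum;
  }

  int main(void) {
      double a[DIM] = {1, 2, 3}, b[DIM] = {4, 5, 6};
      double sum[DIM], doubled[DIM];

      add(a, b, sum);          /* (5, 7, 9) */
      scale(2.0, a, doubled);  /* (2, 4, 6) */

      printf("a + b = (%g, %g, %g)\n", sum[0], sum[1], sum[2]);
      printf("2a    = (%g, %g, %g)\n", doubled[0], doubled[1], doubled[2]);
      printf("a . b = %g\n", dot(a, b));  /* 32 */
      return 0;
  }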

CamperBob2 19 hours ago | parent [-]

That's basically what you'll get if you pick up a book on 3D game programming. However, progress will come to a halt when you get to things like determinants and eigenvalues that don't show up in core 3D graphics pipelines. You'll have to find other ways to motivate a C version of that part of the curriculum... but I agree, that's a well-worthwhile thing to ask for.

rw_panic0_0 a day ago | parent | prev [-]

how is it beginner friendly, first paragraph and already an obscure formula for non math people

cybrox a day ago | parent | next [-]

You will never find a level of "beginner friendly" that suits everyone.

I agree that this is not an ideal start - at least without any further clarification - for beginners, but I think it works well for people that already know mathematical notation but not many specifics of linear algebra.

Also, I don't want to be the preacher bringing this into every argument but this is one of the genuinely good uses for AI that I have found. Bringing the beginning of a beginner friendly work down to my level. I can have it explain this if I'm unsure about the specific syntax and it will convey the relevant idea (which is explained in a bit of unnecessary complexity / generality, yes) in simple terms.

relaxing a day ago | parent | prev [-]

I agree. I wouldn’t consider someone who has taken (and remembers) a course in set theory a beginner without some added qualifier.

One of my pet peeves is using mathematical symbols beyond basic arithmetic without introducing them once by name. Trying to figure out what a symbol is and what branch of math it comes from is extremely frustrating.

solarwindy 14 hours ago | parent | next [-]

VLLMs are incredibly good at decoding math from screenshots, if you’re working from a PDF textbook. ChatGPT especially, and since it’s conversant in LaTeX, it can respond directly in the notation you don’t recognize to break it down for you. It even manages with photos of my handwritten scrawl (mostly).

schoen 19 hours ago | parent | prev | next [-]

There are a handful of textbooks that have a nice appendix that defines each symbol (or maybe sometimes tells you where you can go to learn more about the topic represented by the symbol!). That way they can presume that most readers are already familiar but still be helpful to those who aren't.

CamperBob2 19 hours ago | parent | prev [-]

I haven't taken any courses in set theory, but it makes perfect sense to me. Once someone tells you that the funny script 'R' means "Real numbers", the funny E means "is an element of", and the vertical | means "given that," that's pretty much all you need to know to dive in.

If those concepts cause difficulty, it probably makes sense to go back down the learning curve a bit before tackling linear algebra. Alternatively, just cut and paste the expression into any LLM and it'll explain what's what.