GistNoesis 3 days ago
What really made me understand gradients and derivatives was visualizing them as arrow maps. I even made a small tool: https://github.com/GistNoesis/VisualizeGradient . This visualization helps in understanding optimization algorithms. Jacobians can be understood as a collection of gradients, one for each coordinate of the output considered independently. My mental picture for the Hessian is to associate each point with the shape of the parabola (or saddle) that best matches the function locally. It's easy to visualize once you realize it's the shape of what you see when you zoom in on the point. (Technically this mental picture is more the second-order multivariate Taylor expansion, i.e. the gradient's tangent plane and the Hessian's curvature together, but I find it hard to mentally separate the slope from the curvature.)
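A minimal sketch of the arrow-map idea (not the linked VisualizeGradient tool, just an illustration with an arbitrarily chosen toy function f(x, y) = x² − y², whose constant Hessian diag(2, −2) gives the saddle shape described above):

```python
# Sample the gradient of f(x, y) = x^2 - y^2 on a grid and draw it as an
# arrow map with matplotlib's quiver, over the function's contour lines.
import numpy as np
import matplotlib.pyplot as plt

def f(x, y):
    return x**2 - y**2

def grad_f(x, y):
    # Analytic gradient; for a generic f you could use finite differences.
    return 2 * x, -2 * y

xs, ys = np.meshgrid(np.linspace(-2, 2, 21), np.linspace(-2, 2, 21))
gx, gy = grad_f(xs, ys)

plt.contour(xs, ys, f(xs, ys), levels=20)
plt.quiver(xs, ys, gx, gy, color="tab:red")
plt.gca().set_aspect("equal")
plt.title("Gradient arrow map of f(x, y) = x^2 - y^2")
plt.show()
```

Every arrow points in the direction of steepest ascent; gradient descent just follows the arrows backwards.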
MathMonkeyMan 3 days ago
The "eigenchris" Youtube channel teaches tensor algebra, differential calculus, general relativity, and some other topics. When I started thinking of vector calculus in terms of multiplying both vector components and the corresponding basis vectors, there was a nice unification of ordinary vector operations, jacobians, and the metric tensor. | ||||||||
| ||||||||
uoaei 3 days ago
I'm also a visual learner, and my class on dynamical systems really put a lot into perspective, particularly the parts about classifying stable/unstable/saddle points by finding the eigenvectors/eigenvalues of Jacobians. A lot of optimization theory becomes intuitive once you work through a few of those and compare your understanding to arrow maps like the ones you suggest.
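A small sketch of that classification (my own example system, a damped pendulum, chosen only for illustration): linearize at each fixed point and inspect the Jacobian's eigenvalues.

```python
# Damped pendulum: x' = y, y' = -sin(x) - 0.5*y, with fixed points at (k*pi, 0).
# The origin should come out as a stable spiral, (pi, 0) as a saddle.
import numpy as np

def jacobian(x, y):
    # Partial derivatives of (x', y') with respect to (x, y).
    return np.array([
        [0.0,         1.0],
        [-np.cos(x), -0.5],
    ])

for fixed_point in [(0.0, 0.0), (np.pi, 0.0)]:
    eigvals = np.linalg.eigvals(jacobian(*fixed_point))
    if np.all(eigvals.real < 0):
        kind = "stable (sink/spiral)"
    elif np.all(eigvals.real > 0):
        kind = "unstable (source)"
    elif np.any(eigvals.real > 0) and np.any(eigvals.real < 0):
        kind = "saddle"
    else:
        kind = "borderline (center or degenerate)"
    print(fixed_point, eigvals, kind)
```

The same sign-of-eigenvalues reasoning applied to the Hessian is what tells you whether a critical point of an objective function is a minimum, maximum, or saddle.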