CamperBob2 3 days ago:
Anyone who watches the videos and follows along will indeed come up to speed on the basics of neural nets, at least with respect to MLPs. It's an excellent introduction.
HarHarVeryFunny 3 days ago (parent):
Sure, the basics of neural nets, but it seems intended mainly as a foundation leading to LLMs. He doesn't cover the zoo of ANN architectures such as ResNets, RNNs, LSTMs, GANs, diffusion models, etc., and barely touches on regularization, optimization, etc., other than mentioning BatchNorm and promising Adam in a later video. It's a useful series of videos, no doubt, but his goal is to strip things down to basics and show how an ANN like a Transformer can be built from the ground up, without the tools/libraries that would actually be used in practice.
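To give a flavor of that from-scratch style (this is my own minimal sketch in plain Python, not Karpathy's code — he builds a proper autograd engine instead of hand-deriving gradients like this): a tiny 2-4-1 MLP trained on XOR with backprop written out by hand, no frameworks at all.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# XOR: the classic task a single linear layer cannot solve
data = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

H = 4  # hidden units
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    # hidden layer, then a single sigmoid output unit
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

lr = 0.5
for _ in range(10000):
    for x, t in data:
        h, y = forward(x)
        # gradient of squared error w.r.t. the output pre-activation
        dy = (y - t) * y * (1 - y)
        for j in range(H):
            # chain rule back through the hidden sigmoid
            dh = dy * w2[j] * h[j] * (1 - h[j])
            w2[j] -= lr * dy * h[j]
            w1[j][0] -= lr * dh * x[0]
            w1[j][1] -= lr * dh * x[1]
            b1[j] -= lr * dh
        b2 -= lr * dy

for x, t in data:
    _, y = forward(x)
    print(x, round(y, 3))
```

Plain SGD with a fixed learning rate, manual chain rule, sigmoid everywhere — exactly the kind of bare-bones setup the videos start from before any of the practical machinery (Adam, BatchNorm, PyTorch autograd) enters the picture.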