malnourish 2 hours ago
I read through this entire article. There was some value in it, but I found it to be very "draw the rest of the owl". It read as if the introductions to key concepts, or even proper segues, had been edited out. That said, I appreciated the interactive components.
davidw 2 hours ago | parent
It started off nicely, but before long you get: "The MLP (multilayer perceptron) is a two-layer feed-forward network: project up to 64 dimensions, apply ReLU (zero out negatives), project back to 16", which starts to feel pretty owly indeed. I think the whole thing could be expanded to cover more of it in greater depth.
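For what it's worth, the quoted step is only a few lines of code. A minimal NumPy sketch, assuming the widths mentioned in the quote (16-dimensional model width, 64-dimensional hidden layer); the weight names and random initialization are just for illustration:

```python
import numpy as np

d_model, d_hidden = 16, 64  # widths quoted in the article

rng = np.random.default_rng(0)
W1 = rng.normal(size=(d_model, d_hidden))  # project up to 64 dims
W2 = rng.normal(size=(d_hidden, d_model))  # project back to 16 dims

def mlp(x):
    """Two-layer feed-forward block: up-project, ReLU, down-project."""
    h = np.maximum(x @ W1, 0.0)  # ReLU zeroes out the negatives
    return h @ W2

x = rng.normal(size=(1, d_model))
y = mlp(x)
print(y.shape)  # back to the model width
```

(A real implementation would also have bias terms and learned weights, but the shape of the computation is exactly what the article describes.)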