▲ | jpcompartir 5 days ago |
I would echo some caution about using this as a reference, as in another blog post the writer states: "Backpropagation, often referred to as “backward propagation of errors,” is the cornerstone of training deep neural networks. It is a supervised learning algorithm that optimizes the weights and biases of a neural network to minimize the error between predicted and actual outputs." (https://chizkidd.github.io/2025/05/30/backpropagation/) Backpropagation is a supervised machine learning algorithm, pardon?
▲ | cl3misch 5 days ago | parent |
I actually see this a lot: confusing backpropagation with gradient descent (or any other optimizer). Backprop is just a way to compute the gradients of the cost function with respect to the weights, not an algorithm for minimizing the cost function wrt. the weights. I guess giving the (mathematically) simple principle of computing gradients with the chain rule the fancy name "backpropagation" dates from the early days of AI, when computers were much less powerful and this seemed less obvious?
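To make the distinction concrete, here is a minimal sketch (a hypothetical one-hidden-layer network in NumPy, not taken from either blog post): the backprop step only applies the chain rule to produce dLoss/dW; the gradient-descent update at the end is a separate choice about how to use those gradients, and could be swapped for any other optimizer.

    import numpy as np

    # Toy network: x -> W1 -> tanh -> W2 -> y_hat (shapes chosen for illustration)
    rng = np.random.default_rng(0)
    x = rng.normal(size=(3,))           # input
    y = np.array([1.0])                 # target
    W1 = rng.normal(size=(4, 3))
    W2 = rng.normal(size=(1, 4))

    # Forward pass
    h = np.tanh(W1 @ x)                 # hidden activations
    y_hat = W2 @ h                      # prediction
    loss = 0.5 * np.sum((y_hat - y) ** 2)

    # Backpropagation: chain rule yields the gradients, nothing more
    d_yhat = y_hat - y                  # dL/dy_hat
    dW2 = np.outer(d_yhat, h)           # dL/dW2
    d_h = W2.T @ d_yhat                 # dL/dh
    d_pre = d_h * (1 - h ** 2)          # back through tanh
    dW1 = np.outer(d_pre, x)            # dL/dW1

    # Gradient descent: a separate decision about how to use the gradients
    lr = 0.1
    W1 -= lr * dW1
    W2 -= lr * dW2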