cl3misch | 5 days ago
In the entropy implementation:
Using `where` in ufuncs like `np.log` leaves the output uninitialized (undefined) at the positions where the condition is not met. Summing over that array will definitely return incorrect results. Better would be, e.g.:
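One safe pattern (a sketch, not necessarily the commenter's original snippet; assumes a probability vector `p`) is to pass an initialized `out` array so the masked positions hold a defined value instead of garbage:

```python
import numpy as np

p = np.array([0.5, 0.5, 0.0])

# Pitfall: with only `where`, entries where p <= 0 are left
# uninitialized, so the subsequent sum can include garbage:
#   logp = np.log(p, where=p > 0)   # undefined at p == 0

# Safe: supply an initialized `out`, so masked entries stay 0,
# matching the convention 0 * log(0) = 0 in the entropy sum.
logp = np.log(p, where=p > 0, out=np.zeros_like(p))
entropy = -np.sum(p * logp)
```

Here `entropy` equals log(2) for the two-outcome uniform part, and the zero-probability entry contributes nothing.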
Also, the cross-entropy code doesn't match the equation. And, as explained in the comment below the post, Ax+b is not a linear operation but an affine one (because of the +b). Overall it seems like an imprecise post to me. Not bad, but not rigorous enough to serve as a reference.
jpcompartir | 5 days ago | parent
I would echo some caution about using it as a reference, since in another blog post the writer states: "Backpropagation, often referred to as 'backward propagation of errors,' is the cornerstone of training deep neural networks. It is a supervised learning algorithm that optimizes the weights and biases of a neural network to minimize the error between predicted and actual outputs." https://chizkidd.github.io/2025/05/30/backpropagation/ Backpropagation is a supervised machine learning algorithm, pardon?