cubefox 2 hours ago
> Training a one bit neural network from scratch is apparently an unsolved problem though.

It was until recently, but there is now a method that trains them directly, with no floating-point math at all, using "Boolean variation" in place of ordinary (Newton/Leibniz) differentiation: https://proceedings.neurips.cc/paper_files/paper/2024/hash/7...
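To give a loose sense of the idea (this is my own toy sketch, not the paper's formalism, and the function names are mine): instead of a continuous derivative, one can define a signed discrete "variation" of a Boolean function with respect to one input bit, i.e. whether the output flips in the same direction as the bit, the opposite direction, or not at all.

```python
def boolean_variation(f, x, i):
    """Signed discrete 'derivative' of Boolean f w.r.t. input bit i.

    Returns +1 if f changes in the same direction as bit i,
    -1 if it changes in the opposite direction, 0 if unchanged.
    Bits are encoded as 0/1.
    """
    y = list(x)
    y[i] ^= 1                      # flip bit i
    df = f(y) - f(x)               # change in output: -1, 0, or +1
    dx = y[i] - x[i]               # change in input:  -1 or +1
    return df * dx

AND = lambda bits: bits[0] & bits[1]
NOT = lambda bits: 1 - bits[0]

# At (1, 0), flipping bit 1 from 0 to 1 turns the AND output 0 -> 1:
print(boolean_variation(AND, [1, 0], 1))  # → 1
# At (0, 0), flipping bit 1 leaves the AND output at 0:
print(boolean_variation(AND, [0, 0], 1))  # → 0
# NOT moves opposite to its input:
print(boolean_variation(NOT, [0], 0))     # → -1
```

Signals like this can stand in for gradients when deciding whether flipping a binary weight would push the loss down, which is the general flavor of gradient-free binary training; the actual training rule in the linked paper is more involved.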