godelski | 4 hours ago
FYI, floats are already quantized. They're neither continuous nor infinite, and the representable values aren't even uniformly distributed (they're denser in [-1, 1]).
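A quick sketch of the non-uniform spacing: the gap between adjacent representable doubles grows with magnitude, which `math.nextafter` makes easy to observe.

```python
import math

# Spacing between adjacent IEEE 754 doubles near 1.0 (one ULP = 2^-52)
gap_near_one = math.nextafter(1.0, 2.0) - 1.0

# Near 1e16 (which lies above 2^53), the spacing widens to 2.0 --
# whole even integers are the only representable values there.
gap_near_1e16 = math.nextafter(1e16, 2e16) - 1e16

print(gap_near_one)   # 2.220446049250313e-16
print(gap_near_1e16)  # 2.0
```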
measurablefunc | 3 hours ago | parent
The standard definition of quantized arithmetic for neural networks is not the same as IEEE 754 floating-point arithmetic (single or double precision) over the reals: https://arxiv.org/abs/1712.05877
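For contrast with float representation, the linked paper's scheme maps a real value r to an integer q via an affine transform, r = S(q - Z), with a float scale S and an integer zero-point Z. A minimal sketch (the function names and the example range are illustrative, not from the paper):

```python
import numpy as np

def quantize(x, scale, zero_point, qmin=0, qmax=255):
    """Affine quantization r = S(q - Z): map floats to uint8 codes."""
    q = np.round(x / scale) + zero_point
    return np.clip(q, qmin, qmax).astype(np.uint8)

def dequantize(q, scale, zero_point):
    """Recover approximate reals from integer codes."""
    return scale * (q.astype(np.float32) - zero_point)

# Example: cover roughly [-1, 1] with 256 uniformly spaced levels.
scale, zero_point = 2.0 / 255.0, 128
x = np.array([-1.0, 0.0, 0.5, 1.0], dtype=np.float32)
q = quantize(x, scale, zero_point)
x_hat = dequantize(q, scale, zero_point)
```

Note the contrast with floats: here the quantization grid is uniform over the chosen range, whereas IEEE 754 spacing varies with magnitude.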