Discrete-Weight Neural Networks

We have published a preprint in which we discuss how neural networks with quantized, fixed-precision weights capture the training dataset better than networks with float32 weights: on the same dataset, they achieved 6% lower training loss and better generalization.
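
To illustrate what fixed-precision (discrete) weights mean in practice, here is a minimal sketch of uniform symmetric weight quantization in Python with PyTorch. The bit width, per-tensor scaling, and rounding scheme here are illustrative assumptions for the general technique, not the specific method used in the preprint.

```python
import torch

def quantize_weights(w: torch.Tensor, bits: int = 4) -> torch.Tensor:
    """Uniform symmetric quantization of a float32 weight tensor
    onto at most 2**bits - 1 discrete levels (illustrative sketch only)."""
    qmax = 2 ** (bits - 1) - 1        # e.g. 7 for signed 4-bit
    scale = w.abs().max() / qmax      # per-tensor scale factor (assumed scheme)
    q = torch.clamp(torch.round(w / scale), -qmax, qmax)
    return q * scale                  # discrete-valued weights, float storage

# Example: quantize a float32 weight matrix to 4-bit levels
w_fp32 = torch.randn(64, 64)
w_q = quantize_weights(w_fp32, bits=4)
print(torch.unique(w_q).numel())      # at most 2 * qmax + 1 distinct values
```

The contrast drawn in the preprint is between networks trained with such a restricted, discrete weight set and networks trained with unconstrained float32 weights.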

This is a major step towards bridging the gap between silicon-based intelligence and carbon-based intelligence.

https://www.researchsquare.com/article/rs-8499757/v1