Robust Training of Neural Networks at Arbitrary Precision and Sparsity

arXiv:2409.09245v3 Abstract: The discontinuous operations inherent in quantization and sparsification pose a long-standing obstacle to backpropagation, particularly in ultra-low precision and sparse regimes. While the community has long viewed...
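
The core obstacle the abstract names is that hard rounding and pruning have zero gradient almost everywhere, so naive backpropagation passes no signal back to the weights. A common workaround is the straight-through estimator (STE): use the discontinuous operation in the forward pass but the identity in the backward pass. The minimal PyTorch sketch below illustrates that idea; it is the standard STE, not necessarily the paper's own method, which the truncated abstract does not spell out (the function name and bit width are illustrative).

```python
# Standard straight-through estimator (STE) sketch -- not the paper's method.
import torch

def quantize_ste(x: torch.Tensor, num_bits: int = 4) -> torch.Tensor:
    """Uniform quantization with a straight-through gradient."""
    scale = (2 ** num_bits) - 1
    # Forward: hard rounding. Its derivative is zero almost everywhere,
    # so naive backprop would pass no gradient to x.
    q = torch.round(torch.clamp(x, 0.0, 1.0) * scale) / scale
    # Backward: detach the rounding residual so the gradient flows
    # through x as if the quantizer were the identity.
    return x + (q - x).detach()

x = torch.rand(8, requires_grad=True)
y = quantize_ste(x).sum()
y.backward()
print(x.grad)  # nonzero gradients despite the discontinuous forward pass
```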

🔗 Read more: https://arxiv.org/abs/2409.09245

#News #AI #Neuro #Software #Policy #Academic