Paper Group ANR 11
April 3, 2020
Communication Efficient Federated Learning over Multiple Access Channels
Shifted and Squeezed 8-bit Floating Point Format for Low-Precision Training of Deep Neural Networks
A Study of Human Summaries of Scientific Articles
Low-Complexity LSTM Training and Inference with FloatSD8 Weight Representation
Least Squares Binary Quantization of Neural …