City Research Online

NUQSGD: Provably Communication-Efficient Data-Parallel SGD via Nonuniform Quantization

Ramezani-Kebrya, A., Faghri, F., Markov, I., Aksenov, V. ORCID: 0000-0001-9134-5490, Alistarh, D. & Roy, D. M. (2021). NUQSGD: Provably communication-efficient data-parallel SGD via nonuniform quantization. Journal of Machine Learning Research, 22, article number 114.


As the size and complexity of models and datasets grow, so does the need for communication-efficient variants of stochastic gradient descent (SGD) that can be deployed for parallel model training. One popular communication-compression method for data-parallel SGD is QSGD (Alistarh et al., 2017), which quantizes and encodes gradients to reduce communication costs. The baseline variant of QSGD provides strong theoretical guarantees; for practical purposes, however, the authors proposed a heuristic variant, which we call QSGDinf, that demonstrated impressive empirical gains for distributed training of large neural networks. In this paper, we build on this work to propose a new gradient quantization scheme, and show that it has both stronger theoretical guarantees than QSGD and matches or exceeds the empirical performance of the QSGDinf heuristic and of other compression methods.
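To make the idea concrete, the sketch below shows stochastic gradient quantization onto *nonuniform* (exponentially spaced) levels, the core mechanism the abstract refers to. This is an illustrative NumPy sketch, not the paper's exact NUQSGD encoding (which additionally entropy-codes the quantized values); the function name and the choice of levels {0, 2⁻ˢ, …, 2⁻¹, 1} are assumptions made here for clarity. Stochastic rounding between adjacent levels keeps the quantized gradient an unbiased estimate of the input.

```python
import numpy as np

def nonuniform_quantize(g, s=3, rng=None):
    """Stochastically quantize gradient g onto exponentially spaced levels.

    Each normalized magnitude |g_i| / ||g||_2 is rounded to one of the
    levels {0, 2^-s, ..., 2^-1, 1}. Rounding up vs. down is randomized in
    proportion to the position within the interval, so the result is an
    unbiased estimator of g. (Illustrative sketch only; the actual NUQSGD
    scheme also encodes the quantized values compactly for transmission.)
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.asarray(g, dtype=float)
    norm = np.linalg.norm(g)
    if norm == 0.0:
        return np.zeros_like(g)
    # Nonuniform level set: dense near zero, sparse near one.
    levels = np.concatenate(([0.0], 2.0 ** np.arange(-s, 1)))
    r = np.abs(g) / norm  # normalized magnitudes in [0, 1]
    # Index of the largest level <= r for each coordinate.
    lo = np.clip(np.searchsorted(levels, r, side="right") - 1,
                 0, len(levels) - 2)
    low, high = levels[lo], levels[lo + 1]
    # Probability of rounding up grows linearly across the interval.
    p = (r - low) / (high - low)
    q = np.where(rng.random(g.shape) < p, high, low)
    return norm * np.sign(g) * q
```

In a data-parallel setting, each worker would quantize its local gradient this way before communication, and the server (or all-reduce) would average the dequantized vectors; unbiasedness of the per-worker quantization is what allows the usual SGD convergence analysis to go through.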

Publication Type: Article
Additional Information: ©2021 Ali Ramezani-Kebrya, Fartash Faghri, Ilya Markov, Vitalii Aksenov, Dan Alistarh, and Daniel M. Roy. License: CC-BY 4.0.
Publisher Keywords: Communication-efficient SGD, Quantization, Gradient Compression, Data-parallel SGD, Deep Learning
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Science & Technology > Computer Science



