vqSGD: Vector Quantized Stochastic Gradient Descent

Venkata Gandikota, Daniel Kane, Raj Kumar Maity, Arya Mazumdar

Research output: Contribution to journal › Conference article › peer-review

39 Scopus citations

Abstract

In this work, we present a family of vector quantization schemes vqSGD (Vector-Quantized Stochastic Gradient Descent) that provide an asymptotic reduction in the communication cost with convergence guarantees in first-order distributed optimization. In the process we derive the following fundamental information theoretic fact: Θ(d/R²) bits are necessary and sufficient (up to an additive O(log d) term) to describe an unbiased estimator ĝ(g) for any g in the d-dimensional unit sphere, under the constraint that ‖ĝ(g)‖₂ ≤ R almost surely. In particular, we consider a randomized scheme based on the convex hull of a point set, which returns an unbiased estimator of a d-dimensional gradient vector with almost surely bounded norm. We provide multiple efficient instances of our scheme that are near optimal and require only o(d) bits of communication, at the expense of a tolerable increase in error. The instances of our quantization scheme are obtained using the properties of binary error-correcting codes and provide a smooth tradeoff between the communication cost and the estimation error of quantization. Furthermore, we show that vqSGD also offers some automatic privacy guarantees.
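To make the convex-hull idea in the abstract concrete, the following is a minimal sketch (not taken from the authors' code) of one natural instantiation: the point set is assumed to be the scaled cross-polytope {±√d·e_i} and the gradient is assumed to be pre-normalized to the unit ℓ₂ ball. The helper names quantize/dequantize are illustrative only. Writing g as a convex combination of the 2d vertices and sampling one vertex with the corresponding probability yields an unbiased estimator with norm √d almost surely, communicated with only log₂(2d) bits.

```python
import numpy as np

def quantize(g, rng=None):
    """Sample a vertex index of the scaled cross-polytope {±sqrt(d)·e_i}.

    Assumes ||g||_2 <= 1 (the sender normalizes the gradient beforehand).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = g.shape[0]
    scale = np.sqrt(d)
    # Convex weights: pos_i on +sqrt(d)e_i, neg_i on -sqrt(d)e_i, with pos_i - neg_i = g_i / sqrt(d).
    pos = np.maximum(g, 0.0) / scale
    neg = np.maximum(-g, 0.0) / scale
    slack = 1.0 - (pos.sum() + neg.sum())   # leftover mass; >= 0 because ||g||_1 <= sqrt(d)
    pos += slack / (2 * d)                  # spread slack evenly over the +/- pairs,
    neg += slack / (2 * d)                  # which adds zero to the expectation
    probs = np.concatenate([pos, neg])      # indices 0..d-1 -> +e_i, d..2d-1 -> -e_i
    probs /= probs.sum()                    # guard against floating-point drift
    return rng.choice(2 * d, p=probs)       # only log2(2d) bits need to be sent

def dequantize(index, d):
    """Reconstruct the unbiased estimate ĝ from the transmitted index."""
    v = np.zeros(d)
    if index < d:
        v[index] = np.sqrt(d)
    else:
        v[index - d] = -np.sqrt(d)
    return v

# Usage: averaging many independent estimates approaches g, illustrating unbiasedness.
d = 8
g = np.random.randn(d)
g /= 2 * np.linalg.norm(g)                  # ensure ||g||_2 <= 1
est = np.mean([dequantize(quantize(g), d) for _ in range(20000)], axis=0)
```

This cross-polytope choice trades a large but bounded per-step error (norm √d) for an exponentially small message; the paper's code-based instances sketched in the abstract interpolate along this communication/error tradeoff.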

Original language: English (US)
Pages (from-to): 2197-2205
Number of pages: 9
Journal: Proceedings of Machine Learning Research
Volume: 130
State: Published - 2021
Event: 24th International Conference on Artificial Intelligence and Statistics, AISTATS 2021 - Virtual, Online, United States
Duration: Apr 13, 2021 - Apr 15, 2021

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability
