Quantized Consensus by the ADMM: Probabilistic Versus Deterministic Quantizers

Shengyu Zhu, Biao Chen

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantizations on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with a bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph, and is thus much preferred in large-scale networks over existing algorithms whose consensus errors grow with the range of agents' data, the quantization resolution, and the number of agents. We finally propose our algorithm, which combines both probabilistic and deterministic quantizations. Simulations show that the consensus error of our algorithm is typically less than one quantization resolution for all connected networks where agents' data can be of arbitrary magnitudes.
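
To make the two quantizers concrete, here is a minimal Python sketch (illustrative only, not code from the paper; the function names, the resolution delta, and the toy data are assumptions) of the standard definitions the abstract relies on: deterministic quantization rounds each value to the nearest grid point, with a fixed per-entry error of at most half the resolution, while probabilistic quantization rounds up or down at random so that the quantized value equals the true value in expectation.

import numpy as np

def deterministic_quantize(x, delta):
    # Round each entry to the nearest point of the grid {k * delta}.
    return delta * np.round(x / delta)

def probabilistic_quantize(x, delta, rng):
    # Round each entry to one of its two neighboring grid points at
    # random, with probabilities chosen so that E[Q(x)] = x (unbiased).
    lower = delta * np.floor(x / delta)
    frac = (x - lower) / delta            # distance to lower point, in [0, 1)
    round_up = rng.random(x.shape) < frac
    return lower + delta * round_up

rng = np.random.default_rng(0)
x = rng.uniform(-5.0, 5.0, size=4)        # toy "agents' data"
delta = 0.5                               # quantization resolution

# Deterministic rounding has a fixed per-entry error of at most delta / 2;
# averaging many probabilistic quantizations recovers x in the mean.
q_det = deterministic_quantize(x, delta)
q_prob_avg = np.mean(
    [probabilistic_quantize(x, delta, rng) for _ in range(20000)], axis=0)
print(np.max(np.abs(q_det - x)))          # bounded by delta / 2
print(np.max(np.abs(q_prob_avg - x)))     # close to 0

The empirical check at the end hints at why the two behave differently in the paper's analysis: the probabilistic quantizer's unbiasedness is consistent with convergence to the exact average in the mean sense, while the deterministic quantizer's bounded, data-independent error is what the consensus-error bound controls.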

Original language: English (US)
Article number: 7339710
Pages (from-to): 1700-1713
Number of pages: 14
Journal: IEEE Transactions on Signal Processing
Volume: 64
Issue number: 7
DOI: 10.1109/TSP.2015.2504341
State: Published - Apr 1 2016

Fingerprint

  • Topology
  • Polynomials
  • Communication

Keywords

  • Alternating direction method of multipliers
  • deterministic quantization
  • dither
  • linear convergence
  • probabilistic quantization
  • quantized consensus

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Signal Processing

Cite this

Quantized Consensus by the ADMM: Probabilistic Versus Deterministic Quantizers. / Zhu, Shengyu; Chen, Biao.

In: IEEE Transactions on Signal Processing, Vol. 64, No. 7, 7339710, 01.04.2016, pp. 1700-1713.

Research output: Contribution to journal › Article

@article{5710d9b7787f49ba80e4a981785b8bdc,
title = "Quantized Consensus by the ADMM: Probabilistic Versus Deterministic Quantizers",
abstract = "This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantizations on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with a bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph, and is thus much preferred in large-scale networks over existing algorithms whose consensus errors grow with the range of agents' data, the quantization resolution, and the number of agents. We finally propose our algorithm, which combines both probabilistic and deterministic quantizations. Simulations show that the consensus error of our algorithm is typically less than one quantization resolution for all connected networks where agents' data can be of arbitrary magnitudes.",
keywords = "Alternating direction method of multipliers, deterministic quantization, dither, linear convergence, probabilistic quantization, quantized consensus",
author = "Shengyu Zhu and Biao Chen",
year = "2016",
month = "4",
day = "1",
doi = "10.1109/TSP.2015.2504341",
language = "English (US)",
volume = "64",
pages = "1700--1713",
journal = "IEEE Transactions on Signal Processing",
issn = "1053-587X",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "7",

}

TY - JOUR

T1 - Quantized Consensus by the ADMM

T2 - Probabilistic Versus Deterministic Quantizers

AU - Zhu, Shengyu

AU - Chen, Biao

PY - 2016/4/1

Y1 - 2016/4/1

N2 - This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantizations on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with a bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph, and is thus much preferred in large-scale networks over existing algorithms whose consensus errors grow with the range of agents' data, the quantization resolution, and the number of agents. We finally propose our algorithm, which combines both probabilistic and deterministic quantizations. Simulations show that the consensus error of our algorithm is typically less than one quantization resolution for all connected networks where agents' data can be of arbitrary magnitudes.

AB - This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantizations on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with a bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph, and is thus much preferred in large-scale networks over existing algorithms whose consensus errors grow with the range of agents' data, the quantization resolution, and the number of agents. We finally propose our algorithm, which combines both probabilistic and deterministic quantizations. Simulations show that the consensus error of our algorithm is typically less than one quantization resolution for all connected networks where agents' data can be of arbitrary magnitudes.

KW - Alternating direction method of multipliers

KW - deterministic quantization

KW - dither

KW - linear convergence

KW - probabilistic quantization

KW - quantized consensus

UR - http://www.scopus.com/inward/record.url?scp=84962014009&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84962014009&partnerID=8YFLogxK

U2 - 10.1109/TSP.2015.2504341

DO - 10.1109/TSP.2015.2504341

M3 - Article

AN - SCOPUS:84962014009

VL - 64

SP - 1700

EP - 1713

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

IS - 7

M1 - 7339710

ER -