### Abstract

This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantization on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph. This bound is much preferred in large-scale networks over those of existing algorithms, whose consensus errors grow with the range of the agents' data, the quantization resolution, and the number of agents. Finally, we propose an algorithm that combines both probabilistic and deterministic quantization. Simulations show that the consensus error of the proposed algorithm is typically less than one quantization resolution for all connected networks, where the agents' data can be of arbitrary magnitudes.
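The two quantizer classes compared in the abstract can be sketched as follows. This is a generic illustration of probabilistic (dithered) versus deterministic rounding onto a uniform grid, not the paper's ADMM algorithm itself; the function names and the resolution parameter `delta` are ours:

```python
import numpy as np

def prob_quantize(x, delta=1.0):
    """Probabilistic quantizer on a uniform grid of resolution delta.

    Rounds x down or up to the adjacent grid points with probabilities
    proportional to proximity, so that E[Q(x)] = x (unbiased).
    """
    lower = np.floor(x / delta) * delta
    p_up = (x - lower) / delta  # probability of rounding up
    return lower + delta * (np.random.rand(*np.shape(x)) < p_up)

def det_quantize(x, delta=1.0):
    """Deterministic quantizer: round to the nearest grid point."""
    return np.round(np.asarray(x) / delta) * delta
```

Unbiasedness of the probabilistic quantizer (E[Q(x)] = x) is what makes convergence statements "in the mean sense" possible, at the price of a nonzero variance; the deterministic quantizer is variance-free but introduces a rounding bias of up to delta/2 per transmission.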

Original language | English (US)
---|---
Article number | 7339710
Pages (from-to) | 1700-1713
Number of pages | 14
Journal | IEEE Transactions on Signal Processing
Volume | 64
Issue number | 7
DOIs | https://doi.org/10.1109/TSP.2015.2504341
State | Published - Apr 1 2016

### Keywords

- Alternating direction method of multipliers
- deterministic quantization
- dither
- linear convergence
- probabilistic quantization
- quantized consensus

### ASJC Scopus subject areas

- Electrical and Electronic Engineering
- Signal Processing

### Cite this

Zhu, S., & Chen, B. (2016). **Quantized Consensus by the ADMM: Probabilistic Versus Deterministic Quantizers.** *IEEE Transactions on Signal Processing*, *64*(7), 1700-1713. [7339710]. https://doi.org/10.1109/TSP.2015.2504341

Research output: Contribution to journal › Article

TY - JOUR

T1 - Quantized Consensus by the ADMM

T2 - Probabilistic Versus Deterministic Quantizers

AU - Zhu, Shengyu

AU - Chen, Biao

PY - 2016/4/1

Y1 - 2016/4/1

N2 - This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). We first study the effects of probabilistic and deterministic quantization on a distributed ADMM algorithm. With probabilistic quantization, this algorithm yields linear convergence to the desired average in the mean sense with bounded variance. When deterministic quantization is employed, the distributed ADMM converges to a consensus within 3 + ⌈log_{1+δ} Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial fraction depending on the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained; it depends only on the quantization resolution and the average degree of the graph. This bound is much preferred in large-scale networks over those of existing algorithms, whose consensus errors grow with the range of the agents' data, the quantization resolution, and the number of agents. Finally, we propose an algorithm that combines both probabilistic and deterministic quantization. Simulations show that the consensus error of the proposed algorithm is typically less than one quantization resolution for all connected networks, where the agents' data can be of arbitrary magnitudes.

KW - Alternating direction method of multipliers

KW - deterministic quantization

KW - dither

KW - linear convergence

KW - probabilistic quantization

KW - quantized consensus

UR - http://www.scopus.com/inward/record.url?scp=84962014009&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84962014009&partnerID=8YFLogxK

U2 - 10.1109/TSP.2015.2504341

DO - 10.1109/TSP.2015.2504341

M3 - Article

VL - 64

SP - 1700

EP - 1713

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

IS - 7

M1 - 7339710

ER -