TY - GEN
T1 - Distributed average consensus with deterministic quantization
T2 - IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015
AU - Zhu, Shengyu
AU - Chen, Biao
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2016/2/23
Y1 - 2016/2/23
N2 - This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). When rounding quantization is employed, a distributed ADMM algorithm is shown to converge to a consensus within 3 + ⌈log_(1+δ) Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial of the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained, which depends only on the quantization resolution and the average degree of the graph. This bound is much preferred in large-scale networks over existing algorithms whose consensus errors are increasing in the range of agents' data, the quantization resolution, and the number of agents. To minimize the consensus error, our final algorithm uses dithered quantization to obtain a good starting point and then adopts rounding quantization to reach a consensus. Simulations show that the consensus error of this algorithm is typically less than one quantization resolution for all connected networks with agents' data of arbitrary magnitudes.
AB - This paper develops efficient algorithms for distributed average consensus with quantized communication using the alternating direction method of multipliers (ADMM). When rounding quantization is employed, a distributed ADMM algorithm is shown to converge to a consensus within 3 + ⌈log_(1+δ) Ω⌉ iterations, where δ > 0 depends on the network topology and Ω is a polynomial of the quantization resolution, the agents' data, and the network topology. A tight upper bound on the consensus error is also obtained, which depends only on the quantization resolution and the average degree of the graph. This bound is much preferred in large-scale networks over existing algorithms whose consensus errors are increasing in the range of agents' data, the quantization resolution, and the number of agents. To minimize the consensus error, our final algorithm uses dithered quantization to obtain a good starting point and then adopts rounding quantization to reach a consensus. Simulations show that the consensus error of this algorithm is typically less than one quantization resolution for all connected networks with agents' data of arbitrary magnitudes.
KW - Quantized consensus
KW - alternating direction method of multipliers
KW - deterministic quantization
UR - http://www.scopus.com/inward/record.url?scp=84964758643&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84964758643&partnerID=8YFLogxK
U2 - 10.1109/GlobalSIP.2015.7418285
DO - 10.1109/GlobalSIP.2015.7418285
M3 - Conference contribution
AN - SCOPUS:84964758643
T3 - 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015
SP - 692
EP - 696
BT - 2015 IEEE Global Conference on Signal and Information Processing, GlobalSIP 2015
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 13 December 2015 through 16 December 2015
ER -