Decentralized data reduction with quantization constraints

Ge Xu, Shengyu Zhu, Biao Chen

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


A guiding principle for data reduction in statistical inference is the sufficiency principle. This paper extends the classical sufficiency principle to decentralized inference, i.e., settings where data reduction needs to be achieved in a decentralized manner. We examine the notions of local and global sufficient statistics and the relationship between the two for decentralized inference under different observation models. We then consider the impact of quantization on decentralized data reduction, which is often needed when communications among sensors are subject to finite capacity constraints. The central question we ask is: if each node in a decentralized inference system has to summarize its data using a finite number of bits, is it still optimal to implement data reduction using global sufficient statistics prior to quantization? We show that the answer is negative using a simple example and proceed to identify conditions under which sufficiency-based data reduction followed by quantization is indeed optimal. These include the well-known case when the data at decentralized nodes are conditionally independent, as well as a class of problems with conditionally dependent observations that admit a conditional independence structure through the introduction of an appropriately chosen hidden variable.
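The conditionally independent case described in the abstract can be illustrated with a minimal sketch (not from the paper; the Gaussian location model, the uniform quantizer, and all names here are illustrative assumptions): each sensor first reduces its observations to a local sufficient statistic — the sample mean, for a Gaussian mean-estimation problem — and only then quantizes that scalar to a finite-rate message for the fusion center, instead of quantizing every raw sample.

```python
import random

def uniform_quantize(x, bits, lo=-4.0, hi=4.0):
    """Uniform b-bit scalar quantizer on [lo, hi] (an illustrative choice,
    not the quantizer analyzed in the paper)."""
    levels = 2 ** bits
    x = min(max(x, lo), hi)           # clip to the quantizer's support
    step = (hi - lo) / levels
    idx = min(int((x - lo) / step), levels - 1)
    return lo + (idx + 0.5) * step    # reconstruction at the cell midpoint

def sensor_message(samples, bits):
    """Data reduction followed by quantization: compress the raw samples
    to the local sufficient statistic (sample mean for a Gaussian
    location model), then send a single finite-rate message."""
    mean = sum(samples) / len(samples)
    return uniform_quantize(mean, bits)

random.seed(0)
theta = 1.0          # unknown mean to be estimated at the fusion center
n, bits = 1000, 8    # samples per sensor, bits per message
# Two sensors whose observations are conditionally independent given theta.
msgs = [sensor_message([random.gauss(theta, 1.0) for _ in range(n)], bits)
        for _ in range(2)]
fused = sum(msgs) / len(msgs)   # fusion-center estimate of theta
```

Each sensor here sends only `bits` bits per decision rather than `n` raw samples; in the conditionally independent regime the abstract identifies, this sufficiency-based reduction before quantization loses no optimality.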

Original language: English (US)
Article number: 6728713
Pages (from-to): 1775-1784
Number of pages: 10
Journal: IEEE Transactions on Signal Processing
Issue number: 7
State: Published - Apr 1 2014


Keywords

  • Decentralized inference
  • quantization
  • sufficiency principle
  • sufficient statistic

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
