Distributed inference in tree networks using coding theory

Bhavya Kailkhura, Aditya Vempaty, Pramod K. Varshney

Research output: Contribution to journal › Article

4 Scopus citations

Abstract

In this paper, we consider the problem of distributed inference in tree-based networks. In the framework considered here, distributed nodes make a 1-bit local decision regarding a phenomenon before sending it to the fusion center (FC) via intermediate nodes. We propose the use of coding-theory-based techniques to solve this distributed inference problem in such structures. Data is progressively compressed as it moves toward the FC, which makes the global inference after receiving data from the intermediate nodes. Data fusion at the nodes as well as at the FC is implemented via error-correcting codes. In this context, we analyze the performance for a given code matrix and also design the optimal code matrices at every level of the tree. We address the problems of distributed classification and distributed estimation separately and develop schemes to perform these tasks in tree networks. The proposed schemes are of practical significance due to their simple structure. We study the asymptotic inference performance of our schemes for two different classes of tree networks: fixed-height tree networks and fixed-degree tree networks. We derive sufficient conditions under which the proposed schemes are asymptotically optimal.
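To make the fusion mechanism in the abstract concrete, here is a minimal sketch of code-matrix-based fusion at the FC: each row of a code matrix is the codeword expected under one hypothesis, and the FC classifies by minimum Hamming distance between the received 1-bit decisions and the rows. The specific matrix `A`, network size, and received vector below are illustrative assumptions, not the optimized code matrices designed in the paper.

```python
import numpy as np

# Illustrative code matrix A: one row per hypothesis (M = 2 classes),
# one column per node (N = 7 nodes). Row m is the codeword the FC
# expects to receive when hypothesis H_m is true. This matrix is an
# assumption for demonstration, not the paper's optimal design.
A = np.array([
    [0, 0, 1, 0, 1, 1, 0],   # codeword for hypothesis H0
    [1, 1, 0, 1, 0, 0, 1],   # codeword for hypothesis H1
])

def fuse(received_bits):
    """Minimum-Hamming-distance decoding: return the index of the
    hypothesis whose codeword is closest to the received 1-bit
    decisions, tolerating some local mis-decisions or bit flips."""
    distances = np.sum(A != np.asarray(received_bits), axis=1)
    return int(np.argmin(distances))

# Example: the nodes mostly report the H1 codeword, with two bit
# errors introduced on the way to the FC.
received = [1, 1, 0, 0, 0, 1, 1]
print(fuse(received))  # decodes to hypothesis 1 despite the errors
```

The error-correcting structure is what gives the scheme its robustness: as long as fewer bits are flipped than half the minimum Hamming distance between rows, the FC still decodes the correct hypothesis.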

Original language: English (US)
Article number: 7109942
Pages (from-to): 3755-3766
Number of pages: 12
Journal: IEEE Transactions on Signal Processing
Volume: 63
Issue number: 14
DOIs
State: Published - Jul 15 2015

Keywords

  • Distributed classification
  • distributed estimation
  • error correcting codes
  • information fusion
  • tree networks
  • wireless sensor networks

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
