The common information of N dependent random variables

Liu Wei, Xu Ge, Chen Biao

Research output: Chapter in Book/Entry/Poem › Conference contribution

26 Scopus citations

Abstract

This paper generalizes Wyner's definition of the common information of a pair of random variables to that of N random variables. We prove coding theorems showing that the operational meanings of the common information of two random variables carry over to that of N random variables. As a byproduct of our proof, we show that the Gray-Wyner source coding network can be generalized to N source sequences with N decoders. We also establish a monotone property of Wyner's common information, in contrast to other notions of common information, specifically Shannon's mutual information and Gács and Körner's common randomness. Examples of the computation of Wyner's common information of N random variables are also given.
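For reference, a standard statement of the definition being generalized (paraphrased here, not quoted from the paper): Wyner's common information of a pair is C(X_1; X_2) = \min_{X_1 - W - X_2} I(X_1, X_2; W), and the natural N-variable extension considered in this line of work takes the form

    C(X_1, \ldots, X_N) \;=\; \inf_{W} \, I(X_1, \ldots, X_N; W),

where the infimum is over all auxiliary random variables W such that X_1, \ldots, X_N are conditionally independent given W.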

Original language: English (US)
Title of host publication: 2010 48th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2010
Pages: 836-843
Number of pages: 8
DOIs
State: Published - 2010
Event: 48th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2010 - Monticello, IL, United States
Duration: Sep 29, 2010 - Oct 1, 2010

Publication series

Name: 2010 48th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2010

Other

Other: 48th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2010
Country/Territory: United States
City: Monticello, IL
Period: 9/29/10 - 10/1/10

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Control and Systems Engineering
