Abstract
Wyner's common information was originally defined for a pair of dependent discrete random variables. Its significance is largely reflected in, and also confined to, several existing interpretations in various source coding problems. This paper expands its practical significance by providing a new operational interpretation in the context of the Gray-Wyner network, where Wyner's common information is shown to admit a new lossy source coding interpretation. Specifically, under suitable conditions, Wyner's common information equals the smallest common message rate when the total rate is arbitrarily close to the rate distortion function with joint decoding for the Gray-Wyner network. A surprising observation is that this equality holds independently of the values of the distortion constraints, as long as the distortions lie within a certain distortion region. The new lossy source coding interpretation provides the first meaningful justification for defining Wyner's common information for continuous random variables, and the result also extends to multiple variables. Examples are given that characterize the rate distortion region for the Gray-Wyner lossy source coding problem and identify conditions under which Wyner's common information equals the smallest common rate. As a by-product, an explicit expression for the common information between a pair of Gaussian random variables is obtained.
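For the Gaussian by-product mentioned above, the closed-form expression is the widely cited C = (1/2) log((1 + |ρ|)/(1 − |ρ|)) for a bivariate Gaussian pair with correlation coefficient ρ. A minimal sketch of evaluating it (in bits, using log base 2; the helper name and the input-validation choice are illustrative, not from the paper):

```python
import math

def gaussian_common_information(rho: float) -> float:
    """Wyner's common information (in bits) between two jointly Gaussian
    random variables with correlation coefficient rho.

    Evaluates the closed-form expression
        C = (1/2) * log2((1 + |rho|) / (1 - |rho|)),
    which is finite for |rho| < 1 and diverges as |rho| -> 1.
    """
    r = abs(rho)
    if r >= 1.0:
        raise ValueError("correlation coefficient must satisfy |rho| < 1")
    return 0.5 * math.log2((1.0 + r) / (1.0 - r))

# Independent Gaussians (rho = 0) carry zero common information.
print(gaussian_common_information(0.0))  # 0.0
# Common information increases monotonically in |rho|.
print(gaussian_common_information(0.5))
print(gaussian_common_information(0.9))
```

Note the expression depends only on the correlation coefficient, not on the marginal variances, since common information is invariant to scaling of either variable.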
| Original language | English (US) |
| --- | --- |
| Article number | 7349201 |
| Pages (from-to) | 754-768 |
| Number of pages | 15 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 62 |
| Issue number | 2 |
| DOIs | |
| State | Published - Feb 1 2016 |
Keywords
- Common information
- Gray-Wyner network
- Rate distortion function
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences