TY - GEN
T1 - Information for inference
AU - Xu, Ge
AU - Chen, Biao
PY - 2011
Y1 - 2011
N2 - Wyner defined the notion of common information of two discrete random variables as the minimum of I(W; X, Y) over all W that induce conditional independence between X and Y. Its generalization to multiple dependent random variables revealed a surprising monotone property in the number of variables. Motivated by this monotonicity property, this paper explores the application of Wyner's common information to inference problems and its connection with other performance metrics. A central question is under what conditions Wyner's common information captures the entire information that the observations contain about the inference object under a simple Bayesian model. For infinitely exchangeable random variables, it is shown, using the de Finetti-Hewitt-Savage theorem, that the common information is asymptotically equal to the information of the inference object. For finite exchangeable random variables, this conclusion no longer holds, even for infinitely extendable sequences. However, in some special cases, including both the binary and the Gaussian cases, a concrete connection between common information and inference performance metrics can be established even for finite samples.
UR - http://www.scopus.com/inward/record.url?scp=84862955761&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84862955761&partnerID=8YFLogxK
U2 - 10.1109/Allerton.2011.6120347
DO - 10.1109/Allerton.2011.6120347
M3 - Conference contribution
AN - SCOPUS:84862955761
SN - 9781457718168
T3 - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
SP - 1516
EP - 1520
BT - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
T2 - 2011 49th Annual Allerton Conference on Communication, Control, and Computing, Allerton 2011
Y2 - 28 September 2011 through 30 September 2011
ER -