TY - GEN
T1 - Numerical calculation of information rates and capacity of quadrature Gaussian mixture channels
AU - Le, Duc Anh
AU - Vu, Hung V.
AU - Tran, Nghi H.
AU - Gursoy, Mustafa Cenk
AU - Le-Ngoc, Tho
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/9/7
Y1 - 2016/9/7
N2 - This paper presents novel methods to accurately calculate the information rates and capacity of quadrature Gaussian mixture (GM) noise channels without the need for time-consuming Monte Carlo simulations or numerical integration. The focus is on three important input signals: i) a Gaussian input; ii) a complex input with discrete amplitude and independent uniform phase, which is a capacity-achieving input; and iii) finite-alphabet signaling schemes, such as practical quadrature amplitude modulation (QAM). To this end, a novel piecewise-linear curve fitting (PWLCF) method is first proposed to estimate the entropy of a complex GM random variable to achieve any desired level of accuracy. The result can then be used to calculate the information rate when a Gaussian input is used. For a complex input with discrete amplitude and independent uniform phase, the output entropy is estimated in a similar manner but using polar coordinates and the kernel function. When a finite-alphabet input is used, we exploit the Laguerre-Gauss quadrature formula for an effective calculation of the output entropy. Combining these results with the noise entropy, we show that the information rates can be computed accurately in all cases.
AB - This paper presents novel methods to accurately calculate the information rates and capacity of quadrature Gaussian mixture (GM) noise channels without the need for time-consuming Monte Carlo simulations or numerical integration. The focus is on three important input signals: i) a Gaussian input; ii) a complex input with discrete amplitude and independent uniform phase, which is a capacity-achieving input; and iii) finite-alphabet signaling schemes, such as practical quadrature amplitude modulation (QAM). To this end, a novel piecewise-linear curve fitting (PWLCF) method is first proposed to estimate the entropy of a complex GM random variable to achieve any desired level of accuracy. The result can then be used to calculate the information rate when a Gaussian input is used. For a complex input with discrete amplitude and independent uniform phase, the output entropy is estimated in a similar manner but using polar coordinates and the kernel function. When a finite-alphabet input is used, we exploit the Laguerre-Gauss quadrature formula for an effective calculation of the output entropy. Combining these results with the noise entropy, we show that the information rates can be computed accurately in all cases.
KW - Complex Gaussian mixture
KW - Gaussian input
KW - discrete input
KW - information rates
KW - piecewise-linear approximation
UR - http://www.scopus.com/inward/record.url?scp=84988850968&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84988850968&partnerID=8YFLogxK
U2 - 10.1109/CCE.2016.7562620
DO - 10.1109/CCE.2016.7562620
M3 - Conference contribution
AN - SCOPUS:84988850968
T3 - 2016 IEEE 6th International Conference on Communications and Electronics, IEEE ICCE 2016
SP - 99
EP - 104
BT - 2016 IEEE 6th International Conference on Communications and Electronics, IEEE ICCE 2016
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 6th IEEE International Conference on Communications and Electronics, IEEE ICCE 2016
Y2 - 27 July 2016 through 29 July 2016
ER -