Estimation of KL divergence between large-alphabet distributions

Yuheng Bu, Shaofeng Zou, Yingbin Liang, Venugopal V. Veeravalli

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Scopus citations


The problem of estimating the KL divergence between two unknown distributions is studied. The alphabet size k of the distributions can scale to infinity. The estimation is based on m and n independent samples drawn from the two distributions, respectively. It is first shown that there does not exist any consistent estimator that guarantees asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set that contains pairs of distributions with density ratio bounded by F(k) is further considered. An augmented plug-in estimator is proposed, and is shown to be consistent if and only if m = ω(k ∨ log²(F(k))) and n = ω(kF(k)). Furthermore, if F(k) ≥ log²k and log²(F(k)) = o(k), it is shown that any consistent estimator must satisfy the necessary conditions m = ω(k/log k ∨ log²(F(k))) and n = ω(kF(k)/log k).
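A plug-in estimator of this kind replaces the unknown distributions by empirical estimates, with the estimate of the second distribution smoothed so that no symbol has zero mass in the denominator. The sketch below illustrates the idea with add-one smoothing; the exact augmentation used in the paper may differ, so treat the smoothing constant as an illustrative assumption.

```python
import numpy as np

def augmented_plugin_kl(x_counts, y_counts):
    """Sketch of an augmented plug-in KL divergence estimator.

    x_counts, y_counts: per-symbol counts over the same alphabet of size k,
    from m samples of p and n samples of q respectively.
    p is estimated by the raw empirical distribution; q is estimated with
    add-one smoothing so every symbol has positive probability (illustrative
    choice, not necessarily the paper's exact augmentation).
    """
    x_counts = np.asarray(x_counts, dtype=float)
    y_counts = np.asarray(y_counts, dtype=float)
    m, n, k = x_counts.sum(), y_counts.sum(), len(x_counts)
    p_hat = x_counts / m                  # empirical estimate of p
    q_tilde = (y_counts + 1.0) / (n + k)  # smoothed estimate of q, strictly positive
    mask = p_hat > 0                      # convention: 0 * log(0 / q) = 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_tilde[mask])))
```

Since both the empirical p and the smoothed q are valid probability distributions, the estimate is always nonnegative; the smoothing keeps it finite even when some symbols never appear in the second sample.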

Original language: English (US)
Title of host publication: Proceedings - ISIT 2016; 2016 IEEE International Symposium on Information Theory
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
ISBN (Electronic): 9781509018062
State: Published - Aug 10 2016
Event: 2016 IEEE International Symposium on Information Theory, ISIT 2016 - Barcelona, Spain
Duration: Jul 10 2016 - Jul 15 2016


Other: 2016 IEEE International Symposium on Information Theory, ISIT 2016

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics

