Abstract
Universal hypothesis testing (UHT) refers to the problem of deciding whether samples come from a nominal distribution or from an unknown distribution that differs from the nominal one. Hoeffding's test, whose test statistic is equivalent to the empirical Kullback-Leibler (KL) divergence, is known to be asymptotically optimal for distributions defined on finite alphabets. With continuous observations, however, the discontinuity of the KL divergence in the distribution functions results in significant complications for UHT. This paper introduces a robust version of the classical KL divergence, defined as the KL divergence from a distribution to the Lévy ball of a known distribution. This robust KL divergence is shown to be continuous in the underlying distribution function with respect to weak convergence. The continuity property enables the development of an asymptotically optimal test for the universal hypothesis testing problem with continuous observations. The optimality is in the same sense as that of Hoeffding's test and stronger than that of Zeitouni and Gutman. Perhaps more importantly, the developed test statistic can be computed through convex programs, making it much more practical. Numerical experiments are also conducted to evaluate its performance compared with a recently proposed kernel-based goodness-of-fit test.
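For orientation, a hedged sketch of the two central quantities named in the abstract is given below: Hoeffding's test statistic, i.e., the empirical KL divergence on a finite alphabet, and one natural formalization of the robust KL divergence as an infimum over a Lévy ball. The notation (the empirical distribution \(\hat{P}_n\), the nominal \(P_0\), and the radius symbol \(\delta\)) is illustrative and not taken from the paper itself.

```latex
% Hoeffding's test statistic on a finite alphabet: the empirical KL divergence
% between the empirical distribution \hat{P}_n of n samples and the nominal P_0.
T_n \;=\; D(\hat{P}_n \,\|\, P_0)
    \;=\; \sum_{x} \hat{P}_n(x) \,\log \frac{\hat{P}_n(x)}{P_0(x)}

% One reading of the robust KL divergence described in the abstract: the KL
% divergence from \mu to the Lévy ball B_\delta(\nu) of radius \delta (symbol
% chosen here for illustration) around the known distribution \nu.
D_{B_\delta}(\mu \,\|\, \nu) \;=\; \inf_{\nu' \in B_\delta(\nu)} D(\mu \,\|\, \nu')
```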
| Original language | English (US) |
| --- | --- |
| Article number | 8528471 |
| Pages (from-to) | 2360-2373 |
| Number of pages | 14 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 65 |
| Issue number | 4 |
| DOIs | |
| State | Published - Apr 2019 |
Keywords
- Kullback-Leibler divergence
- Lévy metric
- universal hypothesis testing
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences