Ideal regularization for learning kernels from labels

Binbin Pan, Jianhuang Lai, Lixin Shen

Research output: Contribution to journal › Article

13 Scopus citations

Abstract

In this paper, we propose a new form of regularization that is able to utilize the label information of a data set for learning kernels. The proposed regularization, referred to as ideal regularization, is a linear function of the kernel matrix to be learned. The ideal regularization allows us to develop efficient algorithms to exploit labels. Three applications of the ideal regularization are considered. First, we use the ideal regularization to incorporate the labels into a standard kernel, making the resulting kernel more appropriate for learning tasks. Second, we employ the ideal regularization to learn a data-dependent kernel matrix from an initial kernel matrix (which contains prior similarity information, geometric structures, and labels of the data). Finally, we incorporate the ideal regularization into some state-of-the-art kernel learning problems. With this regularization, these learning problems can be reformulated as simpler ones that permit more efficient solvers. Empirical results show that the ideal regularization exploits the labels effectively and efficiently.
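The central objects in the abstract are the ideal kernel built from labels and a regularization term that is linear in the kernel matrix. A minimal sketch of these two pieces is given below; the function names are illustrative, the ideal kernel uses the standard same-label construction, and the regularizer is written as the matrix inner product trace(KT), which is one common way to make a label term linear in K. The paper's actual objective (which also involves the Von Neumann divergence) is not reproduced here.

```python
import numpy as np

def ideal_kernel(labels):
    """Ideal kernel: T[i, j] = 1 if samples i and j share a label, else 0.
    This is the standard construction; the paper's variant may differ in detail."""
    labels = np.asarray(labels)
    return (labels[:, None] == labels[None, :]).astype(float)

def ideal_regularizer(K, T):
    """Regularization term linear in K: the inner product <K, T> = trace(K T^T).
    Larger values indicate that K agrees more with the label-induced similarity."""
    return np.trace(K @ T.T)

labels = [0, 0, 1, 1]
T = ideal_kernel(labels)      # 4x4 block matrix of ones within each class
K = np.eye(4)                 # a toy candidate kernel matrix
print(ideal_regularizer(K, T))
```

Because the term is linear in K, adding it to a convex kernel-learning objective does not change the problem class, which is consistent with the abstract's claim that the regularization permits more efficient solvers.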

Original language: English (US)
Pages (from-to): 22-34
Number of pages: 13
Journal: Neural Networks
Volume: 56
State: Published - Aug 2014

Keywords

  • Ideal kernel
  • Kernel methods
  • Labels
  • Regularization
  • Semi-supervised learning
  • Von Neumann divergence

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence

