TY - JOUR
T1 - Ideal regularization for learning kernels from labels
AU - Pan, Binbin
AU - Lai, Jianhuang
AU - Shen, Lixin
N1 - Funding Information:
This project was supported by NSFC grants (61173084, 61128009, 61272252), US National Science Foundation grants (DMS-0712827, DMS-1115523), the Guangdong Provincial Government of China (grant no. 2010.189) through the “Computational Science Innovative Research Team”, and the Natural Science Foundation of SZU (grant no. 00035693).
PY - 2014/8
Y1 - 2014/8
AB - In this paper, we propose a new form of regularization that can utilize the label information of a data set for learning kernels. The proposed regularization, referred to as ideal regularization, is a linear function of the kernel matrix to be learned. The ideal regularization allows us to develop efficient algorithms to exploit labels. Three applications of the ideal regularization are considered. First, we use the ideal regularization to incorporate the labels into a standard kernel, making the resulting kernel more appropriate for learning tasks. Next, we employ the ideal regularization to learn a data-dependent kernel matrix from an initial kernel matrix (which contains prior similarity information, geometric structures, and labels of the data). Finally, we incorporate the ideal regularization into some state-of-the-art kernel learning problems. With this regularization, these learning problems can be formulated as simpler ones that permit more efficient solvers. Empirical results show that the ideal regularization exploits the labels effectively and efficiently.
KW - Ideal kernel
KW - Kernel methods
KW - Labels
KW - Regularization
KW - Semi-supervised learning
KW - Von Neumann divergence
UR - http://www.scopus.com/inward/record.url?scp=84900001495&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84900001495&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2014.04.003
DO - 10.1016/j.neunet.2014.04.003
M3 - Article
C2 - 24824969
AN - SCOPUS:84900001495
SN - 0893-6080
VL - 56
SP - 22
EP - 34
JO - Neural Networks
JF - Neural Networks
ER -