Access Restriction Open

Author Qi, Jinshan ♦ Liang, Xun ♦ Xu, Rui
Editor Tanaka, Toshihisa
Source Hindawi
Content type Text
Publisher Hindawi
File Format PDF
Copyright Year ©2018
Language English
Abstract By utilizing kernel functions, support vector machines (SVMs) successfully solve linearly inseparable problems, and their range of applications has consequently been greatly extended. Using multiple kernels (MKs) to improve SVM classification accuracy has been a hot topic in the SVM research community for several years. However, most MK learning (MKL) methods impose an L1-norm constraint on the kernel combination weights, which yields a sparse yet nonsmooth solution for the kernel weights. Alternatively, an Lp-norm constraint on the kernel weights keeps all information in the base kernels; nonetheless, the solution of Lp-norm constrained MKL is nonsparse and sensitive to noise. Recently, some scholars presented an efficient sparse generalized MKL (L1- and L2-norms based GMKL) method, in which the combination of the L1- and L2-norms establishes an elastic constraint on the kernel weights. In this paper, we further extend GMKL to a more generalized MKL method by combining the L1- and Lp-norms. Consequently, the L1- and L2-norms based GMKL is a special case of our method when p = 2. Experiments demonstrate that our L1- and Lp-norms based MKL achieves higher classification accuracy than the L1- and L2-norms based GMKL, while retaining the properties of that method.
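
For orientation, a minimal sketch of the mixed-norm constraint the abstract describes, assuming the standard MKL setup in which the combined kernel is a nonnegative weighted sum of M base kernels; the symbols d and \tau and the exact normalization are illustrative, and the paper's precise formulation may differ:

K(x, x') = \sum_{m=1}^{M} d_m K_m(x, x'), \qquad d_m \ge 0,

\text{subject to } (1 - \tau)\,\|d\|_1 + \tau\,\|d\|_p^p \le 1, \qquad \tau \in [0, 1], \; p \ge 1.

Here the L1 term encourages sparsity in the kernel weights d, the Lp term keeps the solution smooth and retains information from all base kernels, and \tau trades off the two; setting p = 2 recovers the elastic-style constraint of the L1- and L2-norms based GMKL.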
ISSN 1687-5265
Learning Resource Type Article
Publisher Date 2018-01-23
Rights License This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
e-ISSN 1687-5273
Journal Computational Intelligence and Neuroscience
Volume Number 2018
Page Count 7