系统工程理论与实践 (Systems Engineering - Theory & Practice), 2004
Least Square Generalized Support Vector Machines for Regression
Abstract:
Least square generalized support vector machines (LS-GSVMs) are applied to regression estimation. Compared with standard support vector machines (SVMs), the kernel functions of LS-GSVMs are subject to few or no restrictions. We formulate the quadratic programming (QP) problem for LS-GSVMs and solve it with a combination of gradient projection and successive overrelaxation (SOR) based on matrix splitting; this combined algorithm is used to train the LS-GSVMs. Because SOR handles one data point at a time, it can process very large datasets that need not reside in memory.
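To make the point-at-a-time character of SOR concrete, the sketch below solves a simplified, bias-free least squares SVM regression dual of the form (K + I/γ)α = y with an RBF kernel, sweeping over one training point per update. This is a minimal illustration, not the paper's LS-GSVM formulation: the generalized kernel, the bias term, and the gradient projection step are omitted, and the function names (`sor_solve`, `ls_svm_regression_fit`) and parameter choices (ω, γ, σ) are hypothetical.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix of an RBF kernel (one plausible choice; LS-GSVMs allow more general kernels).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def sor_solve(A, b, omega=1.3, tol=1e-8, max_iter=10_000):
    """Successive overrelaxation for A x = b (A symmetric positive definite, 0 < omega < 2).

    Each sweep updates one component, i.e. one training point, at a time,
    which is why SOR-style training can stream very large datasets.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Residual of equation i using the most recent values of all other components.
            residual = b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]
            x[i] = (1.0 - omega) * x[i] + omega * residual / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            break
    return x

def ls_svm_regression_fit(X, y, gamma=10.0, sigma=1.0):
    # Simplified dual system of a bias-free least squares SVM regressor:
    # (K + I/gamma) alpha = y, solved by SOR one point at a time.
    K = rbf_kernel(X, sigma)
    A = K + np.eye(len(y)) / gamma
    return sor_solve(A, y)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(80, 1))
    y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
    alpha = ls_svm_regression_fit(X, y)
    # In-sample prediction: f(x_j) = sum_i alpha_i k(x_i, x_j)
    K = rbf_kernel(X)
    print("train RMSE:", np.sqrt(np.mean((K @ alpha - y) ** 2)))
```

The same component-wise sweep works if rows of the kernel matrix are generated on the fly rather than stored, which is what allows the data to stay out of memory.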