The support vector classification-regression machine for K-class classification (K-SVCR) is a novel multi-class classification method based on the "1-versus-1-versus-rest" structure. In this paper, we propose a least squares version of K-SVCR named LSK-SVCR.
Similar to the K-SVCR algorithm, this method evaluates all the training data within a "1-versus-1-versus-rest" structure, so that the algorithm generates ternary outputs {-1, 0, +1}. In LSK-SVCR, the solution of the primal problem is obtained by solving only one system of linear equations, instead of solving the dual problem, which in K-SVCR is a convex quadratic programming problem.
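To illustrate the least-squares idea of replacing a quadratic program with a single linear solve, the sketch below trains a generic binary least-squares SVM (in the style of Suykens' LS-SVM) whose parameters come from one linear system. This is only a hedged illustration of the general principle, not the LSK-SVCR formulation itself; the linear kernel, the regularization parameter gamma, and the toy data are assumptions made for the example.

```python
# Minimal sketch: a binary LS-SVM whose training step is one linear solve,
# illustrating the "linear system instead of a QP" idea. This is NOT the
# LSK-SVCR model; kernel, gamma, and data are illustrative assumptions.
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """Solve the LS-SVM linear system for the bias b and multipliers alpha."""
    n = X.shape[0]
    K = X @ X.T                                # linear kernel matrix
    Omega = (y[:, None] * y[None, :]) * K      # label-weighted kernel
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma      # ridge-like regularization
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)              # one linear solve, no QP
    return sol[0], sol[1:]                     # b, alpha

def lssvm_predict(X_train, y_train, b, alpha, X_new):
    """Sign of the decision function sum_i alpha_i y_i K(x_i, x) + b."""
    return np.sign((alpha * y_train) @ (X_train @ X_new.T) + b)

# Toy usage on two separable Gaussian clouds (illustrative only).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_train(X, y, gamma=10.0)
print(lssvm_predict(X, y, b, alpha, np.array([[-2.0, -2.0], [2.0, 2.0]])))
```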
Experimental results on several benchmark, MC-NDC, and handwritten digit recognition data sets show that LSK-SVCR not only achieves better classification accuracy than the K-SVCR and Twin-KSVC algorithms but also has remarkably higher learning speed.