In binary classification, kernel-free quadratic surface support vector machines have been proposed to avoid difficulties such as selecting an appropriate kernel function or tuning its hyper-parameters. Furthermore, Universum data points, which belong to neither class, can be exploited to embed prior knowledge into the corresponding models and improve their generalization performance.
This paper designs novel kernel-free Universum quadratic surface support vector machine models. It further proposes an ℓ1-norm regularized version that is beneficial for detecting potential sparsity patterns in the Hessian of the quadratic surface and that reduces to the standard linear model when the data points are (almost) linearly separable.
The proposed models are convex, so standard numerical solvers can be applied to them. Moreover, a least squares version of the ℓ1-norm regularized model is proposed.
We also design an effective tailored algorithm that requires solving only one linear system. Several theoretical properties of these models are then stated and proved.
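To illustrate how a least squares kernel-free quadratic surface classifier can reduce to a single linear system, the following is a minimal sketch, not the paper's exact model or algorithm: it lifts the data to an explicit quadratic monomial feature map, then fits the surface by ridge-regularized least squares on the ±1 labels via one regularized normal-equation solve. The function names (`quad_features`, `fit_ls_qssvm`, `predict`) and the plain ridge penalty (in place of the paper's ℓ1-norm regularizer and Universum terms) are illustrative assumptions.

```python
import numpy as np

def quad_features(X):
    # Explicit quadratic map: all monomials x_i * x_j (i <= j) plus the
    # linear terms, so the decision function is a general quadratic surface.
    n, d = X.shape
    iu = np.triu_indices(d)
    quad = (X[:, :, None] * X[:, None, :])[:, iu[0], iu[1]]
    return np.hstack([quad, X])

def fit_ls_qssvm(X, y, lam=1e-2):
    # Least squares fit of the quadratic surface to +/-1 labels:
    # solving one (regularized) linear system yields all coefficients.
    # NOTE: ridge penalty used here for simplicity; the paper's model
    # uses an l1-norm regularizer, which needs a different solver.
    Phi = np.hstack([quad_features(X), np.ones((len(X), 1))])  # add bias column
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict(w, X):
    # Classify by the sign of the fitted quadratic surface.
    Phi = np.hstack([quad_features(X), np.ones((len(X), 1))])
    return np.sign(Phi @ w)
```

On a toy problem whose classes are separated by a circle (inside vs. outside the unit disk), a single call to `fit_ls_qssvm` recovers a separating quadratic surface with no kernel or kernel hyper-parameter involved.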
The numerical results show that the least squares version of the proposed model achieves the highest mean accuracy scores, with promising computational efficiency, on several artificial and public benchmark data sets. Statistical tests are conducted to demonstrate the competitiveness of the proposed models.