Least Squares Support Vector Regression (LSSVR), a least squares version of Support Vector Regression (SVR), is defined with a regularized squared loss without epsilon-insensitiveness. LSSVR is formulated in the dual space as a linear equality constrained quadratic minimization, which reduces to the solution of a linear algebraic equation system. Since the number of Lagrange multipliers in this system is half that of classical SVR, LSSVR requires much less computation time than classical SVR. Despite this computationally attractive feature, it lacks the sparsity that SVR gains from epsilon-insensitiveness: every training input is treated as a support vector, which severely degrades generalization performance. To overcome these drawbacks, this paper derives the epsilon-insensitive LSSVR, with epsilon-insensitivity at quadratic loss, in which sparsity is controlled directly by the epsilon parameter. Since the quadratic loss is sensitive to outliers, a weighted version (epsilon-insensitive WLSSVR) has also been developed. Finally, the performances of epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are compared quantitatively and in detail with two methods commonly used in the literature, pruning-based LSSVR and weighted pruning-based LSSVR. Experimental results on simulated data and 8 different real-life data sets show that epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are superior in terms of computation time, generalization ability, and sparsity.
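The reduction of the LSSVR dual problem to a linear system mentioned above can be illustrated with a minimal sketch. This is the standard LSSVR formulation (solve [[0, 1ᵀ], [1, K + I/γ]]·[b; α] = [0; y] for the bias b and the Lagrange multipliers α), not the epsilon-insensitive variants proposed in this paper; the RBF kernel, the function names, and the parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvr_fit(X, y, gamma=100.0, sigma=0.5):
    # Standard LSSVR dual: one linear system instead of a QP.
    #   [ 0   1^T         ] [b    ]   [0]
    #   [ 1   K + I/gamma ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha (one per training point), bias b

def lssvr_predict(X_train, alpha, b, X_new, sigma=0.5):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b
```

Note that `alpha` has exactly one entry per training sample, which is the non-sparsity the abstract criticizes: with a squared loss and no epsilon tube, essentially every multiplier is nonzero and every training point acts as a support vector.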
Keywords: Least Squares Support Vector Regression, Pruning, Epsilon Insensitiveness, Robustness, Sparseness
Primary Language | English |
---|---|
Subjects | Clinical Chemistry |
Section | Research Articles |
Authors | |
Publication Date | April 30, 2024 |
Submission Date | May 6, 2022 |
Published Issue | Year 2024, Volume: 42, Issue: 2 |