
Feature vector regression with efficient hyperparameters tuning and geometric interpretation

ZIO, ENRICO
2016-01-01

Abstract

Machine learning methods employing positive kernels have been developed and widely used for classification, regression, prediction and unsupervised learning applications, whereby the estimate function takes the form of a weighted-sum kernel expansion. The computational burden on large datasets and the difficulty of tuning hyperparameters are the usual drawbacks of kernel methods. To reduce the computational burden, this paper presents a modified version of the Feature Vector Selection (FVS) method, proposing an approximation of the estimate function as a weighted sum of the predicted values of the Feature Vectors (FVs), where the weights are computed as the oblique projections of the new data points onto the FVs in the feature space. The approximation is then obtained by optimizing only the predicted values of the FVs. By defining a least-squares error optimization problem with equality constraints, analytic solutions for the predicted values of the FVs can be obtained. The proposed method is named Feature Vector Regression (FVR). The tuning of hyperparameters in FVR is also explained in the paper and shown to be less complicated than for other kernel methods. Comparisons with other popular kernel methods for regression on several public datasets show that FVR, using only a small subset of the training dataset (i.e. the selected FVs), gives prediction accuracy comparable with that of the best-performing methods. The main contribution of this paper is the new kernel method (i.e. FVR), capable of achieving satisfactory results with reduced effort, thanks to the small number of hyperparameters to be tuned and the reduced size of the training dataset used.
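The general idea described in the abstract can be illustrated with a short sketch: a new point is projected onto the span of the selected FVs in feature space (via the kernel trick), and the prediction is the projection-weighted sum of target values attached to the FVs, which are themselves fitted by least squares. This is an assumed, simplified reconstruction from the abstract alone (class name `SketchFVR`, the RBF kernel choice, and the ridge jitter are all illustrative assumptions), not the paper's exact algorithm:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of A and rows of B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

class SketchFVR:
    """Illustrative sketch of an FVR-style regressor (assumed details)."""

    def __init__(self, gamma=1.0, ridge=1e-8):
        self.gamma = gamma
        self.ridge = ridge  # small jitter for numerical stability

    def fit(self, X_fv, X_train, y_train):
        # X_fv: the selected feature vectors, a small subset of X_train.
        self.X_fv = X_fv
        K_ss = rbf_kernel(X_fv, X_fv, self.gamma)
        K_ss += self.ridge * np.eye(len(X_fv))
        self.K_ss_inv = np.linalg.inv(K_ss)
        # Projection weights of every training point onto the FVs
        # in feature space: W = K(X, FV) @ K(FV, FV)^{-1}.
        W = rbf_kernel(X_train, X_fv, self.gamma) @ self.K_ss_inv
        # Least-squares fit of the FV target values y_fv so that
        # W @ y_fv approximates y_train: only len(X_fv) unknowns.
        self.y_fv, *_ = np.linalg.lstsq(W, y_train, rcond=None)
        return self

    def predict(self, X):
        W = rbf_kernel(X, self.X_fv, self.gamma) @ self.K_ss_inv
        return W @ self.y_fv
```

The point of the sketch is the computational saving claimed in the abstract: the optimization involves only as many unknowns as there are FVs, rather than one coefficient per training point as in a full kernel expansion.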
Keywords: Computational complexity; Feature Vector Regression; Feature vector selection; Hyperparameters tuning; Kernel method; Prediction; Regression; Computer Science Applications; Computer Vision and Pattern Recognition; Cognitive Neuroscience; Artificial Intelligence

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1020667
Citations
  • Scopus: 9
  • Web of Science: 7