García-Pedrajas, Nicolás; García-Osorio, César; Fyfe, Colin: Nonlinear ``boosting'' projections for ensemble construction. Journal Article. In: Journal of Machine Learning Research, vol. 8, pp. 1–33, 2007, ISSN: 1532-4435.
@article{cgosorio07boosting,
title = {Nonlinear ``boosting'' projections for ensemble construction},
author = {Nicolás García-Pedrajas and César García-Osorio and Colin Fyfe},
url = {http://jmlr.csail.mit.edu/papers/volume8/garcia-pedrajas07a/garcia-pedrajas07a.pdf},
issn = {1532-4435},
year = {2007},
date = {2007-01-01},
journal = {Journal of Machine Learning Research},
volume = {8},
pages = {1--33},
abstract = {In this paper we propose a novel approach for ensemble construction based on the use of nonlinear
projections to achieve both accuracy and diversity of individual classifiers. The proposed approach
combines the philosophy of boosting, putting more effort on difficult instances, with the basis of
the random subspace method. Our main contribution is that instead of using a random subspace,
we construct a projection taking into account the instances which have posed most difficulties to
previous classifiers. In this way, consecutive nonlinear projections are created by a neural network
trained using only incorrectly classified instances. The feature subspace induced by the hidden layer
of this network is used as the input space to a new classifier. The method is compared with bagging
and boosting techniques, showing an improved performance on a large set of 44 problems from the
UCI Machine Learning Repository. An additional study showed that the proposed approach is less
sensitive to noise in the data than boosting methods.},
keywords = {Boosting, Classifier ensembles, Data Mining, Ensemble methods, Neural networks, Nonlinear projections},
pubstate = {published},
tppubtype = {article}
}
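The abstract sketches the method's loop: train a base classifier, collect the instances it misclassifies, fit a small neural network on only those instances, and use the network's hidden-layer activations as the input space for the next ensemble member. The following is a minimal NumPy sketch of that idea under stated assumptions; it is not the authors' implementation. The nearest-centroid base learner, the one-hidden-layer network trained by gradient descent, and all sizes and learning rates are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_projection(X, y, n_hidden=5, epochs=200, lr=0.1):
    """Fit a one-hidden-layer network on (X, y) and return the
    hidden-layer projection Z -> tanh(Z @ W1 + b1)."""
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
    t = y.reshape(-1, 1).astype(float)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)
        p = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))   # sigmoid output
        g = (p - t) / n                            # cross-entropy gradient
        gH = g @ W2.T * (1.0 - H ** 2)             # backprop through tanh
        W2 -= lr * H.T @ g;  b2 -= lr * g.sum(0)
        W1 -= lr * X.T @ gH; b1 -= lr * gH.sum(0)
    return lambda Z: np.tanh(Z @ W1 + b1)

def nearest_centroid_fit(X, y):
    """Toy base learner (illustrative stand-in for a real classifier)."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    return lambda Z: (np.linalg.norm(Z - c1, axis=1)
                      < np.linalg.norm(Z - c0, axis=1)).astype(int)

def boosting_projection_ensemble(X, y, n_members=3):
    # First member works in the original feature space.
    members = [(lambda Z: Z, nearest_centroid_fit(X, y))]
    for _ in range(n_members - 1):
        proj, clf = members[-1]
        wrong = clf(proj(X)) != y
        if not wrong.any():
            break
        # Key step from the paper: the new nonlinear projection is
        # trained ONLY on the instances the last member misclassified.
        proj = train_projection(X[wrong], y[wrong])
        members.append((proj, nearest_centroid_fit(proj(X), y)))
    def predict(Z):
        votes = np.stack([clf(p(Z)) for p, clf in members])
        return (votes.mean(0) >= 0.5).astype(int)   # majority vote
    return predict

# Toy XOR-like problem that a single linear centroid rule cannot solve.
X = rng.normal(size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(int)
predict = boosting_projection_ensemble(X, y, n_members=5)
acc = (predict(X) == y).mean()
```

Each member thus sees a feature space shaped by its predecessors' errors, which is what distinguishes the approach from the purely random subspaces it builds on.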