Large-scale attribute selection using wrappers

Martin Gütlein, Eibe Frank, Mark A. Hall, Andreas Karwath

Research output: Contribution to conference (unpublished) › Paper › peer-review

182 Citations (Scopus)


Scheme-specific attribute selection with the wrapper and variants of forward selection is a popular attribute selection technique for classification that yields good results. However, it can run the risk of overfitting because of the extent of the search and the extensive use of internal cross-validation. Moreover, although wrapper evaluators tend to achieve superior accuracy compared to filters, they incur a high computational cost. The problems of overfitting and high runtime occur in particular on high-dimensional datasets, such as microarray data. We investigate Linear Forward Selection, a technique that reduces the number of attribute expansions in each forward selection step. Our experiments demonstrate that this approach is faster, finds smaller subsets and can even increase accuracy compared to standard forward selection. We also investigate a variant that applies explicit subset size determination in forward selection to combat overfitting, where the search is forced to stop at a precomputed "optimal" subset size. We show that this technique reduces subset size while maintaining comparable accuracy.
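The core idea of Linear Forward Selection can be sketched as follows: instead of evaluating every remaining attribute at each greedy step, only the top-k attributes (ranked by individual merit) are considered for expansion. This is a minimal illustrative sketch, not the authors' implementation; the `evaluate` function is a hypothetical stand-in for the wrapper's cross-validated accuracy estimate, and the toy example below uses a made-up scoring function.

```python
def linear_forward_selection(attributes, evaluate, k=5):
    """Greedy forward selection with a limited number of expansions.

    At each step, only the k best unselected attributes (ranked once by
    their individual merit) are considered for addition -- the key idea
    behind Linear Forward Selection. `evaluate(subset)` is assumed to
    return a score, e.g. cross-validated accuracy of a wrapped classifier.
    """
    # Rank attributes once by their individual (single-attribute) merit.
    ranked = sorted(attributes, key=lambda a: evaluate([a]), reverse=True)
    selected, best_score = [], float("-inf")
    while True:
        # Restrict candidate expansions to the k best unselected attributes.
        candidates = [a for a in ranked if a not in selected][:k]
        best_cand, best_cand_score = None, best_score
        for a in candidates:
            score = evaluate(selected + [a])
            if score > best_cand_score:
                best_cand, best_cand_score = a, score
        if best_cand is None:
            # No candidate improves on the current subset: stop.
            break
        selected.append(best_cand)
        best_score = best_cand_score
    return selected, best_score


# Toy usage: attributes 0 and 2 are "relevant"; each extra attribute
# carries a small penalty, mimicking a preference for small subsets.
def toy_evaluate(subset):
    relevant = {0, 2}
    return len(relevant & set(subset)) - 0.1 * len(subset)


subset, score = linear_forward_selection(range(5), toy_evaluate, k=2)
print(subset, score)  # selects the two relevant attributes: [0, 2]
```

The paper's second variant, explicit subset size determination, would instead precompute an "optimal" subset size and force the loop above to stop once `len(selected)` reaches it, rather than stopping when no candidate improves the score.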
Original language: English
Publication status: Published - 2009


  • cross-validation, machine learning


