



3.1 GBML for Sub-problems of Learning

This section briefly reviews ways in which evolution has been used not for the primary task of learning (generating hypotheses) but for sub-problems of learning, including data preparation and optimisation within other learning methods.


Evolutionary Feature Selection

Some attributes (features) of the input are of little or no use in classification. We can simplify and speed up learning by selecting only the useful attributes to work with, which is especially valuable when there are very many attributes and many of them contribute little. EAs are widely used in the wrapper approach to feature selection [143], in which the base learner (the one which generates hypotheses) is treated as a black box to be optimised by a search algorithm. In this approach EAs usually give good results compared to non-evolutionary methods [139,250,166], though there are exceptions [139]. In [63], Estimation of Distribution Algorithms were found to give similar accuracy to a GA but to run more slowly. More generally, we can weight features (instead of making an all-or-nothing selection), and some learners can use such weights directly, e.g. weighted k-nearest neighbours [234]. The main drawback of EAs for feature selection is their slowness compared to non-evolutionary methods. See [201,95,96] for overviews and [266,12] for some recent real-world applications.
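As a concrete illustration, the following is a minimal sketch of a GA wrapper for feature selection. It assumes scikit-learn is available; the dataset, the base learner (a decision tree) and the GA parameters (population size 20, 15 generations, truncation selection, bitstring encoding) are all illustrative choices, not those of any particular study cited above.

```python
import random
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)  # placeholder dataset
N_FEATURES = X.shape[1]

def fitness(mask):
    # Wrapper fitness: cross-validated accuracy of the base learner
    # trained only on the features selected by the bitstring mask.
    cols = [i for i, bit in enumerate(mask) if bit]
    if not cols:
        return 0.0
    return cross_val_score(DecisionTreeClassifier(), X[:, cols], y, cv=3).mean()

def mutate(mask, rate=0.05):
    # Flip each bit independently with a small probability.
    return [1 - b if random.random() < rate else b for b in mask]

def crossover(a, b):
    # One-point crossover of two parent masks.
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

# Simple generational GA over bitstring feature masks.
pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(20)]
for gen in range(15):
    parents = sorted(pop, key=fitness, reverse=True)[:10]  # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(10)]

best = max(pop, key=fitness)
print("selected features:", [i for i, b in enumerate(best) if b])
```

Note that every fitness evaluation retrains the base learner under cross-validation; this is precisely the source of the slowness mentioned above.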


Evolutionary Feature Construction

Some features are not very useful by themselves but become useful when combined with others. We can leave the base learner to discover such combinations itself, or we can preprocess the data to construct informative new features from existing ones, e.g. a new feature $f_{\text{new}} = f_1$ AND $f_3$ AND $f_8$. This is also called constructive induction, and several approaches have been taken. GP has been used to construct features out of the original attributes, e.g. [132,164,253]. The original features have also been linearly transformed by evolving a vector of coefficients [145,232]. Simultaneous feature transformation and selection has had good results [234].
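To make the idea concrete, here is a minimal sketch of evolutionary feature construction on toy boolean data. For brevity it evolves fixed-shape binary combinations (one operator over two original features) rather than the full expression trees a GP system such as those in [132,164,253] would use; the data and all parameters are synthetic and illustrative.

```python
import random
import numpy as np

# Toy binary data: the class is f1 AND f3, so neither feature
# is very predictive alone but their conjunction is.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, 8))
y = X[:, 1] & X[:, 3]

OPS = {"AND": np.logical_and, "OR": np.logical_or}

def random_expr():
    # A candidate feature is (op, i, j): a binary combination
    # of two of the original attributes.
    return (random.choice(list(OPS)), random.randrange(8), random.randrange(8))

def evaluate(expr):
    op, i, j = expr
    return OPS[op](X[:, i], X[:, j]).astype(int)

def fitness(expr):
    # Score a constructed feature by its agreement with the class label.
    return (evaluate(expr) == y).mean()

def mutate(expr):
    # Change the operator or one of the two operands at random.
    op, i, j = expr
    choice = random.randrange(3)
    if choice == 0:
        return (random.choice(list(OPS)), i, j)
    if choice == 1:
        return (op, random.randrange(8), j)
    return (op, i, random.randrange(8))

# (mu + lambda)-style evolution over candidate expressions.
pop = [random_expr() for _ in range(20)]
for _ in range(30):
    pop = sorted(pop, key=fitness, reverse=True)[:10]
    pop += [mutate(random.choice(pop)) for _ in range(10)]

best = max(pop, key=fitness)
print("best constructed feature:", best, "accuracy:", fitness(best))
```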

Other Sub-problems of Learning

EAs have been used in a variety of other ways. One is training set optimisation, in which we can partition the data into training sets [238], select the most useful training inputs [137], and even generate synthetic inputs [322,70]. EAs have also been used for optimisation within a learner: [145] optimised weighted k-nearest neighbours with a GA, [62] optimised decision tree tests using a GA and an Evolution Strategy, and [272,273] optimised voting weights in an ensemble. [141] replaced the beam search in AQ with a genetic algorithm, and similarly [269,83,81,82,270] have investigated Inductive Logic Programming driven by a GA.
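As an illustration of the last of these, the following sketch evolves voting weights for a toy ensemble using a (1+1) Evolution Strategy; a GA as in [272,273] could be substituted. The base classifiers' predictions are simulated, and the parameters (mutation step size, iteration count) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: three base classifiers' 0/1 predictions on a validation set,
# simulated to be correct with probability 0.7, 0.6 and 0.55 respectively.
y_val = rng.integers(0, 2, size=200)
preds = np.stack([
    np.where(rng.random(200) < 0.70, y_val, 1 - y_val),
    np.where(rng.random(200) < 0.60, y_val, 1 - y_val),
    np.where(rng.random(200) < 0.55, y_val, 1 - y_val),
])

def accuracy(weights):
    # Weighted vote: predict 1 where the weighted fraction of
    # votes for class 1 exceeds one half.
    vote = weights @ preds / weights.sum()
    return ((vote > 0.5).astype(int) == y_val).mean()

# (1+1) Evolution Strategy over the weight vector: mutate with
# Gaussian noise and keep the child if it does not score worse.
w = np.ones(3)
for _ in range(200):
    child = np.clip(w + rng.normal(0, 0.1, size=3), 0.01, None)
    if accuracy(child) >= accuracy(w):
        w = child

print("evolved weights:", np.round(w / w.sum(), 3), "accuracy:", accuracy(w))
```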

