

3.6 Genetic Fuzzy Systems

Following the section on LCS, this section covers a second actively developing approach to evolving rule-based systems. We will see that the two areas overlap considerably and that the distinction between them is somewhat arbitrary. Nonetheless, the two communities and their literatures remain somewhat disjoint.

Fuzzy Logic is a major paradigm in soft computing which provides a means of approximate reasoning not found in traditional crisp logic. Genetic Fuzzy Systems (GFS) apply evolution to fuzzy learning systems in various ways: GAs, GP and Evolution Strategies have all been used. We will cover a particular form of GFS called genetic Fuzzy Rule-Based Systems (FRBS), which are also known as Learning Fuzzy Classifier Systems (LFCS) [25] or referred to as e.g. ``genetic learning of fuzzy rules'' and (for Reinforcement Learning tasks) ``fuzzy Q-learning''. Like other LCS, FRBS evolve if-then rules but in FRBS the rules are fuzzy. Most systems are Pittsburgh but there are many Michigan examples [283,284,104,25,218,65,223]. In addition to FRBS we briefly cover genetic fuzzy NNs but we do not cover genetic fuzzy clustering (see [74]).

In the terminology of fuzzy logic, ordinary scalar values are called crisp values. Each linguistic term (e.g. cold) denotes a fuzzy set, and its membership function defines the degree to which a crisp value belongs to that set. The following figure shows membership functions for the terms $\{$cold, warm, hot$\}$.

[Figure: membership functions for the terms cold, warm and hot]

Each crisp value matches each term to some degree in the interval [0,1]; for example, the membership functions might assign 5$^\circ$ a degree of 0.8 cold, 0.3 warm and 0.0 hot. Computing the degree of membership in each term is called fuzzification and can be considered a form of discretisation. Conversely, defuzzification computes a crisp value from fuzzy values.
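To make fuzzification and defuzzification concrete, the following minimal Python sketch uses triangular membership functions for the temperature example; the breakpoints, the centroid-style defuzzification and the function names (triangular, fuzzify, defuzzify) are illustrative choices rather than part of any particular GFS.

def triangular(a, b, c):
    """Return a triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# One membership function per linguistic term of 'temperature' (degrees Celsius);
# the breakpoints are invented for this sketch.
temperature_terms = {
    "cold": triangular(-10.0, 0.0, 10.0),
    "warm": triangular(5.0, 15.0, 25.0),
    "hot":  triangular(20.0, 30.0, 40.0),
}

def fuzzify(x, terms):
    """Fuzzification: map a crisp value to a degree in [0,1] for each term."""
    return {name: mu(x) for name, mu in terms.items()}

def defuzzify(memberships, centres):
    """Centroid-style defuzzification: membership-weighted average of term centres."""
    total = sum(memberships.values())
    return sum(memberships[t] * centres[t] for t in memberships) / total if total else 0.0

memberships = fuzzify(5.0, temperature_terms)
print(memberships)                                                        # {'cold': 0.5, 'warm': 0.0, 'hot': 0.0}
print(defuzzify(memberships, {"cold": 0.0, "warm": 15.0, "hot": 30.0}))   # 0.0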

Fuzzy rules are condition/action (IF-THEN) rules composed of a set of linguistic variables (e.g. temperature, humidity) which can each take on linguistic terms (e.g. cold, warm, hot). For example:


  IF temperature IS warm AND humidity IS low THEN heater IS medium

As illustrated in figure 12 (adapted from [123]), a fuzzy rule-based system consists of:

  1. a knowledge base (KB), comprising a data base (DB) of linguistic terms and their membership functions and a rule base (RB) of fuzzy rules
  2. a fuzzification interface, which maps crisp inputs to fuzzy values
  3. an inference system, which applies the fuzzy rules to the fuzzified inputs
  4. a defuzzification interface, which maps the fuzzy outputs back to crisp values

[Figure 12: Components and information flow in a fuzzy rule-based system]
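The following sketch traces one pass through this information flow for the example rule above, assuming the minimum for AND and a firing-strength-weighted average of output-term centres for defuzzification; the membership degrees and centres are invented for illustration.

# Fuzzified inputs: crisp sensor readings already mapped to degrees in [0,1].
fuzzified = {
    ("temperature", "warm"): 0.7,
    ("humidity", "low"): 0.4,
}

# Rule base: (antecedent terms, consequent term) pairs; a single rule here.
rules = [
    ([("temperature", "warm"), ("humidity", "low")], ("heater", "medium")),
]

# Representative crisp centre of each output term (part of the DB).
centres = {("heater", "low"): 0.2, ("heater", "medium"): 0.5, ("heater", "high"): 0.9}

# Inference: a rule fires to the degree of its weakest antecedent (min for AND).
firing = [(min(fuzzified[a] for a in ants), cons) for ants, cons in rules]

# Defuzzification: firing-strength-weighted average of the output-term centres.
total = sum(w for w, _ in firing)
heater = sum(w * centres[c] for w, c in firing) / total if total else 0.0
print(heater)  # min(0.7, 0.4) = 0.4, so the output is the centre of 'medium': 0.5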

3.6.1 Evolution of FRBSs

We distinguish i) genetic tuning and ii) genetic learning of DB, RB or inference engine parameters.

Genetic Tuning

The concept behind genetic tuning is to start from a hand-crafted FRBS and then evolve the DB (linguistic terms and membership functions) to improve performance. In other words, the hand-crafted rule base is left unchanged and only its parameters are tuned. Specifically, we can adjust the shape of the membership functions, adjust parameterised expressions in the (adaptive) inference system and adapt the defuzzification method.
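As a rough illustration of genetic tuning, the sketch below encodes only the breakpoints of triangular membership functions in a real-valued chromosome and perturbs them with Gaussian mutation. The evaluate_frbs callback, which would run the fixed rule base with a candidate DB and return its error on training data, is an assumed placeholder and is not shown.

import random

def decode(chromosome, term_names):
    """Group the flat parameter vector into one (a, b, c) triple per linguistic term."""
    return {name: tuple(chromosome[3 * i:3 * i + 3]) for i, name in enumerate(term_names)}

def mutate(chromosome, sigma=0.5):
    """Gaussian perturbation of the membership-function breakpoints."""
    return [g + random.gauss(0.0, sigma) for g in chromosome]

def fitness(chromosome, evaluate_frbs):
    """Higher is better. evaluate_frbs is an assumed callback that runs the
    fixed, hand-crafted rule base with the decoded DB and returns its error."""
    return -evaluate_frbs(decode(chromosome, ["cold", "warm", "hot"]))

# Initial chromosome: breakpoints of the hand-crafted membership functions.
parent = [-10.0, 0.0, 10.0,  5.0, 15.0, 25.0,  20.0, 30.0, 40.0]
print(decode(mutate(parent), ["cold", "warm", "hot"]))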

Genetic Learning

The concept of genetic learning is to evolve the DB, RB or inference engine parameters. There are a number of approaches. In genetic rule learning we usually predefine the DB by hand and evolve the RB. In genetic rule selection we use the GA to remove irrelevant, redundant, incorrect or conflicting rules from an existing RB, a role similar to that of condensation in LCS (see §3.5.3). In genetic KB learning we learn both the DB and RB, either by learning the DB first and then the RB, or by iteratively learning a series of DBs and evaluating each one by learning an RB with it.
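A minimal sketch of genetic rule selection follows: a binary chromosome marks which candidate rules are kept, and fitness trades accuracy against rule-base size. The evaluate_rb callback and the 0.01 complexity weight are assumptions made purely for illustration.

import random

def random_selection(n_rules):
    """A binary chromosome: one bit per candidate rule (1 = keep the rule)."""
    return [random.randint(0, 1) for _ in range(n_rules)]

def selected_rules(bits, candidate_rules):
    return [rule for bit, rule in zip(bits, candidate_rules) if bit]

def fitness(bits, candidate_rules, evaluate_rb):
    """evaluate_rb is an assumed callback returning the accuracy of the reduced
    rule base; the penalty discourages keeping redundant or conflicting rules."""
    kept = selected_rules(bits, candidate_rules)
    return evaluate_rb(kept) - 0.01 * len(kept)

candidate_rules = [
    "IF temperature IS warm AND humidity IS low THEN heater IS medium",
    "IF temperature IS hot THEN heater IS low",
]
bits = random_selection(len(candidate_rules))
print(bits, selected_rules(bits, candidate_rules))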

It is also possible to learn several components simultaneously, which may produce better results, though the larger search space makes this slower and more difficult than adapting components independently. As examples, [209] learns the DB and RB simultaneously, while [129] simultaneously learns KB components and inference engine parameters.

Recently [242] claimed that all existing GFS have been applied to crisp data, and that with such data the benefits of GFS over other learning methods are limited to linguistic interpretability. However, GFS have the potential to outperform other methods on fuzzy data, and the authors identify three such cases ([242] p. 558):

  1. crisp data with hand-added fuzziness
  2. transformations of data based on semantic interpretations of fuzzy sets
  3. inherently fuzzy data

They argue GFS should use fuzzy fitness functions in such cases to deal directly with the uncertainty in the data and propose such systems as a new class of GFS to add to the taxonomy of [123].

3.6.2 Genetic Neuro-fuzzy Systems

A Neuro-Fuzzy System (NFS) or Fuzzy Neural Network (FNN) is any combination of fuzzy logic and neural networks. Among the many examples of such systems, [188] uses a GA to minimise the error of the NN, [116] uses both a GA and backpropagation to minimise error, [229] optimises a fuzzy expert system using a GA and NN, and [209] uses a NN to approximate the fitness function for a GA which adapts membership functions and control rules. See [74] for an introduction to NFS, [189] for a review of EAs, NNs and fuzzy logic from the perspective of intelligent control, and [120] for a discussion of combining the three. [150] introduces Fuzzy All-permutations Rule-Bases (FARBs) which are mathematically equivalent to NNs.

3.6.3 Conclusions

Herrera ([123] p. 38) lists the following active areas within GFS:

  1. Multiobjective genetic learning of FRBSs: interpretability-precision trade-off
  2. GA-based techniques for mining fuzzy association rules and novel data mining approaches
  3. Learning genetic models based on low quality data (e.g. noisy data)
  4. Genetic learning of fuzzy partitions and context adaptation
  5. Genetic adaptation of inference engine components
  6. Revisiting the Michigan-style GFSs

Herrera also lists (p. 42) current issues for GFS:

  1. Human readability
  2. New data mining tasks: frequent and interesting pattern mining, mining data streams ...
  3. Dealing with high dimensional data

Reading

There is a substantial GFS literature. Notable works include the four seminal 1991 papers on genetic tuning of the DB [146], the Michigan approach [283], the Pittsburgh approach [274] and relational matrix-based FRBS [230]. Subsequent work includes Geyer-Schulz's 1997 book on Michigan fuzzy LCS learning RBs with GP [104], Bonarini's 2000 introductory chapter from an LCS perspective [25], Mitra and Hayashi's 2000 survey of neuro-fuzzy rule generation methods [208], Cordon et al.'s 2001 book on Genetic Fuzzy Systems in general [74], Angelov's 2002 book on evolving FRBS [8], chapter 10 of Freitas' 2002 book on evolutionary data mining [95], Herrera's 2008 survey article on GFS [123] (which lists further key reading), and finally Kolman and Margaliot's 2009 book on the neuro-fuzzy FARB approach [150].


T Kovacs 2011-03-12