A unification of Genetic Algorithms, Neural Networks and Fuzzy Logic: The GANNFL Approach

  • Martin Schmidt
Poster Presentations 1 Theory II: Learning
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1112)


The GANNFL Approach uses a steady-state Genetic Algorithm (GA) to build and train hybrid classifiers that combine Neural Networks (NNs) and Fuzzy Logic (FL). This novel approach finds the architecture, the types of the hidden units, the types of the output units, and all weights! The modular, tight GA encoding, together with the GA fitness function, lets the GA develop high-performance hybrid classifiers in which NN parts and FL parts co-operate tightly within the same architecture. By analysing the behaviour of the GA, it is investigated whether there is evidence for preferring NN classifiers over FL classifiers. Further, the importance of the types of the hidden units is investigated. Parameter reduction is a very important issue according to the theory of the VC-dimension and Ockham's Razor. Hence, the GANNFL Approach also focuses on parameter reduction, which is achieved by automatically pruning unnecessary weights and units. The GANNFL Approach was tested on 5 well-known classification problems: the artificial, noisy monks3 problem and 4 difficult real-world problems that contain missing, noisy, misclassified, and scarce data: the cancer, card, diabetes and glass problems. The results are compared to publicly available results found by other NN approaches, the sGANN approach (a simple GA to train NNs), and the ssGAFL approach (a steady-state GA to train FL classifiers). In every case the GANNFL Approach found a result better than or comparable to that of the best other approach!
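The components described above — a steady-state GA whose genome encodes, per hidden unit, a type flag (NN vs. FL unit), the weights, and an active flag used for pruning, with a fitness that rewards accuracy while penalising parameter count — can be sketched roughly as follows. This is a minimal illustration on a toy XOR dataset, not the paper's actual encoding: the Gaussian membership function, the penalty coefficient, the mutation rates and all GA parameters here are assumptions for the sketch.

```python
import math
import random

random.seed(0)

# Toy 2-class XOR dataset; stands in for the benchmark problems (monks3, cancer, ...).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_IN, N_HIDDEN = 2, 4

def new_genome():
    # Modular encoding: one gene block per hidden unit, holding a type flag
    # (0 = sigmoid NN unit, 1 = fuzzy membership unit), an "active" flag
    # used for pruning, the input weights, and one output weight.
    return [{"kind": random.randint(0, 1), "active": 1,
             "w": [random.uniform(-2, 2) for _ in range(N_IN)],
             "out": random.uniform(-2, 2)} for _ in range(N_HIDDEN)]

def unit_act(u, x):
    s = sum(w * xi for w, xi in zip(u["w"], x))
    if u["kind"] == 0:
        return 1.0 / (1.0 + math.exp(-s))    # NN part: sigmoid unit
    return math.exp(-s * s)                  # FL part: Gaussian membership (assumed)

def predict(g, x):
    return 1 if sum(u["out"] * unit_act(u, x) for u in g if u["active"]) > 0 else 0

def fitness(g):
    acc = sum(predict(g, x) == y for x, y in DATA) / len(DATA)
    n_params = sum(1 + N_IN for u in g if u["active"])
    return acc - 0.001 * n_params            # small penalty -> parameter reduction

def copy_genome(g):
    return [dict(u, w=list(u["w"])) for u in g]

def mutate(g):
    g = copy_genome(g)
    u = random.choice(g)
    r = random.random()
    if r < 0.1:
        u["active"] ^= 1                     # prune or revive a whole unit
    elif r < 0.2:
        u["kind"] ^= 1                       # flip unit type NN <-> FL
    else:
        u["w"][random.randrange(N_IN)] += random.gauss(0, 0.5)
        u["out"] += random.gauss(0, 0.5)
    return g

def crossover(a, b):
    # Unit-wise (modular) recombination: each child unit comes whole from a parent.
    return copy_genome([random.choice((a[i], b[i])) for i in range(N_HIDDEN)])

def evolve(steps=3000, pop_size=30):
    pop = [new_genome() for _ in range(pop_size)]
    for _ in range(steps):
        # Steady-state GA: one offspring at a time replaces the current worst
        # individual (if better), instead of rebuilding a whole generation.
        p1 = max(random.sample(pop, 2), key=fitness)
        p2 = max(random.sample(pop, 2), key=fitness)
        child = mutate(crossover(p1, p2))
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child
    return max(pop, key=fitness)

best = evolve()
acc = sum(predict(best, x) == y for x, y in DATA) / len(DATA)
print("training accuracy:", acc)
```

The steady-state replacement scheme and the per-unit type flag are the essential points: the GA can mix sigmoid and fuzzy units in one architecture and silently prune units via the active flag, which the fitness penalty encourages.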


Keywords: Genetic Algorithm; Fuzzy Logic; Neural Networks; uncertain real-world data




References

  1. L. Prechelt, “Proben1 — A Set of Neural Network Benchmark Problems and Benchmarking Rules”, TR 21/94 (1994), anonymous ftp: /pub/papers/techreports/1994/
  2. Martin Schmidt and Thomas Stidsen, “Using GA to train NN using sharing and pruning”, SCAI'95, anon. ftp: pub/empl/marsch/ on ftp.daimi.aau.dk
  3. Martin Schmidt and Thomas Stidsen, “GA to train NN's using sharing and pruning: Global GA search combined with local BP search”, ADT'95, anon. ftp: pub/empl/marsch/; also in John Wiley's book “Neural Networks and their Applications” (chapter 8)
  4. Martin Schmidt, “Genetic Algorithm to train Fuzzy Logic Rulebases and Neural Networks”, FLAMOC'96, anon. ftp: pub/empl/marsch/ on ftp.daimi.aau.dk
  5. Wnek, Sarma, Wahab and Michalski, “Comparison of learning paradigms via diagrammatic visualization”, TR (1990), George Mason University

Copyright information

© Springer-Verlag Berlin Heidelberg 1996

Authors and Affiliations

  • Martin Schmidt
  1. University of Aarhus, Denmark
