Genetic algorithms in constructive induction

  • Jacek Jelonek
  • Maciej Komosiński
Communications 8B: Evolutionary Computation
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1609)


In this paper, genetic algorithms are applied to a machine learning classification task. They act as a constructive induction engine that selects features and adjusts attribute weights in order to obtain the highest classification accuracy. We compare two classification approaches: a single 1-NN classifier and an n²-classifier (a pairwise meta-classifier). For the n²-classifier, classification accuracy is improved by independently modifying the descriptions of examples for each pair of the n classes, which yields (n² − n)/2 attribute spaces, each dedicated to discriminating one pair of classes. The computational experiment was performed on a medical data set. Its results reveal the utility of genetic algorithms for feature selection and weight adjustment, and point out the advantage of combining the multi-classifier model (the n²-classifier) with constructive induction over the analogous single-classifier approach.
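The pairwise scheme summarized above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: each pair of classes gets its own attribute-weight vector, tuned by a small genetic algorithm whose fitness is the leave-one-out accuracy of a weighted 1-NN classifier on that pair's examples, and final predictions are made by majority vote over the (n² − n)/2 pairwise classifiers. All names and GA settings (population size, operators, number of generations) are illustrative assumptions.

```python
# Hypothetical sketch of an n^2-classifier with GA-tuned attribute weights.
# Examples are (attribute_vector, class_label) pairs.
from itertools import combinations
import random

def nn1(train, weights, x):
    """1-NN with a weighted squared-Euclidean distance."""
    return min(train, key=lambda ex: sum(
        w * (a - b) ** 2 for w, a, b in zip(weights, ex[0], x)))[1]

def loo_accuracy(train, weights):
    """Leave-one-out accuracy of weighted 1-NN: the GA's fitness."""
    hits = sum(nn1(train[:i] + train[i + 1:], weights, x) == y
               for i, (x, y) in enumerate(train))
    return hits / len(train)

def evolve_weights(train, n_attrs, pop=10, gens=10, seed=0):
    """Toy GA over real-valued attribute weights: truncation selection,
    uniform crossover, Gaussian mutation (settings are illustrative)."""
    rng = random.Random(seed)
    population = [[rng.random() for _ in range(n_attrs)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda w: loo_accuracy(train, w), reverse=True)
        parents = population[: pop // 2]
        children = []
        while len(parents) + len(children) < pop:
            a, b = rng.sample(parents, 2)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            j = rng.randrange(n_attrs)                     # mutate one gene
            child[j] = max(0.0, child[j] + rng.gauss(0, 0.2))
            children.append(child)
        population = parents + children
    return max(population, key=lambda w: loo_accuracy(train, w))

class N2Classifier:
    """One weighted 1-NN base classifier per pair of classes;
    prediction by majority vote over all (n^2 - n)/2 pairs."""
    def fit(self, train):
        classes = sorted({y for _, y in train})
        n_attrs = len(train[0][0])
        self.pairs = {}
        for c1, c2 in combinations(classes, 2):
            subset = [(x, y) for x, y in train if y in (c1, c2)]
            self.pairs[(c1, c2)] = (subset, evolve_weights(subset, n_attrs))
        return self

    def predict(self, x):
        votes = {}
        for subset, weights in self.pairs.values():
            y = nn1(subset, weights, x)
            votes[y] = votes.get(y, 0) + 1
        return max(votes, key=votes.get)
```

Because each pairwise subproblem sees only two classes, the GA can settle on a different weighting (and, with zero weights, effectively a different feature subset) for each pair, which is the "independent modification of descriptions of examples" the abstract refers to.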


Keywords: evolutionary computation, machine learning, knowledge representation, feature selection, multi/meta-classification systems





Copyright information

© Springer-Verlag Berlin Heidelberg 1999

Authors and Affiliations

  • Jacek Jelonek¹
  • Maciej Komosiński¹

  1. Institute of Computing Science, Poznan University of Technology, Poznan, Poland
