Evolving Multiple Discretizations with Adaptive Intervals for a Pittsburgh Rule-Based Learning Classifier System

  • Jaume Bacardit
  • Josep Maria Garrell
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2724)


One way to solve classification problems with real-valued attributes using a Learning Classifier System is to apply a discretization algorithm, which enables traditional discrete knowledge representations to handle these problems. A good discretization should balance minimizing information loss against keeping a reasonable number of cut points. Choosing a single discretization that achieves this balance across several domains is not easy. This paper proposes a knowledge representation that uses several discretizations (both uniform and non-uniform ones) at the same time, choosing the correct method for each problem and attribute over the course of the iterations. In addition, the intervals proposed by each discretization can split and merge during the evolutionary process, reducing the search space where possible and expanding it where necessary. The knowledge representation is tested across several domains that represent a broad range of possibilities.
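The idea of seeding rule intervals from a discretization's cut points and then letting them split and merge can be illustrated with a minimal sketch. This is not the authors' implementation; the function and class names (`uniform_cut_points`, `AdaptiveIntervals`) are hypothetical, and a real system would also carry a per-interval predicate inside each rule and apply split/merge as genetic operators:

```python
def uniform_cut_points(low, high, n_bins):
    """Cut points of a uniform discretization of [low, high] into n_bins intervals."""
    width = (high - low) / n_bins
    return [low + width * i for i in range(1, n_bins)]

class AdaptiveIntervals:
    """Intervals for one attribute, seeded from a chosen discretization's
    cut points; intervals may later split or merge during evolution."""

    def __init__(self, low, high, cut_points):
        bounds = [low] + list(cut_points) + [high]
        # Each interval is a (start, end) pair.
        self.intervals = list(zip(bounds[:-1], bounds[1:]))

    def merge(self, i):
        """Merge interval i with its right neighbour (shrinks the search space)."""
        a, b = self.intervals[i], self.intervals[i + 1]
        self.intervals[i:i + 2] = [(a[0], b[1])]

    def split(self, i):
        """Split interval i at its midpoint (expands the search space)."""
        lo, hi = self.intervals[i]
        mid = (lo + hi) / 2
        self.intervals[i:i + 1] = [(lo, mid), (mid, hi)]
```

With several candidate discretizations (uniform with different bin counts, entropy-based, etc.), each attribute of each rule would pick one set of cut points, and the evolutionary process would then adapt the resulting intervals via `split` and `merge`.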


Keywords: Knowledge Representation, Discretization, Interval, Learning Classifier System, Rule Representation, Attribute




References

  1. Holland, J.H.: Adaptation in Natural and Artificial Systems. University of Michigan Press (1975)
  2. Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley (1989)
  3. DeJong, K.A., Spears, W.M.: Learning concept classification rules using genetic algorithms. In: Proceedings of the International Joint Conference on Artificial Intelligence (1991) 651–656
  4. Wilson, S.W.: Classifier fitness based on accuracy. Evolutionary Computation 3 (1995) 149–175
  5. Bacardit, J., Garrell, J.M.: Evolution of multi-adaptive discretization intervals for a rule-based genetic learning system. In: Proceedings of the VIII Iberoamerican Conference on Artificial Intelligence (IBERAMIA'2002), LNAI vol. 2527, Springer (2002) 350–360
  6. Fayyad, U.M., Irani, K.B.: Multi-interval discretization of continuous-valued attributes for classification learning. In: IJCAI (1993) 1022–1029
  7. Wilson, S.W.: Get real! XCS with continuous-valued inputs. In: Booker, L., Forrest, S., Mitchell, M., Riolo, R.L., eds.: Festschrift in Honor of John H. Holland, Center for the Study of Complex Systems (1999) 111–121
  8. Liu, H., Setiono, R.: Chi2: Feature selection and discretization of numeric attributes. In: Proceedings of the 7th IEEE International Conference on Tools with Artificial Intelligence, IEEE Computer Society (1995) 388–391
  9. Aguilar-Ruiz, J.S., Riquelme, J.C., Valle, C.D.: Improving the evolutionary coding for machine learning tasks. In: Proceedings of the European Conference on Artificial Intelligence (ECAI'02), Lyon, France, IOS Press (2002) 173–177
  10. Giráldez, R., Aguilar-Ruiz, J.S., Riquelme, J.C.: Discretización supervisada no paramétrica orientada a la obtención de reglas de decisión [Non-parametric supervised discretization oriented to obtaining decision rules]. In: Proceedings of CAEPIA2001 (2001) 53–62
  11. Riquelme, J.C., Aguilar, J.S.: Codificación indexada de atributos continuos para algoritmos evolutivos en aprendizaje supervisado [Indexed coding of continuous attributes for evolutionary algorithms in supervised learning]. In: Proceedings of the Primer Congreso Español de Algoritmos Evolutivos y Bioinspirados (AEB'02) (2002) 161–167
  12. Llorà, X., Garrell, J.M.: Knowledge-independent data mining with fine-grained parallel evolutionary algorithms. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO-2001), Morgan Kaufmann (2001) 461–468
  13. Rivest, R.L.: Learning decision lists. Machine Learning 2 (1987) 229–246
  14. Soule, T., Foster, J.A.: Effects of code growth and parsimony pressure on populations in genetic programming. Evolutionary Computation 6 (1998) 293–309
  15. Bacardit, J., Garrell, J.M.: Métodos de generalización para sistemas clasificadores de Pittsburgh [Generalization methods for Pittsburgh classifier systems]. In: Proceedings of the Primer Congreso Español de Algoritmos Evolutivos y Bioinspirados (AEB'02) (2002) 486–493
  16. Blake, C., Keogh, E., Merz, C.J.: UCI repository of machine learning databases (1998)
  17. Martínez Marroquín, E., Vos, C., et al.: Morphological analysis of mammary biopsy images. In: Proceedings of the IEEE International Conference on Image Processing (ICIP'96) (1996) 943–947
  18. Martí, J., Cufí, X., Regincós, J., et al.: Shape-based feature selection for microcalcification evaluation. In: Imaging Conference on Image Processing 3338 (1998) 1215–1224
  19. Golobardes, E., Llorà, X., Garrell, J.M., Vernet, D., Bacardit, J.: Genetic classifier system as a heuristic weighting method for a case-based classifier system. Butlletí de l'Associació Catalana d'Intel.ligència Artificial 22 (2000) 132–141
  20. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations. Morgan Kaufmann (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Jaume Bacardit (1)
  • Josep Maria Garrell (1)

  1. Intelligent Systems Research Group, Enginyeria i Arquitectura La Salle, Universitat Ramon Llull, Barcelona, Catalonia, Spain
