Pareto Inspired Multi-objective Rule Fitness for Noise-Adaptive Rule-Based Machine Learning

  • Ryan J. Urbanowicz
  • Randal S. Olson
  • Jason H. Moore
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9921)


Learning classifier systems (LCSs) are rule-based evolutionary algorithms uniquely suited to classification and data mining in complex, multi-factorial, and heterogeneous problems. The fitness of individual LCS rules is commonly based on accuracy, but this metric alone is not ideal for assessing global rule ‘value’ in noisy problem domains and thus impedes effective knowledge extraction. Multi-objective fitness functions are promising but rely on prior knowledge of how to weigh objective importance (typically unavailable in real-world problems). The Pareto-front concept offers a multi-objective strategy that is agnostic to objective importance. We propose a Pareto-inspired multi-objective rule fitness (PIMORF) for LCS and combine it with a complementary rule-compaction approach (SRC). We implemented these strategies in ExSTraCS, a successful supervised LCS, and evaluated performance over an array of complex simulated noisy and clean problems (i.e., genetic and multiplexer) that each concurrently model pure interaction effects and heterogeneity. While evaluation over multiple performance metrics yielded mixed results, this work represents an important first step towards efficiently learning complex problem spaces without the advantage of prior problem knowledge. Overall the results suggest that PIMORF paired with SRC improved rule set interpretability, particularly with regard to heterogeneous patterns.
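The Pareto-front idea described above can be sketched in a few lines. The snippet below is a minimal illustration of Pareto non-domination over two hypothetical rule objectives, accuracy and correct coverage; the objective names and the example values are assumptions for illustration, not the paper's exact PIMORF formulation.

```python
def dominates(a, b):
    """True if rule a Pareto-dominates rule b: at least as good on
    every objective and strictly better on at least one (both
    objectives are maximized here)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))


def pareto_front(rules):
    """Return the non-dominated subset of (accuracy, coverage) pairs.

    A rule survives if no other rule dominates it, so no fixed
    weighting of the two objectives is ever required.
    """
    return [r for r in rules
            if not any(dominates(o, r) for o in rules if o != r)]


# Hypothetical rules as (accuracy, correct coverage) pairs.
rules = [(0.90, 5), (0.80, 20), (0.95, 2), (0.70, 4)]
front = pareto_front(rules)
print(front)  # (0.70, 4) is dominated by (0.90, 5) and drops out
```

Note that the front retains both the highly accurate but narrow rule and the broadly covering but less accurate rule, which is precisely the objective-importance agnosticism the abstract attributes to the Pareto-front concept.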


Keywords: Data mining · Classifier systems · Fitness evaluation · Multi-objective optimization · Machine learning



The computations in this work were performed on the Discovery cluster supported by the Research Computing group, ITS at Dartmouth College. This work was supported by NIH grants AI116794, LM009012, LM010098, EY022300, LM011360, CA134286, and GM103534.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Ryan J. Urbanowicz (1)
  • Randal S. Olson (1)
  • Jason H. Moore (1)
  1. Institute for Biomedical Informatics, Perelman School of Medicine, University of Pennsylvania, Philadelphia, USA
