
Compaction for Code Fragment Based Learning Classifier Systems

  • Isidro M. Alvarez
  • Will N. Browne
  • Mengjie Zhang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9592)

Abstract

Learning Classifier Systems (LCSs) originated in artificial cognitive systems research, but have since become powerful classification techniques within single domains. Modern LCSs can extract building blocks of knowledge, using Code Fragments, in order to scale to more difficult problems in the same or a related domain. Code Fragments (CFs) are GP-like sub-trees in which past learning can be reused within future CF sub-trees. However, the rich alphabet produced by the Code Fragments requires additional computational resources as the knowledge and functional rulesets grow, which eventually leads to impractically long chains of CFs. The novel work presented here introduces methods that produce Distilled Rules, remedying this problem by compacting learned functions. The system has been tested on Boolean problems, up to the 70-bit multiplexer and the 3 × 11-bit hidden multiplexer, which are known to be difficult for conventional algorithms to solve due to their large and complex search spaces. The new methods have been shown to create a new layer of rules that reduces tree length, making it easier for the system to scale to more difficult problems in the same or a related domain.
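
For readers unfamiliar with the benchmark, the Boolean k-address multiplexer takes k address bits followed by 2^k data bits and outputs the data bit selected by the address; the 70-bit problem therefore uses 6 address bits and 64 data bits. The following is a minimal Python sketch of the target function only (not the authors' LCS or Code Fragment implementation), included purely to illustrate the benchmark:

    def multiplexer(bits):
        """Boolean k-address multiplexer target function.

        The first k bits form an address that selects one of the
        remaining 2**k data bits. For the 70-bit problem, k = 6
        (6 address bits + 64 data bits).
        """
        # Infer k from the total length: len(bits) == k + 2**k.
        k = 0
        while k + 2 ** k < len(bits):
            k += 1
        if k + 2 ** k != len(bits):
            raise ValueError("length must equal k + 2**k for some k")
        # Interpret the address bits as an unsigned binary integer.
        address = int("".join(str(b) for b in bits[:k]), 2) if k else 0
        return bits[k + address]

    # Example: 6-bit multiplexer (k = 2). Address '10' selects data bit 2.
    assert multiplexer([1, 0, 0, 0, 1, 0]) == 1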

Keywords

LCS · Learning Classifier · Code Fragments · Compaction

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Isidro M. Alvarez (1)
  • Will N. Browne (1)
  • Mengjie Zhang (1)

  1. School of Engineering and Computer Science, Victoria University of Wellington, Wellington, New Zealand
