
Genetic Programming and Evolvable Machines, Volume 14, Issue 4, pp 457–471

A new genetic programming framework based on reaction systems

  • Luca Manzoni
  • Mauro Castelli
  • Leonardo Vanneschi

Abstract

This paper presents a new genetic programming framework called Evolutionary Reaction Systems. It is based on a recently defined computational formalism inspired by chemical reactions, called Reaction Systems, and it has several properties that distinguish it from existing genetic programming frameworks and make it worthy of investigation. For instance, it allows complex constructs to be expressed in a simple and intuitive way, and it relieves the user of the task of defining the set of primitive functions used to build the evolved programs. Because Evolutionary Reaction Systems is new and bears little resemblance to existing genetic programming frameworks, the first phase of this work is dedicated to studying some important parameters and their influence on the algorithm’s performance. Subsequently, we use the best parameter setting found to compare Evolutionary Reaction Systems with other well-established machine learning methods, including standard tree-based genetic programming. The presented results show that Evolutionary Reaction Systems is competitive with, and in some cases better than, the other studied methods on a wide set of benchmarks.
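For readers unfamiliar with the underlying formalism, the minimal Python sketch below (not taken from the paper) illustrates the reaction-system semantics of Ehrenfeucht and Rozenberg on which the framework is based: a reaction is a triple of reactant, inhibitor, and product sets; a reaction is enabled on a state when all its reactants are present and none of its inhibitors is; and the next state is the union of the products of all enabled reactions. The class and function names are illustrative assumptions, and the evolutionary encoding and fitness evaluation used by Evolutionary Reaction Systems are not shown.

```python
# Minimal sketch of reaction-system semantics (after Ehrenfeucht and Rozenberg).
# Names are illustrative; this is not the authors' implementation.
from dataclasses import dataclass
from typing import FrozenSet, Iterable, Set


@dataclass(frozen=True)
class Reaction:
    reactants: FrozenSet[str]   # entities that must all be present
    inhibitors: FrozenSet[str]  # entities that must all be absent
    products: FrozenSet[str]    # entities produced when the reaction fires

    def enabled(self, state: Set[str]) -> bool:
        return self.reactants <= state and not (self.inhibitors & state)


def step(reactions: Iterable[Reaction], state: Set[str]) -> Set[str]:
    """One step: the next state is the union of the products of all
    reactions enabled by the current state."""
    result: Set[str] = set()
    for r in reactions:
        if r.enabled(state):
            result |= r.products
    return result


# Example: a two-reaction system over the entities {a, b, c}.
system = [
    Reaction(frozenset({"a"}), frozenset({"c"}), frozenset({"b"})),
    Reaction(frozenset({"b"}), frozenset(), frozenset({"c"})),
]
print(step(system, {"a"}))       # {'b'}: only the first reaction fires
print(step(system, {"a", "b"}))  # {'b', 'c'}: both reactions fire
```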

Keywords

Genetic programming · Reaction systems · Evolutionary computation


Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Luca Manzoni (1)
  • Mauro Castelli (2)
  • Leonardo Vanneschi (1, 2, 3), corresponding author

  1. Dipartimento di Informatica, Sistemistica e Comunicazione (D.I.S.Co.), Università di Milano-Bicocca, Milan, Italy
  2. ISEGI, Universidade Nova de Lisboa, Lisboa, Portugal
  3. INESC-ID, IST / Universidade Técnica de Lisboa, Lisboa, Portugal
