
Automated Discovery of Numerical Approximation Formulae via Genetic Programming


Abstract

This paper describes the use of genetic programming (GP) to perform automated discovery of numerical approximation formulae. We present results involving rediscovery of known approximations for harmonic numbers, discovery of rational polynomial approximations for functions of one or more variables, and refinement of existing approximations both by approximating their error functions and by seeding the initial GP population with the existing approximation as a program tree. Evolved rational polynomial approximations are compared to Padé approximations obtained through the Maple symbolic mathematics package. We find that approximations evolved by GP can be superior to Padé approximations under certain trade-offs between approximation cost and accuracy, and that GP can evolve approximations in circumstances where the Padé technique cannot be applied. We conclude that genetic programming is a powerful and effective approach that complements, but does not replace, existing techniques from numerical analysis.
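To illustrate the kind of target the abstract mentions, the classical asymptotic approximation for the n-th harmonic number is H_n ≈ ln n + γ + 1/(2n) − 1/(12n²), where γ is the Euler–Mascheroni constant. The sketch below compares this known formula against the exact sum; it is an illustration of the approximation problem, not code from the paper, and the function names are ours.

```python
import math

# Euler–Mascheroni constant (known numerical value, hard-coded here)
GAMMA = 0.5772156649015329

def harmonic(n):
    """Exact n-th harmonic number: H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

def harmonic_approx(n):
    """Classical asymptotic approximation: ln n + gamma + 1/(2n) - 1/(12 n^2)."""
    return math.log(n) + GAMMA + 1.0 / (2 * n) - 1.0 / (12 * n * n)

if __name__ == "__main__":
    # The absolute error shrinks rapidly with n (next omitted term is O(1/n^4)).
    for n in (10, 100, 1000):
        exact = harmonic(n)
        approx = harmonic_approx(n)
        print(n, exact, approx, abs(exact - approx))
```

A symbolic-regression run of the sort the paper describes would search program trees over arithmetic primitives for an expression approximating `harmonic(n)` on sampled points, scoring candidates by error against the exact values.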




Cite this article

Streeter, M., Becker, L.A. Automated Discovery of Numerical Approximation Formulae via Genetic Programming. Genetic Programming and Evolvable Machines 4, 255–286 (2003). https://doi.org/10.1023/A:1025176407779


  • genetic programming
  • approximations
  • symbolic regression
  • Pareto optimality