
A Novel Neural Network-Based Symbolic Regression Method: Neuro-Encoded Expression Programming

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 11728)

Abstract

This paper proposes neuro-encoded expression programming (NEEP), a novel continuous representation of the combinatorial encoding used by genetic programming methods. Genetic programming with linear representation applies nature-inspired operators (e.g., crossover and mutation) to tune expressions and search for the explicit function that best fits the data. The encoding mechanism is essential for genetic programming to find a desirable solution efficiently. However, linear representation methods manipulate the expression tree in a discrete solution space, where a small change in the input can cause a large change in the output. The resulting unsmooth landscapes destroy local information and make the search difficult. NEEP instead constructs the gene string with a recurrent neural network (RNN), and the weights of the network are optimized by powerful continuous evolutionary algorithms. The neural network mapping smooths the sharp fitness landscape and provides rich neighborhood information for finding the best expression. Experiments indicate that the approach improves training efficiency and reduces test errors on several well-known symbolic regression problems.
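
The abstract describes the mechanism only at a high level, so a minimal sketch may help: an RNN emits a gene string symbol by symbol, the string is interpreted as a prefix expression, and the RNN weights are searched by a continuous evolutionary algorithm. The Python sketch below is illustrative only; the toy symbol set, the tiny NumPy RNN, the greedy decoding, and the simple (mu, lambda) evolution strategy (standing in for CMA-ES or PSO) are all assumptions for this sketch, not the authors' exact design.

    import numpy as np

    SYMBOLS = ['+', '*', 'x', '1.0']                 # assumed toy function/terminal set
    ARITY   = {'+': 2, '*': 2, 'x': 0, '1.0': 0}
    HID, GENE_LEN = 8, 15                            # hidden size, length of the gene string

    def n_weights():
        k = len(SYMBOLS)
        # input-to-hidden, hidden-to-hidden, hidden bias, hidden-to-output, output bias
        return k*HID + HID*HID + HID + HID*k + k

    def decode(w):
        """Run the RNN for GENE_LEN steps and emit a prefix-notation gene string."""
        k, s = len(SYMBOLS), 0
        Wx = w[s:s+k*HID].reshape(k, HID);     s += k*HID
        Wh = w[s:s+HID*HID].reshape(HID, HID); s += HID*HID
        bh = w[s:s+HID];                       s += HID
        Wo = w[s:s+HID*k].reshape(HID, k);     s += HID*k
        bo = w[s:s+k]
        h, x, gene = np.zeros(HID), np.zeros(k), []
        for _ in range(GENE_LEN):
            h = np.tanh(x @ Wx + h @ Wh + bh)
            i = int(np.argmax(h @ Wo + bo))    # greedy symbol choice (decoding details assumed)
            gene.append(SYMBOLS[i])
            x = np.eye(k)[i]                   # feed the chosen symbol back as a one-hot vector
        return gene

    def evaluate(gene, xs):
        """Interpret the gene as a prefix expression; surplus tail symbols are ignored."""
        pos = 0
        def build():
            nonlocal pos
            sym = gene[pos] if pos < len(gene) else 'x'   # pad with a terminal if the gene runs out
            pos += 1
            if ARITY[sym] == 0:
                return xs if sym == 'x' else np.full_like(xs, float(sym))
            a, b = build(), build()
            return a + b if sym == '+' else a * b
        return build()

    def fitness(w, xs, ys):
        return float(np.mean((evaluate(decode(w), xs) - ys) ** 2))

    # Toy target: y = x^2 + x + 1, a standard symbolic-regression benchmark form.
    rng = np.random.default_rng(0)
    xs = np.linspace(-1.0, 1.0, 50)
    ys = xs**2 + xs + 1.0

    # Simple (mu, lambda) evolution strategy over the RNN weight vector.
    mu, lam, sigma = 5, 30, 0.3
    pop = rng.normal(0.0, 1.0, size=(mu, n_weights()))
    for _ in range(200):
        children = np.stack([p + sigma * rng.normal(size=n_weights())
                             for p in pop for _ in range(lam // mu)])
        scores = np.array([fitness(c, xs, ys) for c in children])
        pop = children[np.argsort(scores)[:mu]]          # keep the mu best weight vectors
    best = pop[0]
    print('gene:', ' '.join(decode(best)), ' mse: %.4f' % fitness(best, xs, ys))

In this sketch a small perturbation of the weight vector usually changes the emitted gene string little or not at all, which illustrates, in spirit, the neighborhood smoothing the paper attributes to the neural encoding.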

Keywords

  • Genetic programming
  • Symbolic regression
  • Neural network
  • Gene expression programming
  • Evolutionary algorithm

A. Anjum and S. Sun contributed equally to this article.



Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants No. 61573166, No. 61572230, No. 61872419, No. 61873324, No. 81671785, and No. 61672262; the Project of Shandong Province Higher Educational Science and Technology Program under Grant No. J16LN07; the Shandong Provincial Natural Science Foundation under Grants No. ZR2019MF040 and No. ZR2018LF005; the Shandong Provincial Key R&D Program under Grants No. 2019GGX101041, No. 2018GGX101048, No. 2016ZDJS01A12, No. 2016GGX101001, and No. 2017CXZC1206; and the Taishan Scholar Project of Shandong Province, China.

Author information

Corresponding author

Correspondence to Lin Wang.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Anjum, A., Sun, F., Wang, L., Orchard, J. (2019). A Novel Neural Network-Based Symbolic Regression Method: Neuro-Encoded Expression Programming. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning. ICANN 2019. Lecture Notes in Computer Science, vol 11728. Springer, Cham. https://doi.org/10.1007/978-3-030-30484-3_31


  • DOI: https://doi.org/10.1007/978-3-030-30484-3_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30483-6

  • Online ISBN: 978-3-030-30484-3

  • eBook Packages: Computer Science (R0)