A Proposal on Machine Learning via Dynamical Systems

Abstract

We discuss the idea of using continuous dynamical systems to model general high-dimensional nonlinear functions used in machine learning. We also discuss the connection with deep learning.
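
To make the connection the abstract refers to concrete, the following is a minimal sketch, not taken from the paper itself: the velocity field f, the tanh layer, and the step size dt are illustrative assumptions. It shows one common way to read a composition of residual updates z_{k+1} = z_k + dt * f(z_k, W_k, b_k) as the forward-Euler discretization of a continuous-time flow dz/dt = f(z, theta(t)), so that fitting the layer parameters amounts to fitting a dynamical system that transports inputs toward outputs.

    import numpy as np

    def f(z, W, b):
        # Illustrative velocity field: a single nonlinear layer
        # (an assumption for this sketch, not the paper's specific choice).
        return np.tanh(W @ z + b)

    def euler_flow(z0, params, dt=0.1):
        # Residual updates z_{k+1} = z_k + dt * f(z_k, W_k, b_k):
        # a deep composition read as the forward-Euler discretization
        # of the flow dz/dt = f(z, theta(t)).
        z = z0
        for W, b in params:
            z = z + dt * f(z, W, b)
        return z

    # Usage example: a 5-step flow acting on a 3-dimensional input.
    rng = np.random.default_rng(0)
    params = [(0.5 * rng.standard_normal((3, 3)), np.zeros(3)) for _ in range(5)]
    print(euler_flow(rng.standard_normal(3), params))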

Acknowledgements

This is part of an ongoing project with several collaborators, including Jiequn Han, Qianxiao Li, Jianfeng Lu and Cheng Tai. The author benefitted a great deal from discussions with them, particularly Jiequn Han. This work is supported in part by the Major Program of NNSFC under Grant 91130005, ONR N00014-13-1-0338 and DOE DE-SC0009248.

Author information

Corresponding author

Correspondence to Weinan E.

Additional information

Dedicated to Professor Chi-Wang Shu on the occasion of his 60th birthday.

About this article

Cite this article

E, W. A Proposal on Machine Learning via Dynamical Systems. Commun. Math. Stat. 5, 1–11 (2017). https://doi.org/10.1007/s40304-017-0103-z

Keywords

  • Deep learning
  • Machine learning
  • Dynamical systems

Mathematics Subject Classification

  • 37N99