
Extreme Learning Machine with Kernels for Solving Elliptic Partial Differential Equations

Published in Cognitive Computation

Abstract

Finding solutions of partial differential equations with machine learning methods is an ongoing challenge in applied mathematics. Although machine learning methods have been applied successfully to solve partial differential equations, their practical use is limited by the large number of parameters they require. In this study, after a rigorous theoretical derivation, a new method is proposed for solving standard elliptic partial differential equations based on an extreme learning machine with kernels. The parameters of the proposed method (i.e., the regularization coefficient and kernel parameter) were obtained by a grid search. Three numerical cases, combined with several comprehensive indices (mean absolute error, mean squared error, and standard deviation), were used to test the performance of the proposed method. The results show that the proposed method outperforms existing methods, including the wavelet neural network optimized with the improved butterfly optimization algorithm. In addition, the proposed method has fewer unknown parameters than previous methods, which makes its calculations more convenient. The effect of the number of training points on the calculation results is also discussed; an advantage of the proposed method is that only a few training points are needed to achieve high computational accuracy. Finally, as a case study, the proposed method is successfully applied to simulate water flow in unsaturated soils. The proposed method provides new insight into solving elliptic partial differential equations.
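The kernel extreme learning machine described in the abstract amounts to solving a regularized linear system, with the regularization coefficient C and kernel parameter γ chosen by grid search. The following is a minimal illustrative sketch of that workflow; the toy target function, data sizes, and parameter grids are assumptions for demonstration, not the paper's settings.

```python
import numpy as np

def gram(Xa, Xb, gamma):
    """Gaussian kernel matrix: K[i, j] = exp(-||Xa_i - Xb_j||^2 / gamma^2)."""
    d2 = ((Xa[:, None, :] - Xb[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / gamma ** 2)

def kelm_fit(X, t, C, gamma):
    """Output weights beta = (I / C + K)^{-1} t (kernel-ELM / ridge form)."""
    K = gram(X, X, gamma)
    return np.linalg.solve(np.eye(len(X)) / C + K, t)

def kelm_predict(Xnew, X, beta, gamma):
    return gram(Xnew, X, gamma) @ beta

# Toy regression problem on the unit square (assumption, for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (40, 2))
t = np.sin(np.pi * X[:, 0]) * X[:, 1]
Xv = rng.uniform(0.0, 1.0, (20, 2))
tv = np.sin(np.pi * Xv[:, 0]) * Xv[:, 1]

# Grid search over (C, gamma) by validation mean squared error.
best = None
for C in [1e2, 1e4, 1e6]:
    for gamma in [0.5, 1.0, 2.0]:
        beta = kelm_fit(X, t, C, gamma)
        mse = np.mean((kelm_predict(Xv, X, beta, gamma) - tv) ** 2)
        if best is None or mse < best[0]:
            best = (mse, C, gamma)
print(best)
```

For solving a PDE rather than fitting data, the same linear-solve structure applies, but the system matrix is assembled from kernel derivatives evaluated at collocation points (see the appendix).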

[Figs. 1–11 omitted]


Data Availability

All data generated or analyzed during this study are included within the article.

References

  1. Alzubi JA, Alzubi OA, Chen TM. Forward error correction based on algebraic-geometric theory. Springer. 2014.

  2. Collins BD, Znidarcic D. Stability analyses of rainfall induced landslides. Journal of Geotechnical and Geoenvironmental Engineering. 2004;130(4):362–72.


  3. Iverson RM. Landslide triggering by rain infiltration. Water Resour Res. 2000;36(7):1897–910.


  4. Lu N, Godt JW. Hillslope hydrology and stability. Cambridge University Press; 2013.

  5. Schmidhuber J. Deep learning in neural networks: an overview. Neural Netw. 2015;61:85–117.


  6. Smola AJ, Schölkopf B. A tutorial on support vector regression. Stat Comput. 2004;14(3):199–222.


  7. Vapnik VN. An overview of statistical learning theory. IEEE Trans Neural Networks. 1999;10(5):988–99.


  8. Tan LS, Zainuddin Z, Ong P. Wavelet neural networks based solutions for elliptic partial differential equations with improved butterfly optimization algorithm training. Appl Soft Comput. 2020;95:106518.


  9. Lagaris IE, Likas A, Fotiadis DI. Artificial neural networks for solving ordinary and partial differential equations. IEEE Trans Neural Networks. 1998;9(5):987–1000.


  10. Lagaris IE, Likas AC, Papageorgiou DG. Neural-network methods for boundary value problems with irregular boundaries. IEEE Trans Neural Networks. 2000;11(5):1041–9.


  11. Alli H, Uçar A, Demir Y. The solutions of vibration control problems using artificial neural networks. J Franklin Inst. 2003;340(5):307–25.


  12. Filici C. On a neural approximator to ODEs. IEEE Trans Neural Networks. 2008;19(3):539–43.


  13. Effati S, Pakdaman M. Artificial neural network approach for solving fuzzy differential equations. Inf Sci. 2010;180(8):1434–57.


  14. Alzubi OA, Alzubi JA, Dorgham O, Alsayyed M. Cryptosystem design based on Hermitian curves for IoT security. J Supercomput. 2020;76:8566–89.


  15. Alzubi OA, Alzubi JA, Tedmori S, Rashaideh H, Almomani O. Consensus-based combining method for classifier ensembles. The International Arab Journal of Information Technology. 2018;15(1):76–86.


  16. Gheisari M, Panwar D, Tomar P, Harsh H, Zhang X, Solanki A, Alzubi JA. An optimization model for software quality prediction with case study analysis using MATLAB. IEEE Access. 2019;7:85123–38.


  17. Huang GB, Zhu QY, Siew CK. Extreme learning machine: a new learning scheme of feedforward neural networks. In: Proceedings of the IEEE International Joint Conference on Neural Networks (IEEE Cat. No. 04CH37541). 2004;2:985–90.

  18. Huang GB, Zhu QY, Siew CK. Extreme learning machine: theory and applications. Neurocomputing. 2006;70:489–501.


  19. Huang FM, Huang JS, Jiang SH, Zhou CB. Landslide displacement prediction based on multivariate chaotic model and extreme learning machine. Eng Geol. 2017a;218:173–86.


  20. Huang FM, Yin KL, Huang JS, Gui L, Wang P. Landslide susceptibility mapping based on self-organizing-map network and extreme learning machine. Eng Geol. 2017b;223:11–22.


  21. Liu C, Sun B, Zhang CH, Li F. A hybrid prediction model for residential electricity consumption using holt-winters and extreme learning machine. Appl Energy. 2020;275:115383.


  22. Albu F, Hagiescu D, Vladutu L, Puica MA. Neural network approaches for children’s emotion recognition in intelligent learning applications. In: EDULEARN15, 7th Annual International Conference on Education and New Learning Technologies, Barcelona, Spain, 6th–8th; 2015.

  23. Malathi V, Marimuthu NS, Baskar S, Ramar K. Application of extreme learning machine for series compensated transmission line protection. Eng Appl Artif Intell. 2011;24(5):880–7.


  24. Huang GB. An insight into extreme learning machines: random neurons, random features and kernels. Cognit Comput. 2014;6(3):376–90.


  25. Zhou C, Yin KL, Cao Y, Intrieri E, Ahmed B, Catani F. Displacement prediction of step-like landslide by applying a novel kernel extreme learning machine method. Landslides. 2018;15(11):2211–25.


  26. Alizamir M, Kim S, Zounemat-Kermani M, Heddam S, Kim NW, Singh VP. Kernel extreme learning machine: an efficient model for estimating daily dew point temperature using weather data. Water. 2020;12(9):2600.


  27. Li YF, Shi HP, Liu H. A hybrid model for river water level forecasting: cases of Xiangjiang River and Yuanjiang River, China. J Hydrol. 2020;124934.

  28. Chen SY, Gu CS, Lin CN, Hariri-Ardebili MA. Prediction of arch dam deformation via correlated multi-target stacking. Appl Math Model. 2021;91:1175–93.


  29. Mall S, Chakraverty S. Single layer Chebyshev neural network model for solving elliptic partial differential equations. Neural Process Lett. 2017;45(3):825–40.


  30. Peng X. TSVR: an efficient twin support vector machine for regression. Neural Netw. 2010;23(3):365–72.


  31. Javadi AA, Ahangar-Asr A, Johari A, Faramarzi A, Toll D. Modelling stress–strain and volume change behaviour of unsaturated soils using an evolutionary based data mining technique, an incremental approach. Eng Appl Artif Intell. 2012;25(5):926–33.


  32. Wu LZ, Zhu SR, Peng PJ. Application of the Chebyshev spectral method to the simulation of groundwater flow and rainfall-induced landslides. Appl Math Model. 2020;80:408–25.


  33. Tracy FT. Clean two- and three-dimensional analytical solutions of Richards’ equation for testing numerical solvers. Water Resour Res. 2006;42(8):W08503.


  34. Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks. 1995. p. 1942–8.


  35. Chau KW. Particle swarm optimization training algorithm for ANNs in stage prediction of Shing Mun River. J Hydrol. 2006;329(3–4):363–7.


  36. Arora S, Singh S. Butterfly optimization algorithm: a novel approach for global optimization. Soft Comput. 2019;23:715–34.


  37. Wen L, Cao Y. A hybrid intelligent predicting model for exploring household CO2 emissions mitigation strategies derived from butterfly optimization algorithm. Sci Total Environ. 2020;727:138572.



Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 51578466).

Author information


Correspondence to Guoguo Liu or Shiguo Xiao.

Ethics declarations

Conflict of Interest

The authors declare no conflict of interest.


Appendix

A1 Expressions of \(T_{j}\) in Eq. (21) and \(\Omega_{ij}\) in Eq. (23)

In Eq. (21),

$$\begin{array}{c}\frac{\partial A({X}_{j})}{\partial x}=-\frac{{h}_{0}({y}_{j})}{b-a}+\frac{{h}_{1}({y}_{j})}{b-a}+\frac{(d-{y}_{j})}{d-c}\left\{\frac{\partial {q}_{0}({x}_{j})}{\partial x}-\left[-\frac{{q}_{0}(a)}{b-a}+\frac{{q}_{0}(b)}{b-a}\right]\right\}\\ +\frac{{y}_{j}-c}{d-c}\left\{\frac{\partial {q}_{1}({x}_{j})}{\partial x}-\left[-\frac{{q}_{1}(a)}{b-a}+\frac{{q}_{1}(b)}{b-a}\right]\right\}\end{array}$$
$$\frac{{\partial }^{2}A({X}_{j})}{\partial {x}^{2}}=\frac{(d-{y}_{j})}{d-c}\left\{\frac{{\partial }^{2}{q}_{0}({x}_{j})}{\partial {x}^{2}}\right\}+\frac{{y}_{j}-c}{d-c}\left\{\frac{{\partial }^{2}{q}_{1}({x}_{j})}{\partial {x}^{2}}\right\}$$
$$\begin{aligned}\frac{\partial A({X}_{j})}{\partial y}&=\frac{(b-{x}_{j})}{b-a}\frac{\partial {h}_{0}({y}_{j})}{\partial y}+\frac{{x}_{j}-a}{b-a}\frac{\partial {h}_{1}({y}_{j})}{\partial y}\\&\quad-\frac{1}{d-c}\left\{{q}_{0}({x}_{j})-\left[\frac{\left(b-{x}_{j}\right)}{b-a}{q}_{0}(a)+\frac{{x}_{j}-a}{b-a}{q}_{0}(b)\right]\right\}\\&\quad+\frac{1}{d-c}\left\{{q}_{1}({x}_{j})-\left[\frac{\left(b-{x}_{j}\right)}{b-a}{q}_{1}(a)+\frac{{x}_{j}-a}{b-a}{q}_{1}(b)\right]\right\}\end{aligned}$$
$$\frac{{\partial }^{2}A({X}_{j})}{\partial {y}^{2}}=\frac{(b-{x}_{j})}{b-a}\frac{{\partial }^{2}{h}_{0}({y}_{j})}{\partial {y}^{2}}+\frac{{x}_{j}-a}{b-a}\frac{{\partial }^{2}{h}_{1}({y}_{j})}{\partial {y}^{2}}$$

In Eq. (23),

$$K({X}_{i},{X}_{j})=\mathrm{exp}\left(-\frac{{\Vert {X}_{i}-{X}_{j}\Vert }^{2}}{{\gamma }^{2}}\right)$$
$$\frac{\partial K\left({X}_{i},{X}_{j}\right)}{\partial x}=-2({x}_{j}-{x}_{i})\mathrm{exp}\left(-\frac{{\Vert {X}_{i}-{X}_{j}\Vert }^{2}}{{\gamma }^{2}}\right)/{\gamma }^{2}$$
$$\frac{\partial K\left({X}_{i},{X}_{j}\right)}{\partial y}=-2({y}_{j}-{y}_{i})\mathrm{exp}\left(-\frac{{\Vert {X}_{i}-{X}_{j}\Vert }^{2}}{{\gamma }^{2}}\right)/{\gamma }^{2}$$
$$\begin{aligned}\frac{{\partial }^{2}K\left({X}_{i},{X}_{j}\right)}{\partial {x}^{2}}&=-2({\gamma }^{2}-2{x}_{j}{}^{2}+4{x}_{j}{x}_{i}-2{x}_{i}{}^{2})\mathrm{exp}\left(-\frac{{\Vert {X}_{i}-{X}_{j}\Vert }^{2}}{{\gamma }^{2}}\right)/{\gamma }^{4} \end{aligned}$$
$$\begin{aligned}\frac{{\partial }^{2}K\left({X}_{i},{X}_{j}\right)}{\partial {y}^{2}}=-2({\gamma }^{2}-2{y}_{j}{}^{2}+4{y}_{j}{y}_{i}-2{y}_{i}{}^{2})\mathrm{exp}\left(-\frac{{\Vert {X}_{i}-{X}_{j}\Vert }^{2}}{{\gamma }^{2}}\right)/{\gamma }^{4} \end{aligned}$$
$$\frac{\partial B({X}_{j})}{\partial x}=-(c-{y}_{j})(d-{y}_{j})\left(a+b-2{x}_{j}\right)$$
$$\frac{\partial B({X}_{j})}{\partial y}=-(a-{x}_{j})(b-{x}_{j})\left(c+d-2{y}_{j}\right)$$
$$\frac{{\partial }^{2}B({X}_{j})}{\partial {x}^{2}}=2(c-{y}_{j})(d-{y}_{j})$$
$$\frac{{\partial }^{2}B({X}_{j})}{\partial {y}^{2}}=2(a-{x}_{j})(b-{x}_{j})$$
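The closed-form kernel derivatives above can be verified numerically. The sketch below (illustrative; it assumes the derivatives are taken with respect to the coordinates of \(X_j\), consistent with the second-derivative expressions) compares them against central finite differences:

```python
import numpy as np

gamma = 1.3                       # kernel parameter (arbitrary test value)
Xi = np.array([0.2, 0.7])         # arbitrary test points
Xj = np.array([0.5, 0.1])

def K(xi, xj):
    """Gaussian kernel K(Xi, Xj) = exp(-||Xi - Xj||^2 / gamma^2)."""
    return np.exp(-np.sum((xi - xj) ** 2) / gamma ** 2)

# Closed-form first and second derivatives w.r.t. the x-component of Xj.
dKdx = -2 * (Xj[0] - Xi[0]) * K(Xi, Xj) / gamma ** 2
d2Kdx2 = (-2 * (gamma ** 2 - 2 * Xj[0] ** 2 + 4 * Xj[0] * Xi[0]
                - 2 * Xi[0] ** 2) * K(Xi, Xj) / gamma ** 4)

# Central finite differences in x_j for comparison.
h = 1e-5
e = np.array([h, 0.0])
fd1 = (K(Xi, Xj + e) - K(Xi, Xj - e)) / (2 * h)
fd2 = (K(Xi, Xj + e) - 2 * K(Xi, Xj) + K(Xi, Xj - e)) / h ** 2

print(abs(dKdx - fd1), abs(d2Kdx2 - fd2))   # both should be near zero
```

The same check applies symmetrically to the y-derivatives, since the kernel depends only on the componentwise differences.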

A2 Brief Introductions to the Methods Used for Comparison

ChNN: This method used the Chebyshev neural network to construct a trial solution and used a gradient-based optimization algorithm to learn the connection weight and bias of the network.

WNNPSO: The wavelet neural network was used to construct the trial solution. The particle swarm optimization algorithm was used as the learning algorithm to obtain the connection weights and biases of the wavelet neural network. The particle swarm optimization algorithm is a classic heuristic optimization algorithm pioneered by Kennedy and Eberhart [34] and has been widely used in practice [35].

WNNBOA: The wavelet neural network was used to construct the trial solution. The butterfly optimization algorithm  [36, 37] was used to learn the connection weights and biases of wavelet neural networks.

WNNIBOA: The wavelet neural network was used to construct the trial solution. The improved butterfly optimization algorithm was used to learn the connection weights and biases of wavelet neural networks. Compared with the standard butterfly optimization algorithm, the adjustment parameters in the improved butterfly optimization algorithm change dynamically, which theoretically enhances the algorithm's global optimization capability.

All four of the above methods require determining a large number of parameters (network connection weights and biases).
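As a point of reference for the PSO-based training described above, here is a minimal global-best particle swarm optimization sketch. It is illustrative only: the sphere objective, search bounds, and coefficients (inertia w, cognitive/social weights c1 and c2) are common textbook choices, not the WNNPSO settings.

```python
import numpy as np

def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over [-5, 5]^dim with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n, dim))      # particle positions
    v = np.zeros((n, dim))                    # particle velocities
    pbest = x.copy()                          # personal best positions
    pval = np.array([f(p) for p in x])        # personal best values
    g = pbest[pval.argmin()].copy()           # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

# Toy objective: sphere function, minimum 0 at the origin.
g, fmin = pso(lambda p: np.sum(p ** 2), dim=3)
print(g, fmin)
```

In WNNPSO each "position" would instead encode the wavelet network's connection weights and biases, with the PDE residual as the objective.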

About this article


Cite this article

Li, S., Liu, G. & Xiao, S. Extreme Learning Machine with Kernels for Solving Elliptic Partial Differential Equations. Cogn Comput 15, 413–428 (2023). https://doi.org/10.1007/s12559-022-10026-2

