
An MLP Neural Network for Approximation of a Functional Dependence with Noise

  • Conference paper
  • First Online:
Third Congress on Intelligent Systems (CIS 2022)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 613)


Abstract

Multilayer perceptron (MLP) neural networks used to approximate a functional dependency are capable of generalization and thus of limited noise removal, for example from measured data. This paper shows the effect of noise on the results obtained when data are interpolated by a neural network, using several functions of two variables and one function of three variables. The function values obtained from the trained neural network showed, on average, ten times lower deviations from the correct values than the data on which the network was trained, especially at higher noise levels. The results confirm the suitability of a neural network for interpolating an unknown functional dependency from measured data, even when the noise cannot be removed from the data beforehand.
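As an illustration of the setup the abstract describes, the following MATLAB sketch trains an MLP on noisy samples of a known test function and compares the network's output error with the noise in the training data. It is a minimal sketch, not the author's published code: the peaks test surface, the 40-by-40 grid, the 20 dB signal-to-noise ratio, and the single hidden layer of 20 neurons are assumptions chosen for the example (feedforwardnet and trainbr require the Deep Learning Toolbox; awgn requires the Communications Toolbox).

    % Sample a known test function of two variables on a grid.
    [X1, X2] = meshgrid(linspace(-3, 3, 40));
    T = peaks(X1, X2);                      % noise-free "correct" values

    inputs  = [X1(:)'; X2(:)'];             % 2 x N matrix of input points
    targets = T(:)';                        % 1 x N row of target values

    % Corrupt the training targets with additive white Gaussian noise;
    % 20 dB SNR is an assumed noise level for this illustration.
    noisyTargets = awgn(targets, 20, 'measured');

    % Train an MLP with Bayesian-regularization backpropagation.
    net = feedforwardnet(20, 'trainbr');    % 20 hidden neurons (assumed)
    net = train(net, inputs, noisyTargets);

    % Compare deviations from the correct values: the trained network's
    % output versus the noisy data the network was trained on.
    outputs = net(inputs);
    fprintf('mean |error|: network %.4f, noisy data %.4f\n', ...
        mean(abs(outputs - targets)), mean(abs(noisyTargets - targets)));

If the network generalizes as reported in the abstract, the first printed error should be markedly smaller than the second: the regularized MLP smooths over much of the noise instead of fitting it.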



Author information


Corresponding author

Correspondence to Vladimir Hlavac.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Hlavac, V. (2023). An MLP Neural Network for Approximation of a Functional Dependence with Noise. In: Kumar, S., Sharma, H., Balachandran, K., Kim, J.H., Bansal, J.C. (eds) Third Congress on Intelligent Systems. CIS 2022. Lecture Notes in Networks and Systems, vol 613. Springer, Singapore. https://doi.org/10.1007/978-981-19-9379-4_32
