Abstract
Multilayer perceptron (MLP) neural networks used to approximate a functional dependency are capable of generalization, and thus of limited noise removal, for example from measured data. The following text examines the effect of noise on the results obtained when data are interpolated by a neural network, using several functions of two variables and one function of three variables. The function values obtained from the trained neural network showed, on average, ten times lower deviations from the correct values than the data on which the network was trained, especially at higher noise levels. These results confirm the suitability of a neural network for interpolating an unknown functional dependency from measured data, even when the noise itself cannot be removed.
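The procedure the abstract describes, training an MLP on noisy samples of a known function and comparing deviations from the true values, can be sketched in Python (the study itself used Matlab; the target function, noise level, and network size below are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Illustrative smooth target function of two variables
# (a stand-in, not one of the paper's benchmark functions).
def f(x, y):
    return np.sin(3 * x) * np.cos(2 * y)

# Sample points with additive Gaussian noise on the function values.
X = rng.uniform(-1.0, 1.0, size=(2000, 2))
clean = f(X[:, 0], X[:, 1])
noisy = clean + rng.normal(0.0, 0.1, size=clean.shape)

# A small MLP trained on the noisy values only.
net = MLPRegressor(hidden_layer_sizes=(30,), solver="lbfgs",
                   max_iter=2000, random_state=0).fit(X, noisy)

# Deviation from the true function: training data vs. network output.
rmse_noisy = np.sqrt(np.mean((noisy - clean) ** 2))  # ~0.1 by construction
rmse_net = np.sqrt(np.mean((net.predict(X) - clean) ** 2))
print(f"noisy data RMSE: {rmse_noisy:.3f}, network RMSE: {rmse_net:.3f}")
```

Because the network fits a smooth surface, it averages over neighboring noisy samples, so its output typically deviates from the true function less than the training data do; the improvement factor depends on the noise level and network size.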
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Hlavac, V. (2023). An MLP Neural Network for Approximation of a Functional Dependence with Noise. In: Kumar, S., Sharma, H., Balachandran, K., Kim, J.H., Bansal, J.C. (eds) Third Congress on Intelligent Systems. CIS 2022. Lecture Notes in Networks and Systems, vol 613. Springer, Singapore. https://doi.org/10.1007/978-981-19-9379-4_32
Publisher Name: Springer, Singapore
Print ISBN: 978-981-19-9378-7
Online ISBN: 978-981-19-9379-4
eBook Packages: Intelligent Technologies and Robotics (R0)