A New Type of a Wavelet Neural Network
Wavelet transformation uses a special basis widely known for its unique properties, the most important of which are compactness and multiresolution (wavelet functions are produced from the mother wavelet by translation and dilation). Wavelet neural networks (WNN) use wavelet functions to decompose the approximated function. However, for a standard wavelet basis with fixed translation and dilation coefficients, the decomposition may not be optimal. If no inverse transformation is needed, the values of the translation and dilation coefficients may be determined during network training, and the windows corresponding to different wavelet functions may overlap. In this study, we suggest a new type of WNN—the Adaptive Window WNN (AWWNN), designed primarily for signal processing, in which window positions and wavelet levels are determined by a special iterative procedure. Two modifications of this new type of WNN are tested against a linear model and a multi-layer perceptron on the Mackey–Glass benchmark problem.
Keywords: approximation, wavelet neural networks, wavelet analysis, group method of data handling, spectroscopy
This study has been carried out with the financial support of the Ministry of Education and Science of the Russian Federation, Agreement no. 14.604.21.0163, project identifier RFMEFI60417X0163.
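The core idea in the abstract—wavelet units whose translation and dilation coefficients are free parameters fitted during training rather than fixed on a dyadic grid—can be sketched as follows. This is a minimal illustrative model, not the paper's actual AWWNN iterative procedure: the Mexican-hat mother wavelet, the plain gradient-descent updates, and all parameter names are assumptions made for the sketch.

```python
import numpy as np

def mexican_hat(u):
    # Mexican-hat (Ricker) mother wavelet: psi(u) = (1 - u^2) * exp(-u^2 / 2)
    return (1.0 - u**2) * np.exp(-0.5 * u**2)

def mexican_hat_deriv(u):
    # dpsi/du = (u^3 - 3u) * exp(-u^2 / 2)
    return (u**3 - 3.0 * u) * np.exp(-0.5 * u**2)

class AdaptiveWaveletNet:
    """Toy one-input wavelet network: y(x) = sum_k w_k * psi((x - b_k) / a_k).

    The translations b_k (window positions) and dilations a_k (window
    widths) are trained jointly with the output weights w_k, so windows
    of different units are free to overlap -- the property the abstract
    relies on when no inverse transform is required."""

    def __init__(self, n_units=8, seed=0):
        rng = np.random.default_rng(seed)
        self.b = rng.uniform(0.0, 1.0, n_units)   # translations
        self.a = np.full(n_units, 0.3)            # dilations
        self.w = rng.normal(0.0, 0.1, n_units)    # output weights

    def forward(self, x):
        u = (x[:, None] - self.b) / self.a        # shape (N, K)
        return mexican_hat(u) @ self.w

    def fit(self, x, y, lr=0.05, epochs=2000):
        n = len(x)
        for _ in range(epochs):
            u = (x[:, None] - self.b) / self.a
            phi = mexican_hat(u)
            err = phi @ self.w - y                # residual, shape (N,)
            dphi = mexican_hat_deriv(u)
            # Gradients of 0.5 * mean(err^2) w.r.t. w, b, a
            # (du/db = -1/a, du/da = -u/a)
            grad_w = phi.T @ err / n
            grad_b = -(self.w / self.a) * (dphi.T @ err) / n
            grad_a = -(self.w / self.a) * ((dphi * u).T @ err) / n
            self.w -= lr * grad_w
            self.b -= lr * grad_b
            self.a = np.maximum(self.a - lr * grad_a, 1e-2)  # keep widths positive

# Fit a toy 1-D signal (stand-in for a benchmark series like Mackey-Glass)
x = np.linspace(0.0, 1.0, 64)
y = np.sin(2.0 * np.pi * x)
net = AdaptiveWaveletNet(n_units=8)
mse_before = np.mean((net.forward(x) - y) ** 2)
net.fit(x, y)
mse_after = np.mean((net.forward(x) - y) ** 2)
```

After training, the fitted `b` and `a` show where the windows moved and how much they stretched; unlike a fixed dyadic basis, nothing prevents neighboring windows from overlapping.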