Abstract
We present a multi-start method with dynamic cutting of “unpromising” starts of local optimization algorithms for solving continuous unconstrained optimization problems. We also report the results of testing the proposed method on problems of feed-forward neural network training.
Funding
This work was financially supported by the Russian Foundation for Basic Research (RFBR), grant no. 19-07-00614.
Ethics declarations
The authors declare that they have no conflicts of interest.
Cite this article
Kostenko, V.A. Multi-Start Method with Cutting for Solving Problems of Unconditional Optimization. Opt. Mem. Neural Networks 29, 30–36 (2020). https://doi.org/10.3103/S1060992X20010099