
Multi-Start Method with Cutting for Solving Problems of Unconditional Optimization


Abstract

In this paper, we present a multi-start method with dynamic cutting of “unpromising” starts of local optimization algorithms for solving continuous unconstrained (unconditional) optimization problems. We also report the results of testing the proposed method on feed-forward neural network training problems.
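As an illustration of the general multi-start-with-cutting idea described in the abstract (the paper's exact cutting criterion is not reproduced here), the following Python sketch runs several random starts of a local method and periodically discards the worst-performing ones. All names (`multi_start_with_cutting`, `keep_fraction`, and so on), the use of plain gradient descent as the local algorithm, and the value-based pruning rule are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def multi_start_with_cutting(f, grad_f, dim, n_starts=20, n_rounds=10,
                             steps_per_round=50, lr=0.01, keep_fraction=0.5,
                             rng=None):
    """Hypothetical sketch: run many local-search starts and periodically
    cut the worst ones, so the remaining budget goes to promising starts."""
    rng = np.random.default_rng() if rng is None else rng
    # One random initial point per start.
    points = [rng.uniform(-1.0, 1.0, size=dim) for _ in range(n_starts)]

    for _ in range(n_rounds):
        # Advance every surviving start by a fixed budget of local steps
        # (plain gradient descent stands in for any local method here).
        for _ in range(steps_per_round):
            points = [x - lr * grad_f(x) for x in points]
        # Cut: keep only the most promising starts by current objective value.
        points.sort(key=f)
        n_keep = max(1, int(len(points) * keep_fraction))
        points = points[:n_keep]

    best = min(points, key=f)
    return best, f(best)

# Usage example on a simple quadratic objective (stand-in for a network loss).
if __name__ == "__main__":
    f = lambda x: float(np.sum((x - 3.0) ** 2))
    grad_f = lambda x: 2.0 * (x - 3.0)
    x_best, f_best = multi_start_with_cutting(f, grad_f, dim=5)
    print(x_best, f_best)
```

In a neural-network training setting, `f` would be the training loss over the network weights and `grad_f` its gradient; the cutting step frees compute from starts whose loss remains high.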




Funding

This work was financially supported by the Russian Foundation for Basic Research (RFBR), grant no. 19-07-00614.

Author information


Corresponding author

Correspondence to V. A. Kostenko.

Ethics declarations

The author declares that there are no conflicts of interest.

About this article


Cite this article

Kostenko, V.A. Multi-Start Method with Cutting for Solving Problems of Unconditional Optimization. Opt. Mem. Neural Networks 29, 30–36 (2020). https://doi.org/10.3103/S1060992X20010099
