
Global Optimization by Monotonic Transformation


Abstract

This paper addresses the problem of global optimization by means of a monotonic transformation. Based on an observation about the global optimality of functions under such a transformation, we show that a simple and effective algorithm can be derived to search within the regions that may contain the global optima. Numerical experiments on several benchmark problems compare this algorithm with one that does not incorporate the transformed information, and the results are further compared with the best-known global search algorithms in the literature. In addition, the algorithm is shown to be useful for several neural network learning problems, which possess much larger parameter spaces.
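The abstract only outlines the approach, so the following is a minimal, hypothetical sketch of the general idea rather than the paper's actual algorithm: a strictly monotonic transform of the objective preserves the locations of its global minimizers, so a transformed score can be used to rank sampled points, isolate promising regions, and refine each with a local optimizer. The function names (transformed_region_search, rastrigin), the exponential transform, and the sampling and threshold parameters below are all illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    """A standard multimodal benchmark; global minimum f(0) = 0."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))

def transformed_region_search(f, bounds, n_samples=2000, keep_frac=0.05, seed=0):
    """Sample the box, rank points by a monotonic transform of f, and
    locally refine the most promising ones.  Because the transform is
    strictly decreasing in f, it preserves every global minimizer while
    stretching the scale near low function values."""
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pts = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    vals = np.array([f(p) for p in pts])

    # Illustrative monotonic transform: larger is better, emphasises low f.
    scores = np.exp(-(vals - vals.min()))

    # Keep only the highest-scoring samples as candidate regions.
    keep = pts[np.argsort(-scores)[: max(1, int(keep_frac * n_samples))]]

    # Refine each candidate with a local optimizer and return the best result.
    best = None
    for p in keep:
        res = minimize(f, p, bounds=bounds, method="L-BFGS-B")
        if best is None or res.fun < best.fun:
            best = res
    return best

if __name__ == "__main__":
    bounds = [(-5.12, 5.12)] * 2
    result = transformed_region_search(rastrigin, bounds)
    print("approximate global minimum:", result.x, result.fun)
```

Any strictly decreasing transform would serve the same purpose in this sketch; the exponential form is used only because it compresses high function values and stretches the scale near the incumbent best, which makes promising regions easier to separate from the rest of the search space.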





Cite this article

Toh, KA. Global Optimization by Monotonic Transformation. Computational Optimization and Applications 23, 77–99 (2002). https://doi.org/10.1023/A:1019976724755
