Abstract
It is known that there are feasible algorithms for minimizing convex functions, and that for general functions, global minimization is a difficult (NP-hard) problem. It is reasonable to ask whether there exists a class of functions that is larger than the class of all convex functions for which we can still solve the corresponding minimization problems feasibly. In this paper, we prove, in essence, that no such more general class exists. In other words, we prove that global optimization is always feasible only for convex objective functions.
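As a minimal illustration of the feasibility side of this dichotomy (a sketch, not from the paper itself), ternary search finds the global minimum of a one-dimensional convex function on an interval to accuracy eps in O(log(1/eps)) function evaluations; in contrast, no such guarantee is possible for arbitrary objective functions. The function `ternary_search_min` and the example objective are illustrative choices, not notation from the paper.

```python
# Illustrative sketch: for a convex (hence unimodal) function of one variable,
# ternary search over [lo, hi] converges to the global minimizer, shrinking
# the search interval by a factor of 2/3 per iteration.

def ternary_search_min(f, lo, hi, eps=1e-9):
    """Return an approximate global minimizer of a convex function f on [lo, hi]."""
    while hi - lo > eps:
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if f(m1) < f(m2):
            hi = m2  # the minimizer lies in [lo, m2]
        else:
            lo = m1  # the minimizer lies in [m1, hi]
    return (lo + hi) / 2.0

if __name__ == "__main__":
    f = lambda x: (x - 2.0) ** 2 + 1.0  # convex, minimized at x = 2
    print(ternary_search_min(f, -10.0, 10.0))
```

The point of the sketch is only that convexity turns global minimization into a local, efficiently solvable problem; the paper's theorem says that, in a precise sense, convexity is the largest class of objectives for which this remains true.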
Cite this article
Kreinovich, V., Kearfott, R.B. Beyond Convex? Global Optimization is Feasible Only for Convex Objective Functions: A Theorem. J Glob Optim 33, 617–624 (2005). https://doi.org/10.1007/s10898-004-2120-1