Journal of Global Optimization, Volume 30, Issue 2, pp 253–270

Smooth Convex Approximation to the Maximum Eigenvalue Function



In this paper, we consider smooth convex approximations to the maximum eigenvalue function. To cover a wide class of applications, the study is conducted on the composition of the maximum eigenvalue function with a linear operator mapping ℝm to \({\mathcal{S}}_n \), the space of n-by-n symmetric matrices. This composite function arises naturally as the objective when minimizing the maximum eigenvalue function over an affine subspace of \({\mathcal{S}}_n \). The approximation leads to a sequence of smooth convex minimization problems governed by a smoothing parameter; as the parameter goes to zero, the original problem is recovered. We then develop a computable formula for the Hessian of the smooth convex functions, give a matrix representation of the Hessian, and study regularity conditions that guarantee the nonsingularity of the Hessian matrices. The study of the well-posedness of the smooth convex function leads to a regularization method that is globally convergent.
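A standard instance of such a smoothing (a sketch only; the paper's exact construction may differ) is the exponential, or log-sum-exp, smoothing of the eigenvalues, f_μ(X) = μ log Σ_i exp(λ_i(X)/μ). This function is smooth and convex, satisfies λ_max(X) ≤ f_μ(X) ≤ λ_max(X) + μ log n, and therefore recovers λ_max as μ → 0. The function name `lambda_max_smooth` below is illustrative, not from the paper:

```python
import numpy as np

def lambda_max_smooth(X, mu):
    """Smooth convex approximation to the maximum eigenvalue of a
    symmetric matrix X via exponential (log-sum-exp) smoothing:

        f_mu(X) = mu * log( sum_i exp(lambda_i(X) / mu) ).

    Satisfies lambda_max(X) <= f_mu(X) <= lambda_max(X) + mu*log(n),
    so f_mu -> lambda_max as mu -> 0.
    """
    eigs = np.linalg.eigvalsh(X)        # eigenvalues of the symmetric matrix X
    m = eigs.max()                      # shift for numerical stability
    return m + mu * np.log(np.sum(np.exp((eigs - m) / mu)))

# Example: a random symmetric matrix; the gap to the true lambda_max
# shrinks as the smoothing parameter mu decreases.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
X = (B + B.T) / 2
lam_max = np.linalg.eigvalsh(X).max()
for mu in (1.0, 0.1, 0.01):
    gap = lambda_max_smooth(X, mu) - lam_max
    assert 0.0 <= gap <= mu * np.log(4) + 1e-12
```

In the affine setting of the paper, one would apply this smoothing to X = A(y) for the linear operator A: ℝm → \({\mathcal{S}}_n \), yielding a smooth convex function of y whose minimizers approach those of the original max-eigenvalue problem as μ → 0.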

Keywords: Matrix representation · Spectral function · Symmetric function · Tikhonov regularization



Copyright information

© Kluwer Academic Publishers 2004

Authors and Affiliations

  1. Department of Mechanical and Industrial Engineering, University of Illinois at Urbana-Champaign, 224 Mechanical Engineering Building, MC-244, Urbana, USA
  2. School of Mathematics, University of Southampton, Southampton SO17 1BJ, Great Britain
  3. Department of Applied Mathematics, The Hong Kong Polytechnic University, Hung Hom, Kowloon, Hong Kong
