Large scale optimization problems often require an approximation to the Hessian matrix. If the Hessian matrix is sparse, then estimation by differences of gradients is attractive because the number of required differences is usually small compared to the dimension of the problem. The problem of estimating Hessian matrices by differences can be phrased as follows: Given the sparsity structure of a symmetric matrix A, obtain vectors d_1, d_2, …, d_p such that Ad_1, Ad_2, …, Ad_p determine A uniquely with p as small as possible. We approach this problem from a graph theoretic point of view and show that both direct and indirect approaches to this problem have a natural graph coloring interpretation. The complexity of the problem is analyzed and efficient practical heuristic procedures are developed. Numerical results illustrate the differences between the various approaches.
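To make the estimation problem concrete, the following is a minimal sketch of a direct approach for a tridiagonal sparsity pattern: columns whose nonzero patterns do not overlap are grouped together (a coloring of the associated graph), one difference vector d is formed per group, and each product Ad is approximated by a forward difference of gradients. The test function, step size, and all names (`grad`, `groups`, etc.) are illustrative assumptions, not taken from the paper.

```python
# Direct estimation of a sparse (tridiagonal) Hessian by gradient
# differences. For a tridiagonal matrix, columns j and k have disjoint
# nonzero patterns whenever |j - k| >= 3, so p = 3 difference vectors
# suffice regardless of the dimension n.

n = 6
h = 1e-6  # forward-difference step (illustrative choice)

def grad(x):
    # Gradient of f(x) = sum_i x_i^2 + sum_i x_i x_{i+1}; its Hessian is
    # tridiagonal with 2 on the diagonal and 1 on the off-diagonals.
    g = []
    for i in range(n):
        gi = 2.0 * x[i]
        if i > 0:
            gi += x[i - 1]
        if i < n - 1:
            gi += x[i + 1]
        g.append(gi)
    return g

# Group ("color") columns with non-overlapping patterns: j mod 3.
groups = [[j for j in range(n) if j % 3 == c] for c in range(3)]

x0 = [0.0] * n
g0 = grad(x0)
H = [[0.0] * n for _ in range(n)]
for cols in groups:
    d = [1.0 if j in cols else 0.0 for j in range(n)]
    xh = [x0[j] + h * d[j] for j in range(n)]
    gh = grad(xh)
    Hd = [(gh[i] - g0[i]) / h for i in range(n)]  # approximates A d
    for j in cols:
        # Within a group, row i of column j's pattern is touched by no
        # other column in the group, so the entry can be read off directly.
        for i in range(max(0, j - 1), min(n, j + 2)):
            H[i][j] = Hd[i]
```

Since f here is quadratic, the forward differences recover the Hessian entries essentially exactly; for general functions the step h trades off truncation against rounding error. Exploiting symmetry, as the paper's direct and indirect approaches do, can reduce p further.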

Key words

Graph Coloring · Estimation of Hessian Matrices · Sparsity · Differentiation · Numerical Differences · NP-Complete Problems · Unconstrained Minimization