Computational Economics, Volume 28, Issue 3, pp 277–290

Optimizing the GARCH Model – An Application of Two Global and Two Local Search Methods

  • Kwami Adanu

DOI: 10.1007/s10614-006-9048-0


Abstract

Results from our optimization exercise clearly show the advantage of the random search algorithms when the search for the global optimum is expected to be difficult. When the number of parameters in the model is relatively small (nine parameters), Differential Evolution performs better than the Genetic Algorithm; when the number of parameters is relatively large (fifteen parameters), the reverse is true. A comparison of the Quasi-Newton and Simplex methods shows that both the Quasi-Newton algorithm of SHAZAM and the simplex algorithm of MATLAB's fminsearch are sensitive to starting values. However, letting SHAZAM set its own starting values, or using the PRESAMP option to set them, produced the best results for SHAZAM. The general conclusion of this paper is that the choice of optimization technique for difficult problems like the one attempted here should be based on the attributes of the problem. When in doubt, multiple techniques should be applied and the estimated results compared.
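The global-versus-local comparison described above can be illustrated with a minimal sketch (not the paper's code, which used SHAZAM and MATLAB): fitting a GARCH(1,1) model by maximum likelihood with SciPy's differential evolution (a global random search needing only parameter bounds) and the Nelder-Mead simplex (a local search that depends on its starting point). The simulated series and the parameter values ω = 0.1, α = 0.1, β = 0.8 are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

rng = np.random.default_rng(0)

def simulate_garch(n, omega, alpha, beta, rng):
    """Simulate a GARCH(1,1) series: sigma2_t = omega + alpha*eps_{t-1}^2 + beta*sigma2_{t-1}."""
    eps = np.zeros(n)
    sigma2 = np.zeros(n)
    sigma2[0] = omega / (1.0 - alpha - beta)  # unconditional variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps

y = simulate_garch(500, 0.1, 0.1, 0.8, rng)  # illustrative parameter values

def neg_loglik(params, y):
    """Gaussian negative log-likelihood of a GARCH(1,1) model."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return 1e10  # penalize infeasible (non-stationary) parameter points
    n = len(y)
    sigma2 = np.empty(n)
    sigma2[0] = y.var()  # common initialization for the variance recursion
    for t in range(1, n):
        sigma2[t] = omega + alpha * y[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi) + np.log(sigma2) + y ** 2 / sigma2)

bounds = [(1e-6, 1.0), (0.0, 0.5), (0.0, 0.999)]  # omega, alpha, beta

# Global search: differential evolution explores the bounded space,
# no starting values required.
de = differential_evolution(neg_loglik, bounds, args=(y,),
                            seed=1, popsize=10, maxiter=60)

# Local search: the Nelder-Mead simplex starts from a single point
# and is sensitive to that choice of starting values.
nm = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], args=(y,),
              method='Nelder-Mead')

print("DE estimate:", de.x, "neg. log-lik:", de.fun)
print("NM estimate:", nm.x, "neg. log-lik:", nm.fun)
```

Rerunning the simplex from a poor starting point (e.g. `x0=[0.9, 0.4, 0.5]`) typically degrades its result, whereas differential evolution is unaffected by starting values, mirroring the sensitivity the paper reports for the local methods.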

Keywords

GARCH · global optimum · genetic algorithm · differential evolution · Quasi-Newton algorithm · Simplex method

Copyright information

© Springer Science + Business Media, Inc. 2006

Authors and Affiliations

  • Kwami Adanu
  1. Michigan State University, East Lansing, USA