Handbook of Optimization pp 395-422
Theory and Applications of Hybrid Simulated Annealing
Local optimization techniques such as gradient-based methods and the expectation-maximization algorithm have the advantage of fast convergence but do not guarantee convergence to the global optimum. On the other hand, global optimization techniques based on stochastic approaches, such as evolutionary algorithms and simulated annealing, offer the possibility of global convergence, albeit at the expense of greater computational cost. This chapter demonstrates how these two approaches can be effectively combined for improved convergence speed and solution quality. In particular, a hybrid method, called hybrid simulated annealing (HSA), is presented, in which a simulated annealing algorithm is combined with local optimization methods. First, its general procedure and mathematical convergence properties are described. Then, two example applications are presented, namely, optimization of hidden Markov models for visual speech recognition and optimization of radial basis function networks for pattern classification, to show how the HSA algorithm can be adopted for solving real-world problems effectively. As an appendix, source code for multi-dimensional Cauchy random number generation is provided, which is essential for implementing the presented method.
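The general procedure outlined above, global exploration by simulated annealing with multi-dimensional Cauchy neighbor generation interleaved with local refinement, can be sketched as follows. This is a minimal illustration under stated assumptions, not the chapter's algorithm: the 1/k annealing schedule, the `local_every` refinement interval, and the finite-difference local optimizer are all choices made here for the sketch; in the chapter's applications the local optimizer would be a problem-specific method such as Baum-Welch/EM for HMMs or gradient-based training for RBF networks. The Cauchy step uses the standard construction of a multivariate Cauchy sample as a Gaussian vector divided by the magnitude of an independent standard normal (a multivariate t-distribution with one degree of freedom).

```python
import numpy as np

def cauchy_step(dim, temperature, rng):
    """Sample a multivariate Cauchy step: a Gaussian vector divided by
    the magnitude of an independent standard normal (multivariate t, 1 d.o.f.)."""
    g = rng.standard_normal(dim)
    return temperature * g / abs(rng.standard_normal())

def local_opt(f, x, lr=0.1, steps=50, eps=1e-6):
    """Finite-difference gradient descent, a stand-in for any
    problem-specific local optimizer (e.g. EM / Baum-Welch)."""
    x = np.asarray(x, dtype=float).copy()
    for _ in range(steps):
        grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                         for e in np.eye(x.size)])
        x -= lr * grad
    return x

def hybrid_simulated_annealing(f, refine, x0, t0=1.0, n_iter=2000,
                               local_every=200, seed=0):
    """Minimize f by simulated annealing with Cauchy-distributed neighbors,
    calling the local optimizer `refine` every `local_every` iterations."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    fx = f(x)
    best, fbest = x.copy(), fx
    for k in range(1, n_iter + 1):
        t = t0 / k                              # fast annealing schedule (assumed)
        y = x + cauchy_step(x.size, t, rng)
        fy = f(y)
        # Metropolis acceptance: always take improvements,
        # take uphill moves with probability exp(-(fy - fx) / t).
        if fy < fx or rng.random() < np.exp(-(fy - fx) / t):
            x, fx = y, fy
        if k % local_every == 0:                # hybrid step: local refinement
            x = refine(f, x)
            fx = f(x)
        if fx < fbest:
            best, fbest = x.copy(), fx
    return best, fbest
```

On a toy objective such as f(x) = ||x - 1||², the loop settles near the global minimum at (1, ..., 1); the periodic local refinement is what gives the hybrid its speed advantage over plain simulated annealing, while the heavy-tailed Cauchy steps preserve occasional long jumps out of local basins.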