Abstract
The simulated annealing algorithm was proven in Chapter 4 to converge asymptotically to minimum-cost configurations, provided the number of transitions made at each temperature is sufficiently large. When applied to Hopfield neural networks, the resulting stochastic machines also have the desirable property of not becoming trapped in local minima, because simulated annealing allows the state of a stochastic machine to evolve through perturbations. This advantage, however, comes at the expense of the excessive computation required for the stochastic relaxation of variables: the process of searching for the equilibrium state of a machine at each temperature is lengthy. By replacing the stochastic neurons with a mean field approximation, a technique widely used in statistical physics [62], [158], a faster relaxation to thermal equilibrium can be expected. That is, the stochastic binary neurons of the stochastic machines are replaced by deterministic continuous ones, and a set of deterministic updating equations is used in place of the stochastic updating process of simulated annealing. Although this approximation does not guarantee that the search reaches a global minimum, it provides a good means of finding near-optimal solutions with much less computing effort. The concept of mean field approximation is not new, but it was C. Peterson and his colleagues [128], [129], [130], [131], [132] who first introduced and applied it to neural networks for solving optimization problems. A substantial body of literature [15], [19], [32], [45], [80], [97], [119], [120], [144], [149], [162], [165], [166], [167], [176], [177] has since covered mean field annealing, its implementations, and its applications in great detail.
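As a rough illustration of the deterministic updating described above, the following minimal Python sketch replaces stochastic binary neurons with continuous mean activations updated by the sigmoidal mean field equation v_i = tanh(h_i / T), relaxed to a fixed point at each temperature of a geometric cooling schedule. This is not code from the chapter; the function name, parameter values, and cooling schedule are illustrative assumptions, though the update rule itself is the standard mean-field form.

```python
import math

def mean_field_anneal(weights, biases, t_init=10.0, t_min=0.1,
                      cooling=0.9, max_sweeps=100, tol=1e-4):
    """Deterministic mean-field relaxation (sketch, not the book's code).

    weights: symmetric connection matrix w[i][j] with w[i][i] == 0
    biases:  external input to each neuron
    Returns the final continuous neuron states v_i in (-1, 1).
    """
    n = len(biases)
    v = [0.0] * n  # mean activations; the biases break the symmetry of v = 0
    t = t_init
    while t > t_min:
        # Relax the deterministic equations to (approximate) equilibrium
        # at the current temperature, instead of stochastic sampling.
        for _ in range(max_sweeps):
            max_delta = 0.0
            for i in range(n):
                # Mean field felt by neuron i from all other neurons.
                h = biases[i] + sum(weights[i][j] * v[j] for j in range(n))
                new_v = math.tanh(h / t)  # sigmoidal mean-field update
                max_delta = max(max_delta, abs(new_v - v[i]))
                v[i] = new_v
            if max_delta < tol:  # converged at this temperature
                break
        t *= cooling  # lower the temperature geometrically
    return v

# Toy usage: two mutually excitatory neurons with positive bias
# should both settle near +1 at low temperature.
w = [[0.0, 1.0], [1.0, 0.0]]
b = [0.5, 0.5]
states = mean_field_anneal(w, b)
```

Because every update is deterministic, each temperature step costs only a few sweeps over the network, in contrast to the lengthy stochastic sampling a Boltzmann-style machine would need to estimate the same averages.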
Copyright information
© 1997 Springer Science+Business Media New York
Cite this chapter
Ansari, N., Hou, E. (1997). Mean Field Annealing. In: Computational Intelligence for Optimization. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-6331-0_5
Print ISBN: 978-1-4613-7907-2
Online ISBN: 978-1-4615-6331-0