Abstract
This chapter surveys a number of recent results in evolutionary optimization. In particular, we show that the search step size of a variation operator plays a vital role in how efficiently it searches a landscape. We derive the optimal search step size of mutation operators in evolutionary optimization and, based on this theoretical analysis, develop several new evolutionary algorithms that significantly outperform existing evolutionary algorithms on many benchmark functions.
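The role of step size can be illustrated by the mutation distributions themselves. The sketch below (illustrative only, not the chapter's exact algorithm; function names and parameters are our own) contrasts a Gaussian mutation, whose steps are almost always small, with a Cauchy mutation, whose heavy tails occasionally produce very large steps that can help escape poor local optima:

```python
import math
import random

def gaussian_mutation(x, sigma=1.0):
    # Classical-EP-style mutation: each component is perturbed by a
    # Gaussian step of scale sigma; large jumps are exponentially rare.
    return [xi + random.gauss(0.0, sigma) for xi in x]

def cauchy_mutation(x, scale=1.0):
    # Fast-EP-style mutation: a standard Cauchy variate is generated by
    # tan(pi * (U - 0.5)) for uniform U; its heavy tails make occasional
    # long jumps far more likely than under the Gaussian.
    return [xi + scale * math.tan(math.pi * (random.random() - 0.5))
            for xi in x]
```

Sampling many steps from both operators and comparing how often |step| exceeds, say, 3 times the scale parameter makes the difference in expected step size concrete.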
Most existing work in evolutionary optimization concentrates on different variation (i.e., search) operators, such as crossover and mutation. However, a complex problem can sometimes be solved more effectively by first transforming it into a simpler one and then solving that. The key issue is how to approximate the problem without changing its nature (i.e., the optima we wish to find). This chapter also presents the latest results on landscape approximation and hybrid evolutionary algorithms.
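A minimal sketch of the landscape-approximation idea in one dimension: fit a quadratic model through three sampled points of the fitness landscape and move to the vertex of the model. This is a generic parabolic-interpolation step (the helper name and formula are ours, not taken from the chapter), shown only to make the "approximate, then optimize the approximation" idea concrete:

```python
def quadratic_minimum(p1, p2, p3):
    """Fit a parabola through three (x, f(x)) samples and return the
    x-coordinate of its vertex -- a simple 1-D landscape approximation.
    Assumes the three x-values are distinct and the fit is not degenerate."""
    (x1, f1), (x2, f2), (x3, f3) = p1, p2, p3
    num = (x2 - x1) ** 2 * (f2 - f3) - (x2 - x3) ** 2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    # Standard successive-parabolic-interpolation vertex formula.
    return x2 - 0.5 * num / den

# Example: three samples of f(x) = (x - 1)^2; the fitted parabola's
# vertex recovers the true minimizer x = 1 exactly, since f is quadratic.
step = quadratic_minimum((0.0, 1.0), (0.5, 0.25), (2.0, 1.0))
```

In a hybrid evolutionary algorithm, such a model-based step can serve as a cheap local-search move applied to promising individuals between generations.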
© 2003 Springer-Verlag Berlin Heidelberg
Yao, X., Liu, Y., Liang, KH., Lin, G. (2003). Fast Evolutionary Algorithms. In: Ghosh, A., Tsutsui, S. (eds) Advances in Evolutionary Computing. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-18965-4_2
Print ISBN: 978-3-642-62386-8
Online ISBN: 978-3-642-18965-4