
Single-Funnel and Multi-Funnel Landscapes and Subthreshold-Seeking Behavior

  • Darrell Whitley
  • Jonathan Rowe
Chapter
Part of the Natural Computing Series book series (NCS)

Abstract

Algorithms for parameter optimization display subthreshold-seeking behavior when the majority of the points that the algorithm samples have an evaluation less than some target threshold. Subthreshold-seeking algorithms avoid the curse of the general and Sharpened No Free Lunch theorems in the sense that they are better than random enumeration on a specific (but general) family of functions. In order for subthreshold-seeking search to be possible, most of the solutions that are below threshold must be localized in one or more regions of the search space. Functions with search landscapes that can be characterized as single-funnel or multi-funnel landscapes have this localized property. We first analyze a simple “Subthreshold-Seeker” algorithm. Further theoretical analysis details conditions that would allow a Hamming neighborhood local search algorithm using a Gray or binary representation to display subthreshold-seeking behavior. A very simple modification to local search is proposed that improves its subthreshold-seeking behavior.
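
To make the idea concrete, the sketch below implements a minimal subthreshold-seeking search over bit strings. It is an illustration only, not the chapter's Subthreshold-Seeker: the uniform-sampling fallback, the Hamming-distance-1 exploitation rule, and all function names are assumptions made for this example.

```python
import random

def subthreshold_seeker(f, n_bits, threshold, budget):
    """Minimal subthreshold-seeking search over bit strings (illustrative sketch).

    Points are sampled uniformly at random until one evaluates below
    `threshold`; the search then concentrates on the Hamming-distance-1
    neighborhood of below-threshold points, reverting to uniform sampling
    whenever that frontier is exhausted.  Points may be re-evaluated; this
    sketch does no duplicate bookkeeping.
    """
    def random_point():
        return tuple(random.getrandbits(1) for _ in range(n_bits))

    def hamming_neighbors(x):
        # All points at Hamming distance 1 from x.
        for i in range(len(x)):
            y = list(x)
            y[i] ^= 1
            yield tuple(y)

    below = []      # below-threshold points sampled so far
    frontier = []   # unexplored neighbors of below-threshold points
    for _ in range(budget):
        x = frontier.pop() if frontier else random_point()
        if f(x) < threshold:
            below.append(x)
            frontier.extend(hamming_neighbors(x))
    return below

# Hypothetical usage: the evaluation counts ones, so below-threshold points
# (fewer than 5 ones out of 20 bits) are rare but clustered around the
# all-zeros string -- the kind of localization the abstract requires.
hits = subthreshold_seeker(lambda x: sum(x), n_bits=20, threshold=5, budget=2000)
```

When most below-threshold points are localized, as in single-funnel or multi-funnel landscapes, the frontier keeps supplying below-threshold samples, so the fraction of subthreshold evaluations can exceed what random enumeration would achieve.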

Keywords

Single Funnel · Free Lunch Result · Local Search Algorithm · Hamming Neighborhood · Search Space

Notes

Acknowledgements

This research was sponsored by the Air Force Office of Scientific Research, Air Force Materiel Command, USAF, under grant number FA9550-07-1-0403. The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation thereon. We would also like to thank Dagstuhl for giving us the chance to meet and exchange ideas.

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. Department of Computer Science, Colorado State University, Fort Collins, USA
  2. Department of Computer Science, University of Birmingham, Birmingham, UK
