
A Time-Sensitive System for Black-Box Combinatorial Optimization

  • Vinhthuy Phan
  • Pavel Sumazin
  • Steven Skiena
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2409)

Abstract

When faced with a combinatorial optimization problem, practitioners often turn to black-box search heuristics such as simulated annealing and genetic algorithms. In black-box optimization, the problem-specific components are limited to functions that (1) generate candidate solutions, and (2) evaluate the quality of a given solution. A primary reason for the popularity of black-box optimization is its ease of implementation. The basic simulated annealing search algorithm can be implemented in roughly 30–50 lines of any modern programming language, not counting the problem-specific local-move and cost-evaluation functions. This search algorithm is so simple that it is often rewritten from scratch for each new application rather than being reused.
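The basic loop described above can be sketched in a few dozen lines. This is an illustrative sketch, not the paper's own code: the two problem-specific black-box components, here a toy `cost` (count of 1-bits, to be minimized) and `random_move` (flip a random bit), are hypothetical stand-ins that a practitioner would replace for a real problem, as are the cooling parameters.

```python
import math
import random

def cost(solution):
    # Toy black-box objective: minimize the number of 1-bits.
    return sum(solution)

def random_move(solution):
    # Toy black-box move: flip one randomly chosen bit.
    neighbor = list(solution)
    i = random.randrange(len(neighbor))
    neighbor[i] = 1 - neighbor[i]
    return neighbor

def simulated_annealing(initial, temperature=10.0, cooling=0.995,
                        min_temperature=1e-3):
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    while temperature > min_temperature:
        candidate = random_move(current)
        candidate_cost = cost(candidate)
        delta = candidate_cost - current_cost
        # Always accept improvements; accept a worse candidate with
        # probability exp(-delta / temperature), which shrinks as the
        # temperature cools.
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        temperature *= cooling
    return best, best_cost

random.seed(0)
solution, value = simulated_annealing([1] * 20)
```

Everything outside `cost` and `random_move` is problem-independent, which is why such search drivers lend themselves to reuse rather than per-application rewriting.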

Keywords

Simulated Annealing, Search Heuristic, Combinatorial Optimization Problem, Vertex Cover, Greedy Heuristic

Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Vinhthuy Phan (1)
  • Pavel Sumazin (1)
  • Steven Skiena (1)
  1. State University of New York at Stony Brook, Stony Brook, USA