
Offline Learning for Selection Hyper-heuristics with Elman Networks

  • William B. Yates
  • Edward C. Keedwell
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10764)

Abstract

Offline selection hyper-heuristics are machine learning methods that are trained on heuristic selections to create an algorithm tuned for a particular problem domain. In this work, a simple selection hyper-heuristic is executed on a number of computationally hard benchmark optimisation problems, and the resulting sequences of low-level heuristic selections and objective function values are used to construct an offline learning database. An Elman network is trained on sequences of heuristic selections chosen from the offline database, and the network's ability to learn and generalise from these sequences is evaluated. The networks are trained using a leave-one-out cross-validation methodology, and the sequences of heuristic selections they produce are tested on benchmark problems drawn from the HyFlex set. The results demonstrate that the Elman network is capable of intra-domain learning and generalisation with 99% confidence, and in many cases it produces better results than the training sequences. When trained on an inter-domain training set, however, the Elman network did not exhibit generalisation, indicating that inter-domain generalisation is a harder problem and that strategies learned on one domain cannot necessarily be transferred to another.
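
The abstract outlines the core learning step: an Elman (simple recurrent) network is trained on logged sequences of low-level heuristic selections so that it can reproduce and generalise such sequences. The following is a minimal sketch of that idea only, not the authors' implementation: the number of low-level heuristics, the hidden-layer size, the one-step gradient update, and the toy training sequence are all illustrative assumptions.

    # Minimal sketch (not the paper's code): an Elman-style simple recurrent
    # network trained to predict the next low-level heuristic index in a
    # selection sequence. Sizes and the training sequence are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    N_HEURISTICS = 8    # assumed number of low-level heuristics in the domain
    HIDDEN = 16         # assumed hidden/context layer size
    LR = 0.1

    # Weights: input->hidden, context (previous hidden state)->hidden, hidden->output
    W_in = rng.normal(0, 0.1, (HIDDEN, N_HEURISTICS))
    W_ctx = rng.normal(0, 0.1, (HIDDEN, HIDDEN))
    W_out = rng.normal(0, 0.1, (N_HEURISTICS, HIDDEN))

    def one_hot(i, n=N_HEURISTICS):
        v = np.zeros(n)
        v[i] = 1.0
        return v

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    def train_on_sequence(seq, epochs=200):
        """Teach the network to map each selection to its successor,
        using a truncated (one-step) gradient update for brevity."""
        global W_in, W_ctx, W_out
        for _ in range(epochs):
            context = np.zeros(HIDDEN)          # Elman context units
            for cur, nxt in zip(seq[:-1], seq[1:]):
                x = one_hot(cur)
                h = np.tanh(W_in @ x + W_ctx @ context)
                y = softmax(W_out @ h)
                dy = y - one_hot(nxt)           # cross-entropy gradient
                dh = (W_out.T @ dy) * (1.0 - h ** 2)
                W_out -= LR * np.outer(dy, h)
                W_in -= LR * np.outer(dh, x)
                W_ctx -= LR * np.outer(dh, context)
                context = h                     # copy hidden state to context

    def generate(start, length=10):
        """Roll the trained network forward to emit a selection sequence."""
        context = np.zeros(HIDDEN)
        seq = [start]
        for _ in range(length - 1):
            x = one_hot(seq[-1])
            h = np.tanh(W_in @ x + W_ctx @ context)
            seq.append(int(np.argmax(softmax(W_out @ h))))
            context = h
        return seq

    # Toy offline "database": one logged selection sequence (illustrative only)
    train_on_sequence([0, 3, 3, 1, 4, 0, 3, 3, 1, 4, 0, 3, 3, 1, 4])
    print(generate(start=0, length=10))

In the paper's setting the training sequences would instead be drawn from the offline database built from hyper-heuristic runs, and the generated selection sequences would be evaluated on the HyFlex benchmark problems under leave-one-out cross-validation.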

Keywords

Hyper-heuristics · Elman networks · Offline learning

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Computer Science, College of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter, UK
