Cluster Computing, Volume 22, Supplement 3, pp 7165–7179

Deja Vu: a hyper heuristic framework with Record and Recall (2R) modules

  • Hammad Majeed
  • Samina Naz


Despite the success of heuristic methods in solving real-world problems, they remain difficult to apply to newly encountered problems, or even to new instances of similar problems. In addition, our limited understanding of why different heuristics work effectively (or not) in certain situations makes it hard to choose which approach to use in which situation. This paper proposes a new hyper heuristic framework named Deja Vu to address these issues. As the name suggests, it retrieves the stored solutions of already solved problems and reuses them for new but similar problems. This makes our system efficient and knowledge rich. The performance of Deja Vu is tested on data sets of varying difficulty, and it has shown promising results in almost all cases.


Keywords: Hyper heuristics · Online learning · Knowledge rich framework · Record and Recall · Problem similarity
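The Record and Recall (2R) idea described in the abstract — store the solution of each solved problem alongside its characteristics, then reuse the stored solution when a sufficiently similar problem arrives — can be illustrated with a minimal sketch. This is not the paper's implementation; the class name, feature representation, and similarity threshold below are illustrative assumptions.

```python
import math

class RecordRecall:
    """Illustrative sketch of a Record-and-Recall store: solved problems are
    recorded with a numeric feature vector, and a new problem recalls the
    stored solution of the most similar past problem, if one is close enough."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold  # maximum feature distance for a "similar" match
        self.records = []           # list of (features, solution) pairs

    def record(self, features, solution):
        """Store the solution of a solved problem with its feature vector."""
        self.records.append((tuple(features), solution))

    def recall(self, features):
        """Return the solution of the nearest recorded problem, or None if
        no recorded problem lies within the similarity threshold."""
        best, best_dist = None, float("inf")
        for feats, solution in self.records:
            dist = math.dist(features, feats)  # Euclidean distance over features
            if dist < best_dist:
                best, best_dist = solution, dist
        return best if best_dist <= self.threshold else None

store = RecordRecall(threshold=0.5)
store.record((1.0, 2.0), "heuristic_A")
store.record((5.0, 5.0), "heuristic_B")

print(store.recall((1.1, 2.1)))  # close to the first record -> heuristic_A
print(store.recall((9.0, 9.0)))  # no similar record -> None
```

In this sketch, a recalled solution would seed (or replace) the search for the new instance, while a `None` result falls back to solving from scratch and recording the outcome for future reuse.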



Copyright information

© Springer Science+Business Media, LLC 2017

Authors and Affiliations

  1. Department of Computer Science, National University of Computer and Emerging Sciences, Islamabad, Pakistan
