Using Datamining Techniques to Help Metaheuristics: A Short Survey

  • Laetitia Jourdan
  • Clarisse Dhaenens
  • El-Ghazali Talbi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4030)


Hybridizing metaheuristics has become a common way to improve the efficiency of optimization methods. Most hybridizations combine several optimization methods with one another. In this paper we are interested in another type of hybridization, in which data-mining approaches are embedded within an optimization process. We therefore study the benefits of combining metaheuristics and data mining through a short survey that enumerates the different opportunities for such combinations, based on examples from the literature.


Keywords: Genetic Algorithm, Local Search, Association Rule, Evolutionary Computation, Short Survey




References

  1. Agrawal, R., Srikant, R.: Fast algorithms for mining association rules. In: Bocca, J.B., Jarke, M., Zaniolo, C. (eds.) Proceedings of the 20th International Conference on Very Large Data Bases (VLDB), pp. 487–499. Morgan Kaufmann, San Francisco (1994)
  2. Baluja, S.: Population-based incremental learning. Technical Report CMU-CS-94-163, Carnegie Mellon University (1994)
  3. Chan, Z.S.H., Kasabov, N.: Gene trajectory clustering with a hybrid genetic algorithm and expectation maximization method. In: IEEE International Joint Conference on Neural Networks, pp. 1669–1674 (2004)
  4. Dalboni, F.L., Ochi, L.S., Drummond, L.M.A.: On improving evolutionary algorithms by using data mining for the oil collector vehicle routing problem. In: International Network Optimization Conference (2003)
  5. Falkenauer, E.: A new representation and operators for genetic algorithms applied to grouping problems. Evolutionary Computation 2(2), 123–144 (1994)
  6. Gaspar-Cunha, A., Vieira, A.S.: A hybrid multi-objective evolutionary algorithm using an inverse neural network. In: Hybrid Metaheuristics, pp. 25–30 (2004)
  7. Hall, L.O., Özyurt, I.B., Bezdek, J.C.: Clustering with a genetically optimized approach. IEEE Transactions on Evolutionary Computation 3(2), 103–112 (1999)
  8. Handa, H., Baba, N., Katai, O., Sawaragi, T.: Coevolutionary genetic algorithm with effective exploration and exploitation of useful schemata. In: Proceedings of the International Conference on Neural Information Systems, vol. 1, pp. 424–427 (1997)
  9. Handa, H., Horiuchi, T., Katai, O., Baba, M.: A novel hybrid framework of coevolutionary GA and machine learning. International Journal of Computational Intelligence and Applications (2002)
  10. Handa, H., Horiuchi, T., Katai, O., Kaneko, T., Konishi, T., Baba, M.: Fusion of coevolutionary GA and machine learning techniques through effective schema extraction. In: Spector, L., Goodman, E.D., Wu, A., Langdon, W.B., Voigt, H.-M., Gen, M., Sen, S., Dorigo, M., Pezeshk, S., Garzon, M.H., Burke, E. (eds.) Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 2001), July 7–11, 2001, p. 764. Morgan Kaufmann, San Francisco (2001)
  11. Handl, J., Knowles, J.: Improvements to the scalability of multiobjective clustering. In: IEEE Congress on Evolutionary Computation, pp. 438–445 (2005)
  12. Hong, T.P., Wang, H., Chen, W.: Simultaneously applying multiple mutation operators in genetic algorithms. Journal of Heuristics 6, 439–455 (2000)
  13. Huyet, A.-L.: Extraction de connaissances pertinentes sur le comportement des systèmes de production: une approche conjointe par optimisation évolutionniste via simulation et apprentissage [Extracting relevant knowledge on the behavior of production systems: a joint approach combining evolutionary optimization via simulation and learning]. PhD thesis, Université Blaise Pascal Clermont II (October 2004)
  14. Huyet, A.-L., Paris, J.-L.: Configuration and analysis of a multiproduct kanban system using evolutionary optimisation coupled to machine learning. In: Proceedings of CESA 2003, the IMACS Multiconference on Computational Engineering in Systems Applications (July 2003). ISBN 2-9512309-5-8 (CD-ROM)
  15. Jin, Y.: A comprehensive survey of fitness approximation in evolutionary computation. Soft Computing Journal 9(1), 3–12 (2005)
  16. Jin, Y., Sendhoff, B.: Reducing fitness evaluations using clustering techniques and neural network ensembles. In: Deb, K., et al. (eds.) GECCO 2004. LNCS, vol. 3102, pp. 688–699. Springer, Heidelberg (2004)
  17. Jourdan, L., Corne, D., Savic, D.A., Walters, G.A.: Preliminary investigation of the learnable evolution model for faster/better multiobjective water systems design. In: Coello Coello, C.A., Hernández Aguirre, A., Zitzler, E. (eds.) EMO 2005. LNCS, vol. 3410, pp. 841–855. Springer, Heidelberg (2005)
  18. Kim, H.-S., Cho, S.-B.: An efficient genetic algorithm with less fitness evaluation by clustering. In: Proceedings of the IEEE Congress on Evolutionary Computation, pp. 887–894. IEEE, Los Alamitos (2001)
  19. Larrañaga, P., Lozano, J.A.: Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Kluwer Academic Publishers, Dordrecht (2002)
  20. Louis, S.J.: Genetic learning from experience. In: Congress on Evolutionary Computation (CEC 2003), Australia, December 2003, pp. 2118–2125. IEEE, Los Alamitos (2003)
  21. Louis, S.J.: Learning for evolutionary design. In: Proceedings of the 2003 NASA/DoD Conference on Evolvable Hardware, July 2003, pp. 17–23 (2003)
  22. Michalski, R.S.: Learnable evolution model: Evolutionary processes guided by machine learning. Machine Learning 38(1–2), 9–40 (2000)
  23. Michalski, R.S., Cervone, G., Kaufman, K.A.: Speeding up evolution through learning: LEM. In: Intelligent Information Systems 2000, pp. 243–256 (2000)
  24. Michalski, R.S., Larson, J.B.: Selection of most representative training examples and incremental generation of VL1 hypotheses: The underlying methodology and the descriptions of programs ESEL and AQ11. Technical Report 867, Department of Computer Science, University of Illinois, Urbana, Illinois (1978)
  25. Michalski, R.S., Mozetic, I., Hong, J., Lavrac, N.: The multi-purpose incremental learning system AQ15 and its testing application to three medical domains. In: Proceedings of the Fifth National Conference on Artificial Intelligence, pp. 1041–1045. Morgan Kaufmann, PA (1986)
  26. Mühlenbein, H., Paass, G.: From recombination of genes to the estimation of distributions: I. Binary parameters. In: Ebeling, W., Rechenberg, I., Voigt, H.-M., Schwefel, H.-P. (eds.) PPSN 1996. LNCS, vol. 1141, pp. 178–187. Springer, Heidelberg (1996)
  27. Pelikan, M., Goldberg, D.E., Cantú-Paz, E.: BOA: The Bayesian optimization algorithm. In: Banzhaf, W., Daida, J., Eiben, A.E., Garzon, M.H., Honavar, V., Jakiela, M., Smith, R.E. (eds.) Proceedings of the Genetic and Evolutionary Computation Conference (GECCO 1999), Orlando, FL, vol. I, pp. 525–532. Morgan Kaufmann, San Francisco (1999)
  28. Ramsey, C., Grefenstette, J.: Case-based initialization of genetic algorithms. In: Fifth International Conference on Genetic Algorithms, pp. 84–91 (1993)
  29. Rasheed, K.: An incremental-approximate-clustering approach for developing dynamic reduced models for design optimization. In: Proceedings of the 2000 Congress on Evolutionary Computation (CEC 2000), pp. 6–9. IEEE Press, Los Alamitos (2000)
  30. Rasheed, K., Hirsh, H.: Using case-based learning to improve genetic-algorithm-based design optimization. In: Bäck, T. (ed.) Proceedings of the Seventh International Conference on Genetic Algorithms (ICGA 1997). Morgan Kaufmann, San Francisco (1997)
  31. Rasheed, K., Hirsh, H.: Informed operators: Speeding up genetic-algorithm-based design optimization using reduced models. In: Whitley, L.D., Goldberg, D.E., Cantú-Paz, E., Spector, L., Parmee, I.C., Beyer, H.-G. (eds.) GECCO 2000, pp. 628–635. Morgan Kaufmann, San Francisco (2000)
  32. Rasheed, K., Vattam, S., Ni, X.: Comparison of methods for developing dynamic reduced models for design optimization. In: Proceedings of the Congress on Evolutionary Computation (CEC 2002), pp. 390–395 (2002)
  33. Ravise, C., Sebag, M.: An advanced evolution should not repeat its past errors. In: International Conference on Machine Learning, pp. 400–408 (1996)
  34. Ravise, C., Sebag, M., Schoenauer, M.: A genetic algorithm led by induction.
  35. Reynolds, R.G., Michalewicz, Z., Cavaretta, M.J.: Using cultural algorithms for constraint handling in GENOCOP. In: Evolutionary Programming, pp. 289–305 (1995)
  36. Reynolds, R.G., Peng, B.: Cultural algorithms: computational modeling of how cultures learn to solve problems: an engineering example. Cybernetics and Systems 36(8), 753–771 (2005)
  37. Ribeiro, M., Plastino, A., Martins, S.: Hybridization of GRASP metaheuristic with data mining techniques. Journal of Mathematical Modelling and Algorithms 5(1), 23–41 (2006) (special issue on hybrid metaheuristics)
  38. Ribeiro, M., Trindade, V., Plastino, A., Martins, S.: Hybridization of GRASP metaheuristic with data mining techniques. In: Workshop on Hybrid Metaheuristics, 16th European Conference on Artificial Intelligence (ECAI), pp. 69–78 (2004)
  39. Santos, H.G., Ochi, L.S., Marinho, E.H., Drummond, L.M.A.: Combining an evolutionary algorithm with data mining to solve a vehicle routing problem. Neurocomputing (to appear, 2006)
  40. Santos, L., Ribeiro, M., Plastino, A., Martins, S.: A hybrid GRASP with data mining for the maximum diversity problem. In: Blesa, M.J., Blum, C., Roli, A., Sampels, M. (eds.) HM 2005. LNCS, vol. 3636, pp. 116–128. Springer, Heidelberg (2005)
  41. Sebag, M., Ravise, C., Schoenauer, M.: Controlling evolution by means of machine learning. In: Evolutionary Programming, pp. 57–66 (1996)
  42. Sebag, M., Schoenauer, M.: Controlling crossover through inductive learning. In: Davidor, Y., Schwefel, H.-P., Männer, R. (eds.) Parallel Problem Solving from Nature – PPSN III, pp. 209–218. Springer, Berlin (1994)
  43. Sebag, M., Schoenauer, M., Ravise, C.: Toward civilized evolution: Developing inhibitions. In: Bäck, T. (ed.) Proceedings of the Seventh International Conference on Genetic Algorithms, pp. 291–298. Morgan Kaufmann, San Francisco (1997)
  44. Talbi, E.-G.: A taxonomy of hybrid metaheuristics. Journal of Heuristics 8(2), 541–564 (2002)
  45. Vermeulen-Jourdan, L., Corne, D., Savic, D.A., Walters, G.A.: Hybridising rule induction and multi-objective evolutionary search for optimising water distribution systems. In: Proceedings of the Fourth International Conference on Hybrid Intelligent Systems (HIS 2004), pp. 434–439 (2004)
  46. Vermeulen-Jourdan, L., Dhaenens, C., Talbi, E.-G.: Clustering nominal and numerical data: A new distance concept for a hybrid genetic algorithm. In: Gottlieb, J., Raidl, G.R. (eds.) EvoCOP 2004. LNCS, vol. 3004, pp. 220–229. Springer, Heidelberg (2004)
  47. Yoo, S.-H., Cho, S.-B.: Partially evaluated genetic algorithm based on fuzzy c-means algorithm. In: Yao, X., Burke, E.K., Lozano, J.A., Smith, J., Merelo-Guervós, J.J., Bullinaria, J.A., Rowe, J.E., Tiňo, P., Kabán, A., Schwefel, H.-P. (eds.) PPSN 2004. LNCS, vol. 3242, pp. 440–449. Springer, Heidelberg (2004)
  48. Zitzler, E., Thiele, L.: Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation 3(4), 257–271 (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Laetitia Jourdan (1)
  • Clarisse Dhaenens (2)
  • El-Ghazali Talbi (1, 2)

  1. INRIA Futurs, Villeneuve d'Ascq, France
  2. LIFL, CNRS, Université de Lille I, Villeneuve d'Ascq, France
