
The Boon of Gene-Culture Interaction for Effective Evolutionary Multitasking

  • Bingshui Da
  • Abhishek Gupta
  • Yew Soon Ong
  • Liang Feng
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9592)

Abstract

Multifactorial optimization (MFO) is a recently proposed paradigm for evolutionary multitasking, inspired by the possibility of harnessing underlying synergies between outwardly unrelated optimization problems through implicit genetic transfer. In contrast to traditional single-objective and multi-objective optimization, which consider only a single problem in one optimization run, MFO aims to solve multiple optimization problems simultaneously. Through comprehensive empirical studies, MFO has demonstrated notable performance on a variety of complex optimization problems. In this paper, we take a step towards better understanding the means by which MFO leads to the observed performance improvement. In particular, since (a) genetic and (b) cultural transmission across generations form the crux of the proposed evolutionary multitasking engine, we focus on how their interaction (i.e., gene-culture interaction) affects the overall efficacy of this novel paradigm.
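
To make the gene-culture interaction described above concrete, the sketch below shows a minimal multifactorial EA loop: individuals in a unified search space carry a "skill factor" (the task they are evaluated on), assortative mating allows cross-task crossover with some probability (genetic transfer), and offspring imitate one parent's skill factor (vertical cultural transmission). This is an illustrative sketch only, not the authors' implementation: the two toy tasks, the unified [0, 1]^D encoding, the random mating probability of 0.3, and the per-task elitist survival step (used here in place of scalar fitness computed from factorial ranks) are all assumptions made for the example.

```python
# Minimal sketch of a multifactorial EA illustrating gene-culture interaction:
# assortative mating (genetic transfer across tasks) plus vertical cultural
# transmission of the skill factor from parent to child.
import random

D, POP, GENS, RMP = 10, 60, 100, 0.3  # dimensionality, population size, generations, random mating probability

# Two outwardly unrelated toy tasks defined on a unified space [0, 1]^D (assumed for illustration).
def task0(x):  # shifted sphere
    return sum((xi - 0.3) ** 2 for xi in x)

def task1(x):  # shifted sum of absolute deviations
    return sum(abs(xi - 0.7) for xi in x)

TASKS = [task0, task1]

def evaluate(ind):
    # Each individual is evaluated only on the task given by its skill factor.
    ind["fit"] = TASKS[ind["skill"]](ind["x"])

def crossover(p, q):  # uniform crossover in the unified space
    return [pi if random.random() < 0.5 else qi for pi, qi in zip(p, q)]

def mutate(x, rate=1.0 / D):  # Gaussian mutation, clipped to [0, 1]
    return [min(1.0, max(0.0, xi + random.gauss(0, 0.1))) if random.random() < rate else xi
            for xi in x]

# Initialise: half the population is assigned to each task (its skill factor).
pop = [{"x": [random.random() for _ in range(D)], "skill": i % 2} for i in range(POP)]
for ind in pop:
    evaluate(ind)

for _ in range(GENS):
    offspring = []
    while len(offspring) < POP:
        pa, pb = random.sample(pop, 2)
        if pa["skill"] == pb["skill"] or random.random() < RMP:
            # Assortative mating: cross-task crossover occurs only with probability RMP.
            child_x = mutate(crossover(pa["x"], pb["x"]))
            # Vertical cultural transmission: the child imitates one parent's skill factor.
            skill = random.choice([pa["skill"], pb["skill"]])
        else:
            # Otherwise the child is a mutant of one parent and inherits its skill factor.
            parent = random.choice([pa, pb])
            child_x, skill = mutate(parent["x"]), parent["skill"]
        child = {"x": child_x, "skill": skill}
        evaluate(child)
        offspring.append(child)
    # Elitist survival per task: a simple stand-in for scalar fitness based on factorial ranks.
    merged = pop + offspring
    pop = []
    for t in range(len(TASKS)):
        group = sorted([i for i in merged if i["skill"] == t], key=lambda i: i["fit"])
        pop.extend(group[: POP // len(TASKS)])

for t in range(len(TASKS)):
    best = min((i for i in pop if i["skill"] == t), key=lambda i: i["fit"])
    print(f"task {t}: best objective = {best['fit']:.4f}")
```

Raising RMP in this sketch increases the rate of cross-task genetic transfer, while the imitation step controls how cultural bias is propagated; their interplay is the gene-culture interaction examined in the paper.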

Keywords

Assortative Mating · Benchmark Function · Search Region · Optimization Task · Cultural Bias

Notes

Acknowledgement

This work was conducted within the Rolls-Royce@NTU Corporate Lab with support from the National Research Foundation (NRF) Singapore under the Corp Lab@University Scheme.


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Bingshui Da (1)
  • Abhishek Gupta (1)
  • Yew Soon Ong (2)
  • Liang Feng (3)
  1. Computational Intelligence Lab, School of Computer Engineering, Nanyang Technological University, Singapore, Singapore
  2. Rolls-Royce@NTU Corporate Lab, c/o School of Computer Engineering, Nanyang Technological University, Singapore, Singapore
  3. College of Computer Science, Chongqing University, Chongqing, China
