Sequential Knowledge Transfer Across Problems
In this chapter, we build upon the foundations of Chap. 4 to develop a theoretically principled optimization algorithm in the image of an adaptive memetic automaton. For the most part, we retain the abstract interpretation of memes as computationally encoded probabilistic building-blocks of knowledge that can be learned from one task and spontaneously transmitted (for reuse) to another. Most importantly, we assume that the set of all tasks faced by the memetic automaton is presented sequentially, such that the transfer of memes occurs in a unidirectional manner, from the past to the present. A central challenge in this setting is that, given a diverse pool of memes accumulated over time, an appropriate selection and integration of (source) memes must be carried out in order to induce a search bias suited to the ongoing target task of interest. To this end, we propose a mixture modeling approach capable of adaptively integrating all available knowledge memes online, driven entirely by the data generated during the course of the search. Our proposal is particularly well-suited to black-box optimization problems where task-specific datasets may not be available for offline assessments. We conclude the chapter by illustrating how the basic idea of online mixture modeling extends to the case of computationally expensive problems as well.
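To make the online integration concrete, the sketch below shows one way the mixture coefficients over a pool of source memes could be adapted from search data alone. It assumes, purely for illustration, that each meme is encoded as a multivariate Gaussian search distribution and that only the mixture weights are learned (the component densities stay fixed, in the spirit of stacked density estimation); the names `fit_mixture_weights`, `memes`, and `samples` are hypothetical, not the chapter's notation.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_mixture_weights(samples, memes, n_iters=50, tol=1e-8):
    """EM-style estimation of mixture coefficients over a fixed pool
    of source memes (illustrative sketch; memes are assumed to be
    multivariate Gaussian search distributions).

    samples: (n, d) array of solutions generated during the ongoing
             target search.
    memes:   list of (mean, cov) tuples, one per source meme.
    Returns the adapted mixture weights, one per meme.
    """
    k = len(memes)
    # Precompute component likelihoods: L[i, j] = p_j(x_i).
    L = np.column_stack([
        multivariate_normal.pdf(samples, mean=m, cov=c) for m, c in memes
    ])
    w = np.full(k, 1.0 / k)  # start from a uniform mixture
    for _ in range(n_iters):
        # E-step: responsibility of meme j for sample i.
        R = L * w
        R /= R.sum(axis=1, keepdims=True)
        # M-step: with fixed components, the updated weights are
        # simply the average responsibilities.
        w_new = R.mean(axis=0)
        if np.max(np.abs(w_new - w)) < tol:
            return w_new
        w = w_new
    return w

if __name__ == "__main__":
    # Two hypothetical source memes; the search data happens to match
    # the second one, so its weight should dominate after adaptation.
    memes = [(np.zeros(2), np.eye(2)),
             (np.full(2, 3.0), 0.5 * np.eye(2))]
    samples = np.random.multivariate_normal(
        np.full(2, 3.0), 0.5 * np.eye(2), size=100)
    print(fit_mixture_weights(samples, memes))
```

In a full algorithm of this kind, the adapted weights would bias the sampling of new candidate solutions toward the most relevant source distributions, while memes receiving negligible weight are effectively screened out, which is how data-driven mixture modeling guards against an unsuitable search bias.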