Abstract
The problem of optimizing unknown costly-to-evaluate functions has been studied extensively in the context of Bayesian optimization. Algorithms in this field aim to find the optimizer of the function by requesting only a few function evaluations at carefully selected locations. An ideal algorithm should maintain a perfect balance between exploration (probing unexplored areas) and exploitation (focusing on promising areas) within the given evaluation budget. In this paper, we assume the unknown function is Lipschitz continuous. Leveraging the Lipschitz property, we propose an algorithm with a distinct exploration phase followed by an exploitation phase. The exploration phase aims to select samples that shrink the search space as much as possible, while the exploitation phase focuses on the reduced search space and selects samples closest to the optimizer. We empirically show that the proposed algorithm significantly outperforms the baseline algorithms.
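The search-space shrinking idea described in the abstract can be sketched as follows. This is a minimal 1-D illustration of the general Lipschitz-bound principle, not the paper's actual algorithm: if f is Lipschitz with constant L, every observed sample (x_i, y_i) implies f(x) ≤ y_i + L·|x − x_i|, so any candidate point whose upper bound falls below the best observed value cannot contain the maximizer and can be discarded. The function names and the assumption of a known L are illustrative.

```python
import numpy as np

def lipschitz_bounds(x, xs, ys, L):
    """Lipschitz lower/upper bounds on f at 1-D query points x,
    given observed samples (xs, ys) and Lipschitz constant L."""
    dists = np.abs(x[:, None] - xs[None, :])          # pairwise |x - x_i|
    upper = np.min(ys[None, :] + L * dists, axis=1)   # tightest upper bound
    lower = np.max(ys[None, :] - L * dists, axis=1)   # tightest lower bound
    return lower, upper

def shrink_search_space(grid, xs, ys, L):
    """Keep only candidate points whose Lipschitz upper bound can
    still beat the best value observed so far (maximization)."""
    _, upper = lipschitz_bounds(grid, xs, ys, L)
    return grid[upper >= ys.max()]

# Toy usage: f(x) = -(x - 0.3)^2 on [0, 1], Lipschitz constant <= 1.4.
f = lambda x: -(x - 0.3) ** 2
grid = np.linspace(0.0, 1.0, 201)
xs = np.array([0.0, 0.5, 1.0])           # three initial evaluations
reduced = shrink_search_space(grid, xs, f(xs), L=1.4)
```

After just three evaluations the pruned grid is strictly smaller than the original while provably still containing the true maximizer at x = 0.3; an exploration phase in this spirit selects new samples to shrink the surviving region further before exploitation begins.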
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Jalali, A., Azimi, J., Fern, X., Zhang, R. (2013). A Lipschitz Exploration-Exploitation Scheme for Bayesian Optimization. In: Blockeel, H., Kersting, K., Nijssen, S., Železný, F. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2013. Lecture Notes in Computer Science, vol 8188. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40988-2_14
DOI: https://doi.org/10.1007/978-3-642-40988-2_14
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-40987-5
Online ISBN: 978-3-642-40988-2