
Cascade Bayesian Optimization

Part of the Lecture Notes in Computer Science book series (LNAI, volume 9992)

Abstract

Multi-stage cascade processes are common, especially in the manufacturing industry: precursors or raw materials are transformed at each stage before being passed as input to the next. Setting the right control parameters at each stage is important for producing high-quality products at low cost, and finding these parameters by trial and error can be time consuming. Bayesian optimization is an efficient way to optimize costly black-box functions. We extend the standard Bayesian optimization approach to the cascade process by formulating a series of optimization problems that are solved sequentially from the final stage back to the first. Epistemic uncertainties are effectively utilized in the formulation, and the cost of the parameters is also included to find cost-efficient solutions. Experiments on a simulated testbed of Al-Sc heat treatment through a three-stage process showed considerable efficiency gains over a naïve optimization approach.
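The building block the abstract extends is the standard Bayesian optimization loop: fit a Gaussian process surrogate to the evaluations so far, maximize an expected improvement (EI) acquisition to pick the next setting, evaluate, and repeat. The sketch below shows that single-stage loop on a toy 1-D objective; the kernel, length scale, objective, and all names are illustrative assumptions, not the authors' cascade formulation.

```python
import math

import numpy as np


def rbf_kernel(a, b, lengthscale=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)


def gp_posterior(X, y, X_star, jitter=1e-6):
    """GP posterior mean and std at X_star given observations (X, y)."""
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = 1.0 - np.sum(v ** 2, axis=0)  # k(x*, x*) = 1 for this kernel
    return mu, np.sqrt(np.maximum(var, 1e-12))


def expected_improvement(mu, sigma, best):
    """EI for maximization: E[max(f - best, 0)] under the GP posterior."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(t / math.sqrt(2.0)) for t in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (mu - best) * cdf + sigma * pdf


def objective(x):
    """Toy black-box 'process' to maximize; optimum at x = 0.6."""
    return -(x - 0.6) ** 2


rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 3)          # a few initial random designs
y = objective(X)
grid = np.linspace(0.0, 1.0, 201)     # candidate control settings
for _ in range(10):                   # BO loop: fit GP, maximize EI, evaluate
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))
```

In the cascade setting above, a surrogate like this would be fitted per stage and the acquisition problems solved backwards from the final stage to the first; that sequential formulation is the paper's contribution and is not reproduced in this sketch.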

Keywords

  • Bayesian Optimization
  • Cascade Process
  • Epistemic Uncertainty
  • Expected Improvement (EI)



Figs. 1–3 (figure images and captions are not included in this extract).


Acknowledgement

This work is partially supported by the Telstra-Deakin Centre of Excellence in Big Data and Machine Learning.

Author information


Corresponding author

Correspondence to Thanh Dai Nguyen.


Rights and permissions

Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 2.5 International License (http://creativecommons.org/licenses/by-nc/2.5/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.

The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Nguyen, T.D. et al. (2016). Cascade Bayesian Optimization. In: Kang, B., Bai, Q. (eds.) AI 2016: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol. 9992. Springer, Cham. https://doi.org/10.1007/978-3-319-50127-7_22


  • DOI: https://doi.org/10.1007/978-3-319-50127-7_22


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-50126-0

  • Online ISBN: 978-3-319-50127-7

  • eBook Packages: Computer Science (R0)