
Uncovering Performance Envelopes Through Optimum Design of Tests

  • Tapabrata Ray
  • Ahsanul Habib
  • Hemant Kumar Singh
  • Michael Ryan
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11320)

Abstract

Test and evaluation is a process used to determine whether a product/system satisfies its performance specifications across its entire operating regime. The operating regime is typically defined by factors such as terrain types/sea-states/altitudes, weather conditions, operating speeds, etc., and involves multiple performance metrics. With each test being expensive to conduct and multiple factors and performance metrics under consideration, the design of a test and evaluation schedule is far from trivial. Design of experiments (DOE) continues to be the most prevalent approach for deriving test plans, although there is significant opportunity to improve this practice through optimization. In this paper, we introduce a surrogate-assisted optimization approach to uncover the performance envelope with a small number of tests. The approach relies on principles of decomposition to deal with multiple performance metrics and employs a bi-directional search along each reference vector to identify the best and worst performance simultaneously. To limit the number of tests, the search is guided by multiple surrogate models. At every iteration, the approach delivers a test plan involving at most \(K_T\) tests, and the information acquired is used to generate future test plans. To evaluate the performance of the proposed approach, a set of scalable test functions with various Pareto front characteristics and objective-space bias is introduced. The performance of the approach is quantitatively assessed and compared with two popular DOE strategies, namely Latin Hypercube Sampling (LHS) and Full Factorial Design (FFD). Further, we demonstrate its practical use on a simulated catapult system.
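
The abstract outlines an iterative, batch test-planning loop: decompose the multi-metric space with reference vectors, fit surrogates to the tests run so far, search along each vector in both directions for predicted best and worst performance, and commit at most \(K_T\) new tests per iteration. The sketch below illustrates that kind of loop only; it is not the authors' implementation. The stand-in run_test function, the RBF surrogate, the weighted-sum scalarisation along each reference vector, and all parameter values are assumptions made purely for illustration.

```python
# Illustrative sketch of batch, surrogate-assisted, decomposition-based test
# selection with a bi-directional (best/worst) search along reference vectors.
# Everything below is a hypothetical stand-in, not the paper's algorithm.
import numpy as np

rng = np.random.default_rng(0)

def run_test(x):
    """Stand-in for one expensive physical test returning two performance metrics."""
    return np.array([np.sum(x**2), np.sum((x - 1.0)**2)])

def rbf_predict(X_train, y_train, X_query, eps=1.0):
    """Tiny RBF interpolation surrogate (one model per performance metric)."""
    d = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=2)
    Phi = np.exp(-(eps * d) ** 2)
    w = np.linalg.solve(Phi + 1e-8 * np.eye(len(X_train)), y_train)
    dq = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    return np.exp(-(eps * dq) ** 2) @ w

n_var, n_obj, K_T, n_iter = 3, 2, 4, 5
# Reference vectors for two metrics (Das-Dennis-style uniform weights).
ref_vectors = np.stack([np.linspace(0, 1, 6), 1 - np.linspace(0, 1, 6)], axis=1)

# Initial tests (a space-filling design such as LHS would normally be used here).
X = rng.random((8, n_var))
F = np.array([run_test(x) for x in X])

for it in range(n_iter):
    cand = rng.random((500, n_var))                      # cheap candidate factor settings
    F_hat = np.column_stack([rbf_predict(X, F[:, m], cand) for m in range(n_obj)])
    # Normalise predictions so the scalarisation treats both metrics comparably.
    F_norm = (F_hat - F_hat.min(axis=0)) / (np.ptp(F_hat, axis=0) + 1e-12)
    picks = []
    for v in ref_vectors:
        s = F_norm @ v                                   # weighted sum along this vector
        picks.append(cand[np.argmin(s)])                 # predicted "best" direction
        picks.append(cand[np.argmax(s)])                 # predicted "worst" direction
    picks = np.unique(np.array(picks), axis=0)[:K_T]     # commit at most K_T new tests
    F_new = np.array([run_test(x) for x in picks])       # run the expensive tests
    X, F = np.vstack([X, picks]), np.vstack([F, F_new])

print("tests run:", len(X))
```

In practice the surrogates, scalarisation and candidate generation would be considerably more sophisticated; the sketch only conveys how a batch of at most \(K_T\) tests per iteration can be steered towards both ends of the performance envelope.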

Keywords

Design of tests · Performance envelope · Multi-objective optimization

Notes

Acknowledgments

The authors would like to acknowledge Defence Related Research (DRR) grant from the University of New South Wales (UNSW), Canberra, Australia.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Tapabrata Ray (1)
  • Ahsanul Habib (1)
  • Hemant Kumar Singh (1)
  • Michael Ryan (1)
  1. School of Engineering and Information Technology, The University of New South Wales, Canberra, Australia
