
Metamodels for Computer-based Engineering Design: Survey and recommendations

Abstract

The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today’s engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper, we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning and kriging. We survey their existing application in engineering design, and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations, and how common pitfalls can be avoided.
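The paper itself contains no code, but the core idea the abstract describes can be illustrated with a short sketch. The snippet below is a minimal illustration, assuming a second-order polynomial response surface fitted by least squares: sample a (hypothetical) expensive analysis at a designed set of points, fit a cheap approximation, and then evaluate the approximation in place of the original code. The `expensive_analysis` function and the grid design are placeholders for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical stand-in for an expensive analysis code; in practice this
# would be a finite-element or CFD run, not a cheap closed-form function.
def expensive_analysis(x1, x2):
    return np.sin(x1) + 0.5 * x2**2 + 0.1 * x1 * x2

# Design of experiments: a simple 5x5 full-factorial grid over the design space.
levels = np.linspace(-2.0, 2.0, 5)
X1, X2 = np.meshgrid(levels, levels)
x1, x2 = X1.ravel(), X2.ravel()
y = expensive_analysis(x1, x2)          # run the "expensive" code at each design point

# Second-order response surface:
# y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# The fitted metamodel is now a cheap surrogate for the analysis code.
def metamodel(x1, x2):
    return coeffs @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

print(metamodel(0.5, -1.0), expensive_analysis(0.5, -1.0))
```

Kriging, neural networks, or the other approximation techniques surveyed in the paper would slot in at the same fitting step, with the choice of experimental design and model form following the recommendations discussed in the text.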



Cite this article

Simpson, T., Poplinski, J., Koch, P. et al. Metamodels for Computer-based Engineering Design: Survey and recommendations. Engineering with Computers 17, 129–150 (2001). https://doi.org/10.1007/PL00007198
