Model Differentiation

Abstract

In the previous chapters, we showed that an inverse problem ultimately becomes an optimization problem, regardless of the type of framework (deterministic or statistical) used to formulate it. Gradient-based algorithms are efficient for solving optimization problems, but they require derivatives of the objective function as inputs. In this chapter, we consider methods for obtaining derivatives of a generic function defined by a model or by a computer code.
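The chapter body is not reproduced here, but as a minimal illustration of the kind of derivative computation the abstract refers to, the sketch below approximates the gradient of a generic objective function by one-sided finite differences, treating the function as a black box (for example, a misfit evaluated by running a model code). The function name, step size, and the quadratic test objective are assumptions chosen for illustration, not material taken from the book.

```python
import numpy as np

def finite_difference_gradient(f, x, h=1e-6):
    """Approximate the gradient of a scalar function f at x by
    forward finite differences.

    f : callable mapping a 1-D numpy array to a float
        (e.g., an objective defined by running a model code)
    x : 1-D numpy array, point at which the gradient is needed
    h : step size (trade-off between truncation and round-off error)
    """
    x = np.asarray(x, dtype=float)
    f0 = f(x)                       # one baseline evaluation
    grad = np.zeros_like(x)
    for i in range(x.size):         # one extra evaluation per parameter
        x_step = x.copy()
        x_step[i] += h
        grad[i] = (f(x_step) - f0) / h
    return grad

if __name__ == "__main__":
    # Hypothetical quadratic objective with a known analytic gradient
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    obj = lambda x: 0.5 * x @ A @ x - b @ x   # gradient is A @ x - b
    x0 = np.array([0.5, -0.25])
    print(finite_difference_gradient(obj, x0))
    print(A @ x0 - b)                          # reference for comparison
```

Note that each gradient evaluation costs one model run per parameter, which is why alternatives such as sensitivity-equation, adjoint, and automatic-differentiation methods are attractive for models with many parameters.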

Author information

Correspondence to Ne-Zheng Sun.

Copyright information

© 2015 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Sun, NZ., Sun, A. (2015). Model Differentiation. In: Model Calibration and Parameter Estimation. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-2323-6_5
