Part of the book series: Texts in Computational Science and Engineering ((TCSE,volume 13))

Abstract

The problem central to this chapter, and the one that follows, is easy to state: given a function \(F(\textbf{v})\), find the point \(\textbf{v}_m\in \mathbb{R}^n\) where F achieves its minimum value. This can be written as
\[ F(\textbf{v}_m)= \min_{\textbf{v}\in \mathbb{R}^n} F(\textbf{v}). \]
The point \(\textbf{v}_m\) is called the minimizer, or optimal solution, and the function \(F(\textbf{v})\) is referred to as the objective function, the error function, or the cost function, depending on the application. Also, because the minimization is over all of \(\mathbb{R}^n\), this is a problem in unconstrained optimization.

This chapter concerns what is known as regression, which involves finding a function that best fits a set of data points. Typical examples are shown in Figure 8.1. To carry out the regression analysis, you need to specify two functions: the model function and the error function. You then need to choose the method used to compute the minimizer.

We begin with the model function, which means we specify the function that will be fitted to the data. Examples are:

Linear Regression
  • \( g(x)= v_1+v_2x \)
  • \( g(x)= v_1+v_2x+v_3x^2+v_4x^3 \)
  • \( g(x)= v_1e^{x} + v_2e^{-x} \)
  • \( g(x)= \frac{v_1}{1+x}+v_2\sqrt{1+x} \)

Non-linear Regression
  • \( g(x) = v_1+v_2e^{v_3x} \) (asymptotic regression function)
  • \( g(x)= \frac{v_1 x}{v_2+x} \) (Michaelis-Menten function)
  • \( g(x)= \frac{v_1}{1+v_2\exp(-v_3x)} \) (logistic function)

In the above functions, the \(v_j\)'s are the parameters determined by fitting the function to the given data, and x is the independent variable in the problem. What determines whether the function is linear or nonlinear is how it depends on the \(v_j\)'s. Specifically, linear regression means that the model function g(x) is a linear function of the \(v_j\)'s.
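
To make the distinction concrete, here is a minimal sketch, not the book's own code, of fitting one linear and one nonlinear model from the lists above using NumPy and SciPy. The data arrays and the initial guess p0 are hypothetical, invented purely for illustration.

```python
# A minimal sketch contrasting linear and nonlinear regression.
# The data points below are hypothetical, chosen only to illustrate the API.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data points (x_i, y_i) to be fitted.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([0.8, 1.1, 1.9, 2.4, 3.1, 3.4, 4.2])

# Linear regression: g(x) = v1 + v2*x. Because g depends linearly on the
# v_j's, the least-squares minimizer solves a linear system, here via
# lstsq applied to the design matrix A whose columns are 1 and x.
A = np.column_stack([np.ones_like(x), x])
v, *_ = np.linalg.lstsq(A, y, rcond=None)
print("linear fit:   v1 = %.4f, v2 = %.4f" % (v[0], v[1]))

# Nonlinear regression: the logistic model g(x) = v1/(1 + v2*exp(-v3*x)).
# g depends nonlinearly on v2 and v3, so an iterative least-squares
# solver is required, starting from the (assumed) initial guess p0.
def logistic(x, v1, v2, v3):
    return v1 / (1.0 + v2 * np.exp(-v3 * x))

p, _ = curve_fit(logistic, x, y, p0=[5.0, 4.0, 1.0])
print("logistic fit: v1 = %.4f, v2 = %.4f, v3 = %.4f" % tuple(p))
```

The point of the sketch is the structural difference: the linear fit reduces to solving a linear system in one step, while the nonlinear fit needs an iterative method and an initial guess, a distinction developed in the rest of the chapter.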


Author information


Correspondence to Mark H. Holmes.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Holmes, M.H. (2023). Optimization: Regression. In: Introduction to Scientific Computing and Data Analysis. Texts in Computational Science and Engineering, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-031-22430-0_8
