Abstract
The problem central to this chapter, and the one that follows, is easy to state: given a function \(F(\textbf{v})\), find the point \(\textbf{v}_m\in \mathbb {R}^n\) where \(F\) achieves its minimum value. This can be written as
\[ F(\textbf{v}_m)= \min _{ \textbf{v}\in \mathbb {R}^n} F(\textbf{v}). \]
The point \(\textbf{v}_m\) is called the minimizer or optimal solution, and the function \(F(\textbf{v})\) is referred to as the objective function, the error function, or the cost function, depending on the application. Also, because the minimization is over all of \(\mathbb {R}^n\), this is a problem in unconstrained optimization.

This chapter concerns what is known as regression, which involves finding a function that best fits a set of data points. Typical examples are shown in Figure 8.1. To carry out the regression analysis, you need to specify two functions: the model function and the error function. You then need to specify what method to use to calculate the minimizer.

We begin with the model function, which means we specify the function that will be fitted to the data. Examples are:

Linear Regression
\[ g(x)= v_1+v_2x \]
\[ g(x)= v_1+v_2x+v_3x^2 +v_4x^3 \]
\[ g(x)= v_1e^{x} + v_2e^{-x} \]
\[ g(x)= \frac{v_1}{1+x}+v_2\sqrt{1+x} \]

Non-linear Regression
\[ g(x) = v_1+v_2e^{v_3x} \quad \text{(asymptotic regression function)} \]
\[ g(x)= \frac{v_1 x}{v_2+x} \quad \text{(Michaelis-Menten function)} \]
\[ g(x)= \frac{v_1}{1+v_2\exp (-v_3x)} \quad \text{(logistic function)} \]

In the above functions, the \(v_j\)'s are the parameters determined by fitting the function to the given data, and \(x\) is the independent variable in the problem. What determines whether the function is linear or nonlinear is how it depends on the \(v_j\)'s. Specifically, linear regression means that the model function \(g(x)\) is a linear function of the \(v_j\)'s.
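The practical consequence of linearity in the \(v_j\)'s is that the least-squares minimizer can be found by solving a linear system, with no iteration required. As a minimal sketch (the data values and the choice of `numpy.linalg.lstsq` are illustrative assumptions, not taken from the chapter), fitting the first linear model \(g(x)=v_1+v_2x\) looks like this:

```python
import numpy as np

# Hypothetical data points (x_i, y_i); any data set of this form would do.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Because g(x) = v1 + v2*x is linear in the parameters v1, v2, the
# least-squares problem reduces to the overdetermined linear system
# A v ~ y, where the columns of A are the basis functions 1 and x
# evaluated at the data points.
A = np.column_stack([np.ones_like(x), x])

# Solve min ||A v - y||_2 directly; no iterative method is needed.
v, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print(v)  # the fitted parameters [v1, v2]
```

For the nonlinear models, such as the Michaelis-Menten or logistic functions, no such linear system exists, and the minimizer must instead be computed by an iterative method; this is the distinction the chapter develops.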
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Holmes, M.H. (2023). Optimization: Regression. In: Introduction to Scientific Computing and Data Analysis. Texts in Computational Science and Engineering, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-031-22430-0_8
Print ISBN: 978-3-031-22429-4
Online ISBN: 978-3-031-22430-0