Extended Optimality Criteria for Optimum Design in Nonlinear Regression
Among the major difficulties that one may encounter when estimating parameters in a nonlinear regression model are the non-uniqueness of the estimator, its instability with respect to small perturbations of the observations, and the presence of local optimizers of the estimation criterion. We show that these estimability issues can be taken into account at the design stage, through the definition of suitable design criteria. Extensions of the E-, c-, and G-optimality criteria are considered which, when evaluated at a given θ0 (locally optimal design), account for the behavior of the model response η(θ) for θ far from θ0. In particular, they ensure some protection against close-to-overlapping situations, where ∥η(θ)−η(θ0)∥ is small for some θ far from θ0. These extended criteria are concave, their directional derivatives can be computed, and necessary and sufficient conditions for optimality (Equivalence Theorems) can be formulated. They are not differentiable, but a relaxation based on maximum-entropy regularization is proposed to obtain concave and differentiable alternatives. When the design space is finite and the set of admissible θ is discretized, their optimization reduces to a linear programming problem.
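The last two points of the abstract can be illustrated numerically. The sketch below, which is our own illustration and not code from the paper, uses a hypothetical one-parameter exponential-decay model on a small finite design space. It evaluates an extended-E-type criterion of the form min over a discretized admissible set of θ of ∑ᵢ wᵢ [η(xᵢ,θ)−η(xᵢ,θ0)]²/∥θ−θ0∥², its smooth log-sum-exp (maximum-entropy-style) surrogate, and the linear program that maximizes the nonsmooth criterion over the design weights w; all model choices, grids, and the regularization parameter are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical model: exponential decay eta(x, theta) = exp(-theta * x).
def eta(x, theta):
    return np.exp(-theta * x)

x_grid = np.array([0.5, 1.0, 2.0, 4.0])          # candidate design points (assumed)
theta0 = 1.0                                     # nominal value for the local design
thetas = np.linspace(0.2, 5.0, 200)              # discretized admissible parameter set
thetas = thetas[np.abs(thetas - theta0) > 1e-6]  # exclude theta0 itself

# D[k, i] = [eta(x_i, theta_k) - eta(x_i, theta0)]^2 / (theta_k - theta0)^2
D = (eta(x_grid[None, :], thetas[:, None]) - eta(x_grid[None, :], theta0)) ** 2 \
    / (thetas[:, None] - theta0) ** 2            # shape (K, n)

def phi_min(w):
    """Extended-E-type criterion: min_k sum_i w_i D[k, i] (concave, nonsmooth)."""
    return np.min(D @ w)

def phi_softmin(w, lam=50.0):
    """Smooth concave surrogate via log-sum-exp; lam is an assumed tuning constant."""
    v = D @ w
    return -np.log(np.mean(np.exp(-lam * v))) / lam

# LP formulation of max_w min_k (D w)_k over the probability simplex:
#   maximize t  subject to  (D w)_k >= t for all k,  sum_i w_i = 1,  w >= 0.
n, K = len(x_grid), len(thetas)
c = np.zeros(n + 1); c[-1] = -1.0                # linprog minimizes, so minimize -t
A_ub = np.hstack([-D, np.ones((K, 1))])          # t - (D w)_k <= 0 for each k
b_ub = np.zeros(K)
A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
b_eq = np.array([1.0])
bounds = [(0, None)] * n + [(None, None)]        # w >= 0, t free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w_opt, t_opt = res.x[:n], res.x[-1]
```

At the LP optimum, t equals the nonsmooth criterion value phi_min(w_opt), and the optimal weights dominate any other feasible design (e.g. the uniform one) in this worst-case sense; the smooth surrogate phi_softmin approaches phi_min as lam grows.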