Extended Optimality Criteria for Optimum Design in Nonlinear Regression

Conference paper
Part of the Contributions to Statistics book series (CONTRIB.STAT.)


Among the major difficulties one may encounter when estimating the parameters of a nonlinear regression model are the non-uniqueness of the estimator, its instability with respect to small perturbations of the observations, and the presence of local optimizers of the estimation criterion. We show that these estimability issues can be taken into account at the design stage through the definition of suitable design criteria. Extensions of the E-, c- and G-optimality criteria are considered which, when evaluated at a given θ0 (locally optimal design), account for the behavior of the model response η(θ) for θ far from θ0. In particular, they provide some protection against close-to-overlapping situations, where ∥η(θ)−η(θ0)∥ is small for some θ far from θ0. These extended criteria are concave, their directional derivatives can be computed, and necessary and sufficient conditions for optimality (Equivalence Theorems) can be formulated. They are not differentiable, but a relaxation based on maximum-entropy regularization yields concave and differentiable alternatives. When the design space is finite and the set of admissible θ is discretized, their optimization reduces to a linear programming problem.
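To illustrate the maximum-entropy regularization mentioned in the abstract, the sketch below smooths a minimum over a finite set of values with a log-sum-exp term, in the spirit of entropic regularization of min-max problems. The function name, the averaging form, and the choice of regularization parameter `lam` are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def entropic_min(values, lam):
    """Smooth, differentiable approximation of min(values) via
    maximum-entropy (log-sum-exp) regularization:
        -(1/lam) * log(mean(exp(-lam * values))).
    With this averaged form the result is >= min(values) and
    decreases to the exact minimum as lam grows."""
    v = np.asarray(values, dtype=float)
    m = v.min()  # shift for numerical stability of the exponentials
    return m - np.log(np.mean(np.exp(-lam * (v - m)))) / lam

# The approximation tightens as lam increases:
vals = [3.0, 1.0, 2.5]
approx_coarse = entropic_min(vals, 10.0)   # above min(vals) = 1.0
approx_fine = entropic_min(vals, 100.0)    # much closer to 1.0
```

Because the log-sum-exp term is smooth in the underlying values, a criterion defined as a minimum over a discretized parameter set becomes differentiable after this substitution, which is what makes the relaxed criteria amenable to standard concave-maximization machinery.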

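When the design space is finite and the admissible parameter set is discretized, a maximin design problem of this kind can be written as a linear program in the design weights: maximize the worst case over the parameter grid of a weighted criterion. A minimal sketch, assuming a hypothetical one-parameter exponential-decay model and a squared response difference normalized by (θ−θ0)² as the per-point criterion (both are illustrative stand-ins, not the paper's exact extended criterion):

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical one-parameter regression model eta(x, theta).
def eta(x, theta):
    return np.exp(-theta * x)

X = np.linspace(0.0, 5.0, 6)         # finite design space
theta0 = 1.0                         # nominal parameter value
Theta = np.linspace(0.2, 3.0, 15)
Theta = Theta[np.abs(Theta - theta0) > 1e-9]   # discretized admissible set, theta0 excluded

# A[j, i]: normalized squared response difference at design point x_i for theta_j.
# The design maximizes  min_j  sum_i w_i * A[j, i].
A = np.array([[(eta(x, th) - eta(x, theta0)) ** 2 / (th - theta0) ** 2
               for x in X] for th in Theta])

# LP in (w_1..w_n, t): maximize t  s.t.  A w >= t, sum w = 1, w >= 0.
n = len(X)
c = np.zeros(n + 1)
c[-1] = -1.0                                    # linprog minimizes, so minimize -t
A_ub = np.hstack([-A, np.ones((len(Theta), 1))])  # t - A w <= 0, row by row
b_ub = np.zeros(len(Theta))
A_eq = np.zeros((1, n + 1))
A_eq[0, :n] = 1.0                               # weights sum to one
b_eq = np.array([1.0])
bounds = [(0, None)] * n + [(None, None)]       # w >= 0, t free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
w, t = res.x[:n], res.x[-1]                     # optimal weights and worst-case value
```

The same reformulation applies to any criterion that is linear in the design measure for each fixed θ; only the matrix `A` changes.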


Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  1. Department of Applied Mathematics and Statistics, Faculty of Mathematics, Physics and Informatics, Comenius University, Bratislava, Slovakia
  2. Laboratoire I3S, CNRS/Université de Nice–Sophia Antipolis, Sophia Antipolis, France
