Volume 49, Issue 4, pp 761-775

Choosing a Linear Model with a Random Number of Change-Points and Outliers



The problem of determining a normal linear model with possible perturbations, viz. change-points and outliers, is formulated as a problem of testing multiple hypotheses, and a Bayes invariant optimal multi-decision procedure is provided for detecting at most k (k > 1) such perturbations. Under fairly mild assumptions, the asymptotic form of the procedure is a penalized log-likelihood procedure that depends neither on the loss function nor on the prior distribution of the shifts. The term that penalizes too large a number of changes (or outliers) arises mainly from realistic assumptions about their occurrence; it differs from the corresponding term in Akaike's or Schwarz's criteria, although it is of the same order as the latter. Some concrete numerical examples are analyzed.
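To make the general idea concrete, here is a minimal sketch of penalized log-likelihood selection of the number of change-points in a normal mean model. The Schwarz-type penalty used below is an assumption for illustration only; the paper derives a different penalty term (of the same order), and the data, penalty weights, and search ranges here are all hypothetical.

```python
# Illustrative sketch: choose the number of change-points in a Gaussian
# mean model by minimizing a penalized -2*log-likelihood. The penalty
# form (Schwarz-type) is assumed for demonstration, not the paper's own.
import itertools
import numpy as np

rng = np.random.default_rng(0)
# Simulated series with one true mean shift at t = 60.
y = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(2.0, 1.0, 40)])
n = len(y)

def neg2loglik(y, cuts):
    """-2 * profile Gaussian log-likelihood (up to constants) for a
    segment-wise constant mean, variance profiled out."""
    bounds = [0, *cuts, len(y)]
    rss = sum(np.sum((y[a:b] - y[a:b].mean()) ** 2)
              for a, b in zip(bounds[:-1], bounds[1:]))
    return len(y) * np.log(rss / len(y))

crit = {}
for k in range(3):  # consider at most k = 2 change-points
    # Brute-force search over change-point locations (fine for small n).
    cands = itertools.combinations(range(5, n - 4), k)
    fit = min(neg2loglik(y, c) for c in cands)
    # Schwarz-type penalty: one mean per segment plus k locations.
    crit[k] = fit + (2 * k + 1) * np.log(n)

k_hat = min(crit, key=crit.get)
```

With a pronounced mean shift, the unpenalized fit always improves as k grows, but the log(n)-order penalty makes the criterion favor the true, smaller number of changes.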