Volume 48, Issue 3, pp 577-602

Asymptotically efficient autoregressive model selection for multistep prediction


Abstract

A direct method for multistep prediction of a stationary time series involves fitting, by linear regression, a different autoregression for each lead time, h, and selecting the order to be fitted, $\tilde{k}_h$, from the data. By contrast, a more usual ‘plug-in’ method involves the least-squares fitting of an initial k-th order autoregression, with k itself selected by an order selection criterion. A bound for the mean squared error of prediction of the direct method is derived and employed for defining an asymptotically efficient order selection for h-step prediction, h > 1; the $S_h(k)$ criterion of Shibata (1980) is asymptotically efficient according to this definition. A bound for the mean squared error of prediction of the plug-in method is also derived and used for a comparison of these two alternative methods of multistep prediction. Examples illustrating the results are given.
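The contrast between the two methods described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's procedure: the direct method regresses $x_{t+h}$ on the k most recent values by least squares, one regression per lead time h, while the plug-in method fits a one-step AR(k) and iterates it h steps. The function names and the fixed choice of k are assumptions for the sketch; order selection itself (e.g. by the $S_h(k)$ criterion) is omitted.

```python
import numpy as np

def lagged_matrix(x, k, h):
    """Build regressors (x_t, ..., x_{t-k+1}) and targets x_{t+h}."""
    n = len(x)
    rows = [x[t - k + 1:t + 1][::-1] for t in range(k - 1, n - h)]
    X = np.array(rows)
    y = x[np.arange(k - 1, n - h) + h]
    return X, y

def direct_forecast(x, k, h):
    # Direct method: a separate least-squares autoregression for lead time h.
    X, y = lagged_matrix(x, k, h)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef @ x[-1:-k - 1:-1]  # apply to the last k observations

def plugin_forecast(x, k, h):
    # Plug-in method: fit a one-step AR(k), then iterate the fitted model.
    X, y = lagged_matrix(x, k, 1)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    hist = list(x)
    for _ in range(h):
        hist.append(coef @ np.array(hist[-1:-k - 1:-1]))
    return hist[-1]
```

On a noiseless AR(1) series the two forecasts coincide; the paper's mean-squared-error bounds concern how they differ when the fitted order only approximates the true process.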