# Least Squares Estimation of Scalar Models

## Abstract

In Chapter 2, conditions were found for the existence of stationary solutions to equations of the form (1.1.1). In practice, however, given that a stationary time series {X(t)} satisfies such an equation, it is necessary to estimate the unknown parameters in order to provide predictors of X(t) given past values of the process. Estimation procedures for fixed coefficient autoregressions are well established, and the asymptotic properties of these estimates are well known (see, for example, chapter 6 of Hannan (1970)). Random coefficient autoregressions are, however, non-linear in nature, and any foreseeable maximum likelihood type estimation method would be an iterative procedure. Such a procedure is discussed in Chapter 4, where the asymptotic properties of the estimates obtained are determined. Iteration must, nevertheless, commence at some point, and since the likelihood will be non-linear and its domain will be of relatively high dimension, it is likely that there will be local extrema. Hence it is desirable that iterations commence close to the global maximum of the likelihood function, for otherwise convergence might be toward a local extremum. In this chapter, a least squares estimation procedure is proposed for univariate random coefficient autoregressions which, under certain conditions, is shown to give strongly consistent estimates of the true parameters. The estimates are also shown to obey a central limit theorem.
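The idea behind the least squares approach can be illustrated with a minimal sketch. Assume, purely for illustration, a first-order random coefficient autoregression X(t) = (β + b(t)) X(t−1) + e(t), where b(t) and e(t) are independent zero-mean noise sequences; the specific parameter values and simulation settings below are assumptions, not taken from the chapter. Since E[X(t) | X(t−1)] = β X(t−1), ordinary least squares regression of X(t) on X(t−1) provides a consistent starting estimate of β:

```python
import numpy as np

# Illustrative RCA(1) model: X(t) = (beta + b(t)) * X(t-1) + e(t).
# Parameter values are assumptions chosen so that a stationary
# solution exists (beta^2 + sigma_b^2 < 1).
rng = np.random.default_rng(0)
beta = 0.5      # fixed part of the coefficient (assumed)
sigma_b = 0.2   # std dev of the random coefficient b(t)
sigma_e = 1.0   # std dev of the innovation e(t)
n = 20000

x = np.zeros(n)
for t in range(1, n):
    x[t] = (beta + sigma_b * rng.standard_normal()) * x[t - 1] \
        + sigma_e * rng.standard_normal()

# Least squares: E[X(t) | X(t-1)] = beta * X(t-1), so regressing
# X(t) on X(t-1) gives a strongly consistent estimate of beta.
beta_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
print(beta_hat)
```

Such an estimate can then serve as the starting point for the iterative maximum likelihood procedure of Chapter 4.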

## Keywords

Covariance Matrix, Estimation Procedure, Central Limit Theorem, Asymptotic Property, Ergodic Theorem