A Hilbert Transform Method for Estimating Distributed Lag Models With Randomly Missed or Distorted Observations
Least-squares estimation of the lag coefficients of a distributed lag model is not a straightforward regression problem when the sample has missed or distorted observations. Even though the normal equations can be computed using sums of the available cross products, the asymptotic properties of a solution of these equations are unknown. Since a special computer program is required to compute these normal equations when observations are missed, it is practical to consider another approach to the problem.
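To make the difficulty concrete, here is a minimal sketch (not from the paper; the simulated data, the missingness rate, and all variable names are assumptions) of forming the normal equations from sums of the available cross products when observations are randomly missed, as the abstract describes:

```python
import numpy as np

# Hypothetical illustration: least-squares estimation of distributed lag
# coefficients y_t = sum_k beta_k x_{t-k} + noise, when observations are
# randomly missed. All settings here are assumed for the sketch.
rng = np.random.default_rng(0)
n, q = 500, 3                       # sample size, number of lags (assumed)
beta = np.array([1.0, 0.5, 0.25])   # "true" lag coefficients (assumed)

x = rng.standard_normal(n)
y = np.zeros(n)
for k in range(q):
    y[k:] += beta[k] * x[: n - k]
y += 0.1 * rng.standard_normal(n)

# Assume ~20% of time points are randomly missed (both series unobserved).
observed = rng.random(n) > 0.2

# Lagged design matrix: row i corresponds to t = q-1+i, column k holds x_{t-k}.
X = np.column_stack([x[q - 1 - k : n - k] for k in range(q)])
yy = y[q - 1 :]

# Keep only rows where y_t and every needed lag x_{t-k} are observed, then
# form the normal equations X'X b = X'y from the available cross products.
mask = observed[q - 1 :].copy()
for k in range(q):
    mask &= observed[q - 1 - k : n - k]

XtX = X[mask].T @ X[mask]
Xty = X[mask].T @ yy[mask]
b_hat = np.linalg.solve(XtX, Xty)
```

The solution `b_hat` is computable, but, as the abstract notes, the sums of available cross products depend on the random missingness pattern, so the asymptotic properties of this estimator are not covered by standard least-squares theory.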
Keywords: Consistent Estimator; Cross Spectrum; Good Trial Estimate; Transfer Function; Distorted Observation