Abstract
We consider the problem of estimating the parameters of an autoregressive process from observations corrupted by additive noise. A sequential method is developed for constructing, from a finite sample, a fixed-size confidence region with a prescribed confidence level for the vector of unknown parameters. Formulas are obtained for the duration of the procedure that achieves the required accuracy of the parameter estimates in the case of Gaussian noise. The confidence estimates are built on a special sequential modification of the classical Yule–Walker estimates, which makes it possible to guarantee the confidence level for small and moderate sample sizes. Results of numerical simulation of the proposed estimates are presented and compared with the Yule–Walker estimates on the example of confidence estimation of a spectral density.
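To make the starting point concrete, the sketch below illustrates the classical Yule–Walker estimate for an AR(1) process and the attenuation bias it suffers under additive observation noise, the effect that motivates the modified estimates studied in the paper. All parameter values and names (`theta`, `sigma_eta2`, `yule_walker_ar1`) are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 0.6          # true AR(1) coefficient (hypothetical value)
n = 200_000          # large sample so the bias is clearly visible

# Simulate x_k = theta * x_{k-1} + xi_k and noisy observations y_k = x_k + eta_k.
xi = rng.standard_normal(n)
x = np.empty(n)
x[0] = xi[0]
for k in range(1, n):
    x[k] = theta * x[k - 1] + xi[k]
sigma_eta2 = 0.5     # variance of the additive observation noise
y = x + np.sqrt(sigma_eta2) * rng.standard_normal(n)

def yule_walker_ar1(z):
    """Classical Yule-Walker estimate for AR(1): hat(theta) = c(1)/c(0)."""
    z = z - z.mean()
    c0 = np.dot(z, z) / len(z)
    c1 = np.dot(z[1:], z[:-1]) / len(z)
    return c1 / c0

theta_clean = yule_walker_ar1(x)   # consistent on the clean process
theta_noisy = yule_walker_ar1(y)   # biased toward zero under additive noise
print(theta_clean, theta_noisy)
```

Since the noise inflates the lag-0 covariance but not the lag-1 covariance, the noisy estimate converges to \(\theta\,\mathrm{var}(x)/(\mathrm{var}(x)+\sigma_\eta^2)\) rather than to \(\theta\), which for these illustrative values is about 0.45 instead of 0.6.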
REFERENCES
Ljung, L. and Söderström, T., Theory and Practice of Recursive Identification, Cambridge, MA: MIT Press, 1986.
Anderson, T.W., The Statistical Analysis of Time Series, New York: Wiley, 1971. Translated under the title: Statisticheskii analiz vremennykh ryadov, Moscow: Mir, 1976.
Brockwell, P.J. and Davis, R.A., Time Series: Theory and Methods, New York: Springer Science+Business Media, 1991.
Vasil’ev, V.A., Dobrovidov, A.V., and Koshkin, G.M., Neparametricheskoe otsenivanie funktsionalov ot raspredelenii statsionarnykh posledovatel’nostei (Nonparametric Estimation of Functionals of Distributions of Stationary Sequences), Moscow: Nauka, 2004.
Kashkovskii, D.V. and Konev, V.V., Successive identification of the random-parameter linear dynamic systems, Autom. Remote Control, 2008, vol. 69, no. 8, pp. 1344–1356.
Konev, V.V. and Pergamenshchikov, S.M., Robust model selection for a semimartingale continuous time regression from discrete data, Stochastic Process. Their Appl., 2015, vol. 125, no. 1, pp. 294–326.
Emel’yanova, T.V. and Konev, V.V., On sequential estimation of the parameters of continuous-time trigonometric regression, Autom. Remote Control, 2016, vol. 77, no. 6, pp. 992–1008.
Seber, G.A.F., Linear Regression Analysis, New York: John Wiley and Sons, 1977. Translated under the title: Lineinyi regressionnyi analiz, Moscow: Mir, 1980.
Novikov, A.A., Sequential estimation of parameters of diffusion processes, Teoriya Veroyatn. Ee Primen., 1971, vol. 16, no. 2, pp. 394–396.
Liptser, R.Sh. and Shiryaev, A.N., Statistika sluchainykh protsessov (Statistics of Random Processes), Moscow: Nauka, 1974.
Lai, T.L. and Siegmund, D., Fixed accuracy estimation of an autoregressive parameter, Ann. Stat., 1983, vol. 11, pp. 478–485.
Galtchouk, L. and Konev, V., On asymptotic normality of sequential LS-estimate for unstable autoregressive process AR(2), J. Multivariate Anal., 2010, vol. 101, no. 10, pp. 2616–2636.
Borisov, V.Z. and Konev, V.V., On sequential parameter estimation in discrete-time processes, Autom. Remote Control, 1977, vol. 38, no. 10, pp. 1475–1480.
Vorobeichikov, S.E. and Konev, V.V., On sequential identification of stochastic systems, Izv. Akad. Nauk SSSR. Tekhn. Kibern., 1980, no. 4, pp. 91–98.
Konev, V.V. and Pergamenshchikov, S.M., Sequential plans of parameter identification in dynamic systems, Autom. Remote Control, 1981, vol. 42, no. 7 (Part 1), pp. 917–924.
Vasil’ev, V.A. and Konev, V.V., Sequential estimation of parameters of dynamic systems under incomplete observation, Izv. Akad. Nauk SSSR. Tekhn. Kibern., 1982, no. 6, pp. 145–154.
Xia, Y. and Zheng, W.X., Novel parameter estimation of autoregressive signals in the presence of noise, Automatica, 2015, vol. 62, pp. 98–105.
Kulikova, M.V., Maximum likelihood estimation of linear stochastic systems in the class of sequential square-root orthogonal filtering methods, Autom. Remote Control, 2011, vol. 72, no. 4, pp. 766–786.
Diversi, R., Guidorzi, R., and Soverini, U., Identification of autoregressive models in the presence of additive noise, Int. J. Adapt. Control Signal Process., 2008, vol. 22, no. 5, pp. 465–481.
Labarre, D., Grivel, E., Berthoumieu, Y., Todini, E., and Najim, M., Consistent estimation of autoregressive parameters from noisy observations based on two interacting Kalman filters, Signal Process., 2006, vol. 86, no. 10, pp. 2863–2876.
Zheng, W.X., Fast identification of autoregressive signals from noisy observations, IEEE Trans. Circuits Syst. II: Express Briefs, 2005, vol. 52, no. 1, pp. 43–48.
Pagano, M., Estimation of models of autoregressive signal plus white noise, Ann. Stat., 1974, vol. 2, no. 1, pp. 99–108.
Konev, V.V., On one property of martingales with conditionally Gaussian increments and its application in the theory of nonasymptotic inference, Dokl. Math., 2016, vol. 471, no. 5, pp. 523–527.
Konev, V. and Nazarenko, B., Sequential fixed accuracy estimation for nonstationary autoregressive processes, Ann. Inst. Stat. Math., 2020, vol. 72, no. 1, pp. 235–264.
Vorobeichikov, S.E. and Konev, V.V., On sequential confidence estimation of parameters of stochastic dynamical systems with conditionally Gaussian noises, Autom. Remote Control, 2017, vol. 78, no. 10, pp. 1803–1818.
Kurzhanskii, A.B. and Furasov, V.D., Identification of bilinear systems. Guaranteed pseudoellipsoidal estimates, Autom. Remote Control, 2000, vol. 61, no. 1, pp. 38–49.
Konev, V.V. and Pergamenshchikov, S.M., General model selection estimation of a periodic regression with a Gaussian noise, Ann. Inst. Stat. Math., 2010, vol. 62, no. 6, pp. 1083–1111.
Shiryaev, A.N., Statisticheskii posledovatel’nyi analiz (Statistical Sequential Analysis), Moscow: Nauka, 1976.
Tartakovsky, A., Nikiforov, I., and Basseville, M., Sequential Analysis: Hypothesis Testing and Changepoint Detection, Chapman & Hall/CRC Press, 2015.
ACKNOWLEDGMENTS
The authors are grateful to anonymous referees for constructive comments.
Funding
This work was supported by the Russian Science Foundation, project no. 17-11-01049.
Translated by V. Potapchouck
APPENDIX
Let us present Theorem A.1 and Theorem A.2 on the properties of stopped martingales with conditionally Gaussian increments from the papers [23] and [24], which were used in Secs. 2 and 3 when selecting the weight coefficients (2.15) and (3.9), (3.10) in the sequential Yule–Walker estimates (2.14) and (3.11) as well as when determining the duration of the procedure. We will also prove some technical results.
Theorem A.1.
Let \(\left (M_k, \mathcal {F}_k \right )_{k\geqslant 0}\) be a square integrable martingale [23] such that
(a) Its quadratic characteristic satisfies the condition
$$ \mathsf {P}_\theta \left (\langle M\rangle _\infty =+\infty \right )=1. $$
(b) \(\mathrm {Law}(\varDelta M_k|\mathcal {F}_{k-1})=\mathcal {N}(0,\sigma ^2_{k-1})\), \(k=1, 2, \dots \); i.e., the \(\mathcal {F}_{k-1}\)-conditional distribution of \(\varDelta M_k=M_k-M_{k-1}\) is Gaussian with parameters \(0\) and \(\sigma ^2_{k-1}=\mathbf {E}\left ((\varDelta M_k)^2|\mathcal {F}_{k-1} \right )\).
For each \(h\), we define the stopping time

Then for each \(h>0\) the variable \(m(h)\) is standard Gaussian.
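The displayed definition of the stopping time did not survive extraction above. For orientation, we sketch the standard construction used in results of this type (cf. [23]); the notation \(\nu(h)\), \(\beta(h)\) is ours, and the paper's exact display may differ:

```latex
% Hedged sketch of the standard construction; notation nu(h), beta(h) is assumed.
\[
  m(h)=\frac{1}{\sqrt{h}}\Bigl(M_{\nu(h)-1}+\beta(h)\,\Delta M_{\nu(h)}\Bigr),
  \qquad
  \nu(h)=\inf\Bigl\{k\geqslant 1:\ \sum_{j=1}^{k}\sigma^2_{j-1}\geqslant h\Bigr\},
\]
\[
  \text{with the correction factor } \beta(h)\in(0,1] \text{ chosen so that }
  \sum_{j=1}^{\nu(h)-1}\sigma^2_{j-1}+\beta^2(h)\,\sigma^2_{\nu(h)-1}=h .
\]
```

The point of the correction factor is that the total conditional variance accumulated by \(m(h)\) equals exactly \(h\), which is what makes \(m(h)\) standard Gaussian for every \(h>0\), not merely asymptotically.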
Proof of Lemma 1. Let us verify that \(\left (\zeta _1(n),\mathcal {F}_n^{(1)}\right )_{n\geqslant 3} \) is a martingale. The measurability of \(\zeta _1(n) \) with respect to \(\mathcal {F}_n^{(1)} \) follows from definitions (2.10) and (2.11). Let us show that \( \mathbf {E}\left (\zeta _1(n+1)|\mathcal {F}_n^{(1)}\right )=\zeta _1(n) \). Consider, for example, the case of even \(n \). Let \(n=2l \); then
The martingale property of \(\zeta _2(n)\) can be verified in a similar way.
This completes the proof of Lemma 1. \(\quad \blacksquare \)
Proof of Lemma 2. Let us show that the desired result follows from Theorem A.1. We introduce the random processes
The processes \(\left (M_n^{(1)}, \mathcal {F}_n^{(1)} \right )_{n\geqslant 3}\) and \(\left (M_n^{(2)}, \mathcal {F}_n^{(2)} \right )_{n\geqslant 3}\) are martingales with
Let us verify these properties for \(M_n^{(1)} \). By the definition of \(T_1(n) \) and \(T_2(n) \) in (2.9), we obtain
Note that if \(n\in T_1(n)\), then \(\mathcal {F}_{n-1}^{(1)}=\mathcal {F}_{n-2}^{(1)}\). If, for example, \(n \) is odd, then the number \(n-1 \) is even. Consequently, \(m_1(n)=n \) and \(m_1(n-1)=m_1(n-2) \). By definition, \(\mathcal {F}_{n-1}^{(1)}=\mathcal {F}_{n-2}^{(1)}\). Since \({\xi }_n^2\) is independent of \(\mathcal {F}_{n-2}^{(1)} \), we have
Proof of Proposition 2. The model (2.1), (2.2) is written in vector form as
To analyze \(\tau (h)\), we need the asymptotic behavior of the sum
Considering (3.1), we obtain
The sequence \(\{Z_j\}\) satisfies the vector autoregression equation
Since the process \(Z_j\) is stable, one has (see, e.g., [2])
From (A.6) and (A.7), in view of (3.1), we find that
Further, we directly verify that
Now let us find the asymptotics of the stopping times \(\tau _l^{(i)}(h) \). By the definition of \(\tau _1^{(i)}(h) \) in (3.7), we have
In a similar way, we find that
Taking into account (3.8), we obtain relation (3.15) for the duration of the sequential procedure. Now let us consider the asymptotic behavior of the matrix \(G(h) \) defined in (3.13). For the stable process (2.1), one has the property [2]
Further, substituting the coefficients (3.10) into the entry of the matrix (3.13) and taking into account the fact that \(y_{k-p-l}=\langle Y_{k-p-1}\rangle _l\) and \(y_{k-s}=\langle Y_{k-1}\rangle _s\), we obtain
Taking into account the definition of the coefficients (3.9), note that the inner sum coincides, except for one term corresponding to the time \(\tau _l^{(i)}(h)\), with the sum
The proof of Proposition 2 is complete. \(\quad \blacksquare \)
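The stability properties invoked in this proof via [2] are instances of the standard ergodic limit for a stable vector autoregression. The display below is a generic sketch of that limit; the notation \(A\) (companion matrix), \(B B^{\mathrm T}\) (innovation covariance), and \(F\) is assumed for illustration, not taken from the paper:

```latex
% Generic ergodic limit for a stable VAR; notation A, B, F is assumed.
\[
  \frac{1}{n}\sum_{j=1}^{n} Z_j Z_j^{\mathrm T}
  \;\xrightarrow[n\to\infty]{\mathrm{a.s.}}\; F,
  \qquad\text{where } F = A F A^{\mathrm T} + B B^{\mathrm T}.
\]
```

Stability (all eigenvalues of \(A\) strictly inside the unit circle) guarantees that this discrete Lyapunov equation has a unique solution, which is what makes the normalized sums in the proof converge to deterministic limits and yields the asymptotics of the stopping times.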
Theorem A.2.
Suppose that we are given [24]:

1. A probability space \((\Omega , \mathcal {F},\mathsf {P})\) with a filtration \( (\mathcal {F}_k)_{k\geqslant 0}\).
2. A family \((M_k^{(l)}, \mathcal {F}_k )_{k\geqslant 0}\), \( l=1,\ldots ,p\), of square integrable martingales with quadratic characteristics \(\{\langle M^{(l)} \rangle _{n} \}_{n\geqslant 1}\), \( l=1,\ldots ,p\), such that
(a) \(\mathsf {P}(\langle M^{(l)}\rangle _\infty =+\infty )=1 \), \( l=1,\ldots ,p\);
(b) \(\mathrm {Law}(\varDelta M_k^{(l)}|\mathcal {F}_{k-1})=\mathcal {N}(0,\sigma _l^2(k-1)) \), \( k=1, 2, \dots \), \( l=1,\ldots ,p\); i.e., the \( \mathcal {F}_{k-1}\)-conditional distribution of the increment \(\varDelta M_k^{(l)}=M_k^{(l)}-M_{k-1}^{(l)}\) is Gaussian with parameters \(0 \) and \( \sigma _l^2(k-1)=\mathbf {E}\big ((\varDelta M_k^{(l)} )^2|\mathcal {F}_{k-1}\big )\).

For each \(h>0\), we define the stopping time
Proof of Lemma 3. Write (3.4) in the form
Based on this, we have
Proof of Theorem 2. Substituting \(y_k \) from (2.1) into (3.12) and considering (3.13), we obtain
In vector form, system (A.15) is as follows:
Substituting the weight coefficients (3.10) into (A.16), we obtain
Having introduced the vectors \(\zeta ^{(i)}(h)=\left (\zeta _1^{(i)}(h), \dots ,\zeta _p^{(i)}(h) \right )^{\mathrm {T}}\), \(i=1,\ldots ,p+1\), we have the expansion
The proof of Theorem 2 is complete. \(\quad \blacksquare \)
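As a numerical companion to the results above, the following Monte-Carlo sketch checks the fixed-accuracy idea in the simplest setting: a scalar AR(1) observed without noise, estimated by sequential least squares stopped when the accumulated observed information reaches a threshold \(h\) (this illustrates the mechanism of [15, 17] rather than the paper's noisy-observation procedure). All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.5      # true AR(1) coefficient (hypothetical)
h = 400.0        # information threshold of the stopping rule
n_rep = 2000     # number of Monte-Carlo replications

devs = []
for _ in range(n_rep):
    x_prev, num, den = 0.0, 0.0, 0.0
    # Observe x_k = theta * x_{k-1} + xi_k until sum of x_{k-1}^2 reaches h.
    while den < h:
        xi = rng.standard_normal()
        x = theta * x_prev + xi
        num += x_prev * x
        den += x_prev * x_prev
        x_prev = x
    # Normalized deviation of the sequential LS estimate num/den.
    devs.append(np.sqrt(den) * (num / den - theta))

devs = np.asarray(devs)
print(devs.mean(), devs.std())
```

For a threshold this large the overshoot past \(h\) is negligible, so the normalized deviations should be close to standard Gaussian, in line with the fixed-accuracy guarantee: the achieved precision is controlled by \(h\), not by the (random) sample size.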
Konev, V.V., Pupkov, A.V. Confidence Estimation of Autoregressive Parameters Based on Noisy Data. Autom Remote Control 82, 1030–1048 (2021). https://doi.org/10.1134/S0005117921060059