Abstract
Integer-valued trawl processes are a class of serially correlated, stationary and infinitely divisible processes that Ole E. Barndorff-Nielsen has worked on in recent years. In this chapter, we provide the first analysis of likelihood inference for trawl processes, focusing on the so-called exponential-trawl process, which is also a continuous-time hidden Markov process with countable state space. The core ideas include prediction decomposition, filtering and smoothing, complete-data analysis and the EM algorithm. These extend readily to more general trawl processes, although at increasing computational cost.
Appendix: Proofs and Derivations
6.1 Heuristic Proof of Theorem 1
Our heuristic derivation starts from the following prediction decomposition of the Radon-Nikodym derivative:
where the integral over \(t\in (0,T]\) means a continuous sum of the integrand random variables. Thus,
where the first equality follows because \(X_{t-}\) is \(\mathscr {F} _{t-}^{X}\)-measurable; the third equality follows from (5). Therefore, (20) can be rewritten as
where the second equality follows from \(\log \left( 1-x\right) \approx -x\) for small x and \(\{ t\in \left( 0,T\right] :\varDelta X_{t}\ne 0\} \) has Lebesgue measure 0.
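The role of the \(\log \left( 1-x\right) \approx -x\) step can be seen in the textbook case of a counting process \(N_{t}\) with stochastic intensity \(\lambda _{t-}\) (a generic illustration, not the chapter's specific likelihood): over a fine partition, each interval without a jump contributes a factor \(1-\lambda _{t-}\,\mathrm {d}t\), so

```latex
\log \prod_{t:\,\Delta N_{t}=0}\left( 1-\lambda _{t-}\,\mathrm{d}t\right)
  \approx -\int_{0}^{T}\lambda _{t-}\,\mathrm{d}t ,
\qquad \text{hence} \qquad
\log L_{T}=\int_{0}^{T}\log \lambda _{t-}\,\mathrm{d}N_{t}
  -\int_{0}^{T}\lambda _{t-}\,\mathrm{d}t .
```

Exactly as above, excluding the measure-zero set of jump times from the time integral costs nothing.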
6.2 Heuristic Proof of Theorem 2
6.2.1 Update by Inactivity
We want to update \(p_{\tau ,\tau }\left( \mathbf {j}\right) \) by incorporating the information \(\mathscr {F}_{\left( \tau ,t\right) }\triangleq \sigma \left( \left\{ \varDelta Y_{s}=0, \tau <s<t\right\} \right) \) using Bayes’ Theorem:
where the first equality holds because there is no activity of \(Y_{s}\) for \( s\in \left( \tau ,t\right) \) and hence the hidden state \(\mathbf {C}\) must stay the same.
Using the prediction decomposition, we have
where the second equality intuitively holds because we know the instantaneous departure probability of a size y event at time s is \(\phi C_{s-}^{\left( y\right) }\mathrm {d}s\) but \(C_{s-}^{\left( y\right) }=C_{\tau }^{\left( y\right) }=j_{y}\) under \(\mathscr {F}_{\left( \tau ,s\right) }\); the third equality follows from \(\log \left( 1-x\right) \approx -x\) for small x. Therefore,
where we drop the term \(\exp \left( -\sum _{y\in \mathbb {Z} \backslash \left\{ 0\right\} }\nu \left( y\right) \left( t-\tau \right) \right) \) because it does not depend on \(\mathbf {j}\). Normalizing the equation above leads to the desired result.
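Since the original display is not reproduced here, the resulting inactivity update can be sketched from the surrounding prose: each of the \(j_{y}\) live size-\(y\) events survives \(\left( \tau ,t\right) \) with departure rate \(\phi \), and the \(\mathbf {j}\)-free arrival factor cancels on normalization, suggesting

```latex
p_{\tau ,t}\left( \mathbf{j}\right) \;\propto\;
p_{\tau ,\tau }\left( \mathbf{j}\right)
\exp \Big( -\phi \Big( \textstyle\sum_{y\in \mathbb{Z}\backslash \left\{ 0\right\} } j_{y}\Big) \left( t-\tau \right) \Big) ,
```

with the proportionality constant fixed by summing over \(\mathbf {j}\).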
6.2.2 Update by Jump
We want to update \(p_{\tau -,\tau -}\left( \mathbf {j}\right) \) by incorporating the information \(\varDelta Y_{\tau }=y\). First note that
which corresponds to the arrival of a new size y event and the departure of an old size \(-y\) event.
For the first term,
where the fourth equality follows from (3) (using \(\mathscr {C}_{\tau -}\supseteq \mathscr {F}_{\tau -}\)) and (5).
Using similar arguments, the second term is
Combining all of these gives us the required result.
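In the same spirit, the two terms suggest a filtering update of the following form at the jump time (a hedged reconstruction from the prose, not a quote of the omitted display; \(\mathbf {e}_{y}\) denotes the unit vector in coordinate \(y\), a notation introduced here):

```latex
p_{\tau ,\tau }\left( \mathbf{j}\right) \;\propto\;
\nu \left( y\right) \, p_{\tau -,\tau -}\left( \mathbf{j}-\mathbf{e}_{y}\right)
\;+\;
\phi \left( j_{-y}+1\right) \, p_{\tau -,\tau -}\left( \mathbf{j}+\mathbf{e}_{-y}\right) ,
```

the first term matching the arrival of a new size-\(y\) event and the second the departure of one of the \(j_{-y}+1\) old size-\(-y\) events.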
6.3 Heuristic Proof of Theorem 3
The case of updating the smoothing distribution \(p_{\tau -,T}\left( \mathbf {j} \right) \) under inactivity is trivial: the inactivity over the time period \([t,\tau )\) forces the hidden configuration \(\mathbf {C}\) to stay unchanged.
6.3.1 Update by Jump
We now consider the case of (backward) updating the smoothing distribution \( p_{\tau ,T}\left( \mathbf {j}\right) \) due to the jump \(\varDelta Y_{\tau }=y\). Then
Note that
where the first equality holds by the Markov property of \(\mathbf {C}_{t}\) (a heuristic derivation is given below); the second and third equalities follow from Bayes' Theorem. Since
combining all of these gives us the required result.
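The structure here mirrors the standard backward (Rauch-Tung-Striebel) smoothing identity for hidden Markov models, shown in generic notation as a point of reference rather than as the chapter's own display:

```latex
P\left( x_{t}\mid y_{1:T}\right)
= P\left( x_{t}\mid y_{1:t}\right)
\sum_{x_{t+1}}
\frac{P\left( x_{t+1}\mid x_{t}\right) \, P\left( x_{t+1}\mid y_{1:T}\right) }
     {P\left( x_{t+1}\mid y_{1:t}\right) } ,
```

which rests on exactly the conditional independence of the past from future observations given the present, the property derived for \(\mathbf {C}_{t}\) in Sect. 6.3.2.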
6.3.2 Derivation of (21)
Let \(\mathscr {F}_{(\tau ,T]}\triangleq \sigma \left( \left\{ Y_{t}\right\} _{\tau <t\le T}\right) \) and \(\mathscr {C}_{(\tau ,T]}\triangleq \sigma \left( \left\{ \mathbf {C}_{t}\right\} _{\tau <t\le T}\right) \). Note that, heuristically, Bayes' Theorem implies
Since \(\mathscr {F}_{(\tau ,T]}\subseteq \mathscr {C}_{(\tau ,T]}\) (each \( Y_{t}=\sum _{y\in \mathbb {Z} \backslash \left\{ 0\right\} }C_{t}^{\left( y\right) }\)), the Markov property of \(\mathbf {C}_{t}\) implies
because, given the current information \(\mathbf {C}_{\tau }\), the information in the past \(\mathbf {C}_{\tau -}\) is irrelevant. This then proves (21).
6.4 Proof of Theorem 4
Since the processes \(C_{t}^{\left( y\right) }\) are independent across \(y\), the complete-data log-likelihood can be written as
where we recall that \(\mathscr {C}_{t}^{\left( y\right) }\) is the natural filtration generated by \(C_{t}^{\left( y\right) }\),
where the first equality follows directly from Theorem 1 (ignoring the constant), and
because \(C_{0}^{\left( y\right) }\sim \mathrm {Poisson}\left( \nu \left( y\right) /\phi \right) \). Thus, collecting terms gives the required result (16). The derivations of the MCLE are elementary.
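As a quick numerical sanity check (not part of the chapter), for a single jump size the arrival/departure dynamics behave like an M/M/\(\infty \) queue: events arrive at rate \(\nu \) and each live event departs at rate \(\phi \). Assuming the MCLE takes the usual occupation-time form \(\hat{\nu }=N_{T}^{\mathrm {A}}/T\) and \(\hat{\phi }=N_{T}^{\mathrm {D}}/\int _{0}^{T}D_{t-}\,\mathrm {d}t\) (a standard form for Markov birth-death likelihoods, stated here as an assumption), a short simulation recovers the true rates:

```python
import random

def simulate_mm_inf(nu, phi, T, seed=0):
    """Simulate an M/M/infinity queue started empty: arrivals at rate nu,
    each live event departing independently at rate phi.  Returns the
    number of arrivals, number of departures, and int_0^T D_{t-} dt."""
    rng = random.Random(seed)
    t, d = 0.0, 0                 # current time and queue length D_t
    n_arr = n_dep = 0
    occupation = 0.0              # running integral of D_{t-}
    while True:
        rate = nu + phi * d       # total jump rate of the embedded chain
        dt = rng.expovariate(rate)
        if t + dt >= T:
            occupation += d * (T - t)
            return n_arr, n_dep, occupation
        occupation += d * dt
        t += dt
        if rng.random() < nu / rate:
            d += 1                # arrival of a new event
            n_arr += 1
        else:
            d -= 1                # departure of one of the d live events
            n_dep += 1

nu, phi, T = 2.0, 1.5, 5000.0
n_arr, n_dep, occ = simulate_mm_inf(nu, phi, T, seed=42)
nu_hat = n_arr / T                # arrival-rate estimator
phi_hat = n_dep / occ             # occupation-time estimator of phi
print(round(nu_hat, 2), round(phi_hat, 2))
```

With \(T\) this large, both estimators should land close to the true values, illustrating the consistency argument that follows.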
Let
The ergodicity of \(D_{t-}\) implies that as \(T\rightarrow \infty \)
Since \(\dfrac{N_{T}^{\mathrm {D}}}{T}\approx \dfrac{N_{T}^{\mathrm {A}}}{T} \rightarrow \left\| \nu \right\| \), we have
Thus,
Finally, for any \(y\in \mathbb {Z} \backslash \left\{ 0\right\} \), \(\dfrac{N_{T}^{\left( y\right) }}{T} \rightarrow \nu \left( y\right) \) and \(\hat{\phi }_{\mathrm {MCLE} }^{-1}\rightarrow \phi ^{-1}<\infty \), so we easily have
6.5 Proof of Proposition 1
As \(C_{t}^{(y)}\ge 0\), (19) implies that
where we set \(N_{0}^{\left( y\right) }\triangleq 0\) conventionally. Now
so we have
Let \(N_{t}^{\left( -y\right) ,*}\) be the counting process of the \(-y\) jumps resulting from the departures of the initial size-\(y\) events that constitute \(C_{0}^{\left( y\right) }\). Let \(\tau \) be the time at which \( N^{\left( -y\right) ,*}\) reaches \(C_{0}^{\left( y\right) }\). Then we have
Observe that \(N_{t}^{\left( y\right) }-\left( N_{t}^{\left( -y\right) }-N_{t}^{\left( -y\right) ,*}\right) \) is an M/G/\(\infty \) queue initiated at state 0, so by ergodicity we must have, with probability 1,
This then shows that actually
where the last equality follows because \(C_{0,\tau }^{\left( y\right) , \mathrm {L}}\le C_{0}^{\left( y\right) }\). Correspondingly,
© 2016 Springer International Publishing Switzerland
Shephard, N., Yang, J.J. (2016). Likelihood Inference for Exponential-Trawl Processes. In: Podolskij, M., Stelzer, R., Thorbjørnsen, S., Veraart, A. (eds) The Fascination of Probability, Statistics and their Applications. Springer, Cham. https://doi.org/10.1007/978-3-319-25826-3_12
Print ISBN: 978-3-319-25824-9
Online ISBN: 978-3-319-25826-3