
Robustness of Time Delay Embedding to Sampling Interval Misspecification

  • Steven M. Boker
  • Stacey S. Tiberio
  • Robert G. Moulder
Chapter

Abstract

Time delay embedding is a method often used when estimating continuous-time differential equation parameters from univariate and multivariate time series, and it assumes equal time intervals between samples. In much real-world social science data, however, the intervals between samples vary widely. This chapter simulates several common types of time interval misspecification and compares three methods commonly used to account for unequal intervals against one another and against the case in which no correction is made. Surprisingly, making no correction performed almost as well as the best method: in most of the simulated conditions, there was no significant difference between no correction and a sophisticated full information maximum likelihood method in which filter loadings were tailored to the actual intervals between samples for each row of data. Time delay embedding thus appears to be relatively robust to sampling interval misspecification. Reasons for this robustness are presented and discussed, along with caveats regarding cases in which timing misspecification may still bias results from time delay embedding.
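As context for the two ingredients the abstract refers to, the sketch below shows (a) how a time delay embedding matrix is built from a univariate series and (b) how a latent differential equation loading matrix can be tailored to the actual elapsed times between samples rather than assuming equal intervals. This is a minimal illustrative sketch in R, not the authors' code; the function names (`time_delay_embed`, `lde_loadings`), the embedding dimension, and the example times are assumptions chosen for illustration.

```r
## Illustrative sketch only (hypothetical names, not the authors' code).

## (a) Time delay embedding: each row of the returned matrix holds
## `dim` successive observations of x, offset by `tau` samples.
time_delay_embed <- function(x, dim = 5, tau = 1) {
  n <- length(x) - (dim - 1) * tau
  stopifnot(n > 0)
  sapply(0:(dim - 1), function(j) x[(1 + j * tau):(n + j * tau)])
}

x <- sin(2 * pi * (1:100) / 12) + rnorm(100, sd = 0.1)
W <- time_delay_embed(x, dim = 5)
dim(W)  # 96 rows, 5 columns

## (b) Fixed loadings for level, first derivative, and second derivative
## in a second-order latent differential equation model, centered on the
## mean of the supplied sample times (one row per embedded column).
lde_loadings <- function(times) {
  tc <- times - mean(times)
  cbind(level = 1, dx = tc, d2x = tc^2 / 2)
}

## Equal-interval case: columns assumed to be 1 time unit apart.
lde_loadings(times = 0:4)

## Tailored case: the same five occasions, but using the actual
## (unequal) elapsed times at which the observations were collected.
lde_loadings(times = c(0, 0.8, 2.1, 3.0, 4.3))
```

In the tailored case, a separate loading matrix of this form would be supplied for each row of the embedded data, which is the idea behind the full information maximum likelihood correction compared in the chapter.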

Notes

Acknowledgements

Funding for this work was provided in part by NIH Grant 1R21DA024304-01. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Institutes of Health.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Steven M. Boker¹
  • Stacey S. Tiberio²
  • Robert G. Moulder¹

  1. Department of Psychology, The University of Virginia, Charlottesville, USA
  2. Oregon Social Learning Center, Eugene, USA
