Mining Latent Sources of Causal Time Series Using Nonlinear State Space Modeling

  • Conference paper
Intelligent Information and Database Systems (ACIIDS 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 6591)

Abstract

Data mining refers to the use of new methods for the intelligent analysis of large data sets. This paper applies a nonlinear state space modeling (NSSM) technique, nonlinear dynamical factor analysis (NDFA), to mine the latent factors that are the original sources of the observations in a causal time series. The motivation for mining these indirect sources rather than the observations themselves is that better results can be obtained from the latent sources; economic data, for example, are driven by explanatory variables such as inflation, unobserved trends, and fluctuations. The effectiveness of NDFA is first evaluated on a simulated time series data set. Our empirical study indicates that NDFA performs better than independent component analysis (ICA) in exploring the latent sources of the Taiwan unemployment rate time series.
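
For readers unfamiliar with NDFA, the underlying model class can be summarized as follows. This is the generic nonlinear state space formulation used in the NDFA literature (the notation below is illustrative rather than the authors' own): observations are generated from latent sources through an unknown nonlinear mapping, the sources follow unknown nonlinear dynamics, and both mappings are typically parameterized by multilayer perceptrons and learned by variational Bayesian (ensemble) learning.

\begin{align}
  \mathbf{x}(t) &= f\bigl(\mathbf{s}(t)\bigr) + \mathbf{n}(t)     && \text{(observation mapping)} \\
  \mathbf{s}(t) &= g\bigl(\mathbf{s}(t-1)\bigr) + \mathbf{m}(t)   && \text{(source dynamics)}
\end{align}

Here $\mathbf{x}(t)$ denotes the observed time series, $\mathbf{s}(t)$ the latent sources to be mined, and $\mathbf{n}(t)$, $\mathbf{m}(t)$ the observation and process noise terms.

The abstract does not specify how the ICA baseline was applied. Below is a minimal sketch of one common approach for a single observed series: delay-embed the series into a multivariate matrix and unmix it with FastICA. The embedding dimension, lag, number of components, and the toy series are illustrative assumptions, not values taken from the paper.

import numpy as np
from sklearn.decomposition import FastICA

def delay_embed(x, embed_dim=3, lag=1):
    """Stack lagged copies of x into an (N - (embed_dim - 1) * lag, embed_dim) matrix."""
    n = len(x) - (embed_dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(embed_dim)])

# Toy stand-in for an observed series (e.g. a monthly rate with trend, seasonality and noise).
rng = np.random.default_rng(0)
t = np.arange(240)
series = 0.05 * t + np.sin(2 * np.pi * t / 12) + 0.1 * rng.standard_normal(t.size)

X = delay_embed(series, embed_dim=3, lag=1)                          # multivariate view of the scalar series
sources = FastICA(n_components=2, random_state=0).fit_transform(X)   # estimated latent sources
print(sources.shape)                                                 # (238, 2)

NDFA differs from such a baseline in that it models the temporal dynamics of the sources explicitly (the mapping g above), whereas plain ICA treats the embedded rows as unordered samples.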

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, WS., Yu, FJ. (2011). Mining Latent Sources of Causal Time Series Using Nonlinear State Space Modeling. In: Nguyen, N.T., Kim, CG., Janiak, A. (eds) Intelligent Information and Database Systems. ACIIDS 2011. Lecture Notes in Computer Science (LNAI), vol 6591. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-20039-7_14

  • DOI: https://doi.org/10.1007/978-3-642-20039-7_14

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-20038-0

  • Online ISBN: 978-3-642-20039-7

  • eBook Packages: Computer Science, Computer Science (R0)
