Abstract
Many nonlinear or chaotic time series exhibit an inherently broad spectrum, which makes noise reduction difficult. Locally projective noise reduction using proper orthogonal decomposition (POD) is among the most effective tools; it works for both map-like and continuously sampled time series. However, it considers only the geometrical or topological properties of the data and ignores the temporal characteristics of the time series. Here we present a new noise reduction method using smooth orthogonal decomposition (SOD) of bundles of locally reconstructed trajectory strands, which imposes temporal smoothness on the filtered time series. SOD-based noise reduction is shown to significantly outperform the POD-based method for continuously sampled noisy time series.
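As a minimal sketch of the underlying idea (illustrative code, not from the paper): SOD of a delay-embedded series can be computed as a generalized eigenvalue problem between the covariance of the data and the covariance of its finite-difference derivative, so that the leading coordinate is the smoothest linear projection. The embedding parameters and function names below are assumptions for the example.

```python
import numpy as np
from scipy.linalg import eigh

def delay_embed(x, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

def sod(Y):
    """Smooth orthogonal decomposition: generalized eigenproblem between the
    covariance of Y and the covariance of its finite-difference derivative."""
    Yc = Y - Y.mean(axis=0)
    dY = np.diff(Yc, axis=0)              # forward-difference "velocity"
    sig = Yc.T @ Yc / Yc.shape[0]         # data covariance
    sig_d = dY.T @ dY / dY.shape[0]       # derivative covariance
    lam, V = eigh(sig, sig_d)             # generalized eigenvalue problem
    order = np.argsort(lam)[::-1]         # large lambda = smooth coordinate
    return lam[order], V[:, order]

# Noisy harmonic signal: the leading SOD coordinate is the smoothest projection.
t = np.linspace(0, 20 * np.pi, 4000)
x = np.sin(t) + 0.1 * np.random.default_rng(0).standard_normal(t.size)
Y = delay_embed(x, dim=10, tau=5)
lam, V = sod(Y)
smooth_coord = Y @ V[:, 0]   # projection onto the smoothest direction
```

By construction the leading SOD direction minimizes the derivative-to-signal variance ratio, so `smooth_coord` is smoother than any single embedded coordinate of the noisy series.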
Notes
- 1. You may want to use SOD of individual strands to get several samples of the smooth subspace basis, and then apply SOD to the matrix collecting all the bases to find the dimensionality of the global basis.
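The two-stage procedure in the note can be sketched as follows (illustrative, not code from the paper; since the stacked basis vectors carry no time ordering for SOD's derivative, an SVD of the stacked bases stands in for the second decomposition pass, and the per-strand bases here are synthetic noisy samples of one global subspace):

```python
import numpy as np

rng = np.random.default_rng(1)
d, k, n_strands = 8, 2, 20   # ambient dim, per-strand subspace dim, strand count

# Hypothetical per-strand smooth bases: noisy samples of one global 2-D subspace.
G = np.linalg.qr(rng.standard_normal((d, k)))[0]       # "true" global basis
stack = []
for _ in range(n_strands):
    B = np.linalg.qr(G + 0.05 * rng.standard_normal((d, k)))[0]
    stack.append(B.T)                                   # basis vectors as rows
stack = np.vstack(stack)                                # (n_strands * k, d)

# Decompose the stacked bases; the singular-value gap reveals how many
# directions are shared across strands, i.e. the global subspace dimension.
s = np.linalg.svd(stack, compute_uv=False)
global_dim = int(np.sum(s > 0.1 * s[0]))
```

With these synthetic bases the spectrum shows two dominant singular values, recovering the planted two-dimensional global subspace.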
Acknowledgements
This paper is based upon work supported by the National Science Foundation under Grant No. 1100031.
Copyright information
© 2013 The Society for Experimental Mechanics, Inc.
Cite this paper
Chelidze, D. (2013). Smooth Projective Noise Reduction for Nonlinear Time Series. In: Kerschen, G., Adams, D., Carrella, A. (eds) Topics in Nonlinear Dynamics, Volume 1. Conference Proceedings of the Society for Experimental Mechanics Series, vol 35. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-6570-6_6
DOI: https://doi.org/10.1007/978-1-4614-6570-6
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-6569-0
Online ISBN: 978-1-4614-6570-6