Improving the performance and accuracy of time series modeling based on autonomic computing systems

  • Original Research
  • Published in: Journal of Ambient Intelligence and Humanized Computing

Abstract

The challenges involved in integrating, maintaining and administering real-world systems have motivated the Autonomic Computing area, which aims at making systems capable of self-managing their tasks, avoiding or minimizing human interference. The self-managing aspects of Autonomic Computing are provided by the Autonomic Manager, which relies on a structure called the control loop. This loop is responsible for monitoring components, analyzing information and making decisions to optimize, configure, heal and protect systems. Among these steps, analysis is probably the most complex, because it needs to model data, infer situations, estimate behavior or states, and make predictions. To carry out the analysis, we need to study how systems produce information over time (here called time series); by modeling such information, we can understand the behavior of system components, their transitions and their relations to other systems, and make estimations and predictions. Many techniques have been designed to model such temporal information, most of them involving statistics and artificial neural networks, which require specific configurations based on series properties, something that may not be feasible for online systems. This scenario motivated this paper, which proposes an approach to simplify and improve the accuracy of modeling, estimating and predicting time series. The approach applies chaos theory tools to unfold time series and organize them into multidimensional vectors that better represent the interdependencies among observations. Techniques such as artificial neural networks (e.g., RBF, RRBF, TDNN and ATNN) and statistical tools (e.g., Markov models) can take advantage of this reorganization and, therefore, improve the accuracy of modeling, estimation and prediction. Experimental results confirm that the proposed approach provides good accuracy at very low computational cost.
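
The unfolding mentioned in the abstract corresponds to a time-delay (phase-space) reconstruction in the sense of Takens (1980). As a minimal illustrative sketch, not the paper's implementation, the Python code below unfolds a scalar series into m-dimensional delay vectors; the delay tau and the dimension m are assumed to have been chosen beforehand, for instance via auto mutual information (Fraser and Swinney 1986) and false nearest neighbors (Kennel et al. 1992a). The function name delay_embed and the parameter values are illustrative only.

import numpy as np

def delay_embed(series, m, tau):
    """Unfold a scalar series into vectors [x(t), x(t+tau), ..., x(t+(m-1)*tau)]."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (m - 1) * tau
    if n <= 0:
        raise ValueError("series too short for the chosen m and tau")
    return np.column_stack([series[i * tau : i * tau + n] for i in range(m)])

# Example with hypothetical parameters m=3, tau=5: each resulting row can feed
# an RBF/TDNN input layer or be quantized into a Markov-model state.
x = np.sin(0.1 * np.arange(500)) + 0.01 * np.random.randn(500)
vectors = delay_embed(x, m=3, tau=5)
print(vectors.shape)  # (490, 3)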

Notes

  1. The dataset considered is available in the book Chaos and Order in the Capital Markets, second edition, by Edgar Peters.

  2. This Lyapunov exponent was computed using 10 iterations of the lyap_k command available in the TISEAN package (Hegger et al. 2009); the number of iterations is set through the command's parameters (a rough sketch of a comparable estimator is given after these notes).

  3. This table shows more observations than Table 2; it is only an unfolding example used to obtain the rule, or function, that generates this dynamical system.
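
Note 2 relies on TISEAN's lyap_k; as a rough, hedged alternative (not the procedure used in the paper), the sketch below estimates the largest Lyapunov exponent of an already embedded series through the mean log-divergence of nearest-neighbor trajectories, in the spirit of Rosenstein et al. (1993). The names largest_lyapunov, min_sep and k_max are hypothetical.

import numpy as np

def largest_lyapunov(vectors, dt=1.0, min_sep=10, k_max=10):
    """Slope of the mean log-divergence of nearest-neighbor trajectories."""
    vectors = np.asarray(vectors, dtype=float)
    n = len(vectors)
    # pairwise distances between embedded points (O(n^2) memory; a sketch only)
    d = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    # exclude temporally close neighbors (and each point itself)
    for i in range(n):
        d[i, max(0, i - min_sep):min(n, i + min_sep + 1)] = np.inf
    nn = np.argmin(d, axis=1)  # nearest neighbor of each point
    mean_log_div = []
    for k in range(1, k_max + 1):
        valid = (np.arange(n) + k < n) & (nn + k < n)
        dist = np.linalg.norm(vectors[np.arange(n)[valid] + k]
                              - vectors[nn[valid] + k], axis=1)
        mean_log_div.append(np.mean(np.log(dist[dist > 0])))
    # the slope of mean log-divergence versus time approximates the exponent
    slope, _ = np.polyfit(dt * np.arange(1, k_max + 1), mean_log_div, 1)
    return slope

# Example (hypothetical parameters), reusing the delay_embed sketch above:
# lam = largest_lyapunov(delay_embed(x, m=3, tau=5))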

References

  • Albertini MK, Mello RF (2007) A self-organizing neural network for detecting novelties. In: SAC ’07: Proceedings of the 2007 ACM symposium on Applied computing, ACM, New York, NY, USA, pp 462–466. doi:10.1145/1244002.1244110

  • Abarbanel HDI, Brown R, Sidorowich JJ, Tsimring LS (1993) The analysis of observed chaotic data in physical systems. Rev Mod Phys 65:1331–1392

  • Alligood KT, Sauer TD, Yorke JA (1997) Chaos: an introduction to dynamical systems, Springer, New York

  • Bishop C (1991) Improving the generalization properties of radial basis function neural networks. Neural Comput 3(4):579–588. doi:10.1162/neco.1991.3.4.579

  • Bretscher O (2004) Linear Algebra with applications. Prentice Hall, Englewood Cliffs

  • Buhmann MD (2003) Radial basis functions: theory and implementations. In: Cambridge monographs on applied and computational mathematics. Cambridge University Press, Cambridge, pp x+259

  • Bianchi G, Tinnirello I (2003) Kalman filter estimation of the number of competing terminals in an IEEE 802.11 network. In: INFOCOM 2003. Twenty-Second Annual Joint Conference of the IEEE Computer and Communications Societies, vol 2, pp 844–852

  • Box G, Jenkins GM, Reinsel G (1994) Time series analysis: forecasting and control, 3rd edn. Prentice Hall, Englewood Cliffs

  • Casdagli M (1989) Nonlinear prediction of chaotic time series. Physica D Nonlinear Phenomena 35:335–356. doi:10.1016/0167-2789(89)90074-2

  • Cai M, Cai F, Shi A, Zhou B, Zhang Y (2004) Chaotic time series prediction based on local-region multi-steps forecasting model. In: ISNN (2), pp 418–423

  • Doblinger G (1998) An adaptive Kalman filter for the enhancement of noisy AR signals. In: IEEE international symposium on circuits and systems, Monterey, CA, USA, pp 1–4

  • Day S, Davenport M (1993) Continuous-time temporal back-propagation with adaptable time delays. IEEE Trans Neural Netw 4(2):348–354. doi:10.1109/72.207622

  • de Castro LN, Von Zuben FJ (2001) An immunological approach to initialize centers of radial basis function neural networks. In: Proc. of 5th Brazilian conference on Neural Networks, pp 79–84

  • Edmonds AN (1996) Time series prediction using supervised learning and tools from chaos theory, Ph.D. thesis, University of Luton

  • Elmer F-J (1998) The Lyapunov exponent. http://monet.unibas.ch/elmer/pendulum/lyapexp.ht. Accessed 10 Sep 2007

  • Elert G (2005) The Chaos hypertextbook: measuring Chaos. http://hypertextbook.com/chaos. Accessed 10 Sep 2007

  • Eckmann J-P, Ruelle D (1985) Ergodic theory of chaos and strange attractors. Rev Mod Phys 57:617–656. doi:10.1103/RevModPhys.57.617

  • Feller W (1968) An introduction to probability theory and its applications, 3rd edn, vol 1. Wiley, New York

  • Fogel D (1991) An information criterion for optimal neural network selection. IEEE Trans Neural Netw 2(5):490–497. doi:10.1109/72.134286

  • Fraser AM, Swinney HL (1986) Independent coordinates for strange attractors from mutual information. Phys Rev A 33(2):1134–1140. doi:10.1103/PhysRevA.33.1134

  • Gibbs WW (2002) Autonomic Computing. Scientific American. http://www.sciam.com/article.cfm?id=autonomic-computin. Accessed 10 Oct 2008

  • Gholipour A, Araabi BN, Lucas C (2006) Predicting chaotic time series using neural and neurofuzzy models: a comparative study. Neural Process Lett 24(3):217–239. doi:10.1007/s11063-006-9021-x

  • Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780. http://citeseer.ist.psu.edu/hochreiter96long.html

  • Hegger R, Kantz H, Schreiber T (2009) Tisean 3.0.1: nonlinear time series analysis. http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html. Accessed 26 Jan 2009

  • Kaplan I (2003) Estimating the Hurst exponent. http://www.bearcave.com/misl/misl_tech/wavelets/hurst/index.html. Accessed 10 Sep 2007

  • Kennel M (2002) The multiple-dimensions mutual information program. http://www-ncsl.postech.ac.kr/en/softwares/archives/mmi.tar. Accessed 10 Sep 2007

  • Klenke A (2008) Probability theory: a comprehensive course, Springer, London

  • Kohler M (1997) Using the Kalman filter to track human interactive motion: modelling and initialization of the Kalman filter for translational motion. Tech. Rep. 629/1997, Fachbereich Informatik, Universität Dortmund, 44221 Dortmund, Germany

  • Kennel MB, Brown R, Abarbanel HDI (1992a) Determining embedding dimension for phase-space reconstruction using a geometrical construction. Phys Rev A 45(6):3403–3411. doi:10.1103/PhysRevA.45.3403

  • Kennel M, Brown R, Abarbanel H (1992b) Determining embedding dimension for phase space reconstruction using the method of false nearest neighbors. Tech rep, Institute for Nonlinear Science and Department of Physics, University of California, San Diego, Mail Code R-002, La Jolla, CA, USA

  • Kumar U, Prakash A, Jain VK (2009) A multivariate time series approach to study the interdependence among O3, NOx, and VOCs in ambient urban atmosphere. Environ Model Assess 14(5):631–643

  • Lorenz E (1963) Deterministic nonperiodic flow. J Atmos Sci 20:130–141

  • Leung H, Lo T, Wang S (2001) Prediction of noisy chaotic time series using an optimal radial basis function neural network. IEEE Trans Neural Netw 12(5):1163–1172. doi:10.1109/72.950144

  • Liebert W, Pawelzik K, Schuster HG (1991) Optimal embeddings of chaotic attractors from topological considerations. Europhys Lett 14(6):521–526

  • Lin D-T, Dayhoff JE, Ligomenides PA (1995a) Trajectory production with the adaptive time-delay neural network. Neural Netw 8(3):447–461. doi:10.1016/0893-6080(94)00104-T

  • Lin D-T, Dayhoff JE, Ligomenides PA (1995b) Trajectory production with the adaptive time-delay neural network. Neural Netw 8(3):447–461. doi:10.1016/0893-6080(94)00104-T

  • Mañé R (1981) On the dimension of the compact invariant sets of certain nonlinear maps. In: Dynamical systems and turbulence, Warwick 1980, Lecture notes in mathematics 898. Springer, New York, pp 230–242

  • Meng A (2003) An introduction to Markov and hidden Markov models. http://www2.imm.dtu.dk/pubdb/p.php?3313

  • Milunovich G (2006) Modelling and valuing multivariate interdependencies in financial time series, Ph.D. thesis, School of Economics, University of New South Wales

  • Murch R (2004) Autonomic Computing. IBM Press, Upper Saddle River

  • Medio A, Gallo G (1992) Chaotic dynamics: theory and applications to economics. Cambridge University Press, Cambridge

  • Maas HLD, Verschure PF, Molenaar PC (1990) A note on chaotic behavior in simple neural networks. Neural Netw 3(1):119–122. doi:10.1016/0893-6080(90)90050-U

  • Palis J Jr, Melo W (1978) Introdução aos Sistemas Dinâmicos (Introduction to dynamical systems), 1st edn. Edgard Blücher

  • Park J, Sandberg IW (1991) Universal approximation using radial-basis-function networks. Neural Comput 3(2):246–257. doi:10.1162/neco.1991.3.2.246

  • Parashar M, Hariri S (2006) Autonomic computing: concepts, infrastructure, and applications. CRC Press, Boca Raton

  • Penny W, Harrison L (2006) Multivariate autoregressive models. In: Friston K, Ashburner J, Kiebel S, Nichols T, Penny W (eds) Statistical parametric mapping: the analysis of functional brain images. Elsevier, London

  • Piovoso M, Laplante PA (2003) Kalman filter recipes for real-time image processing. Real Time Imaging 9(6):433–439. doi:10.1016/j.rti.2003.09.005

  • Pilgram B, Judd K, Mees A (2002) Modelling the dynamics of nonlinear time series using canonical variate analysis. Physica D 170(2):103–117. doi:10.1016/S0167-2789(02)00534-1

  • Rafajlowicz E, Pawlak M (2004) Optimization of centers’ positions for RBF nets with generalized kernels. In: ICAISC, pp 253–259

  • Rosenstein MT, Collins JJ, Luca CJD (1993) A practical method for calculating largest Lyapunov exponents from small data sets. Physica D 65:117–134. http://citeseer.ist.psu.edu/rosenstein93practical.html

  • Ribert A, Stocker E, Lecourtier Y, Ennaji A (1997) Optimizing a neural network architecture with an adaptive parameter genetic algorithm. In: IWANN ’97: Proceedings of the international work-conference on artificial and natural neural networks, Springer, London, UK, pp 527–535

  • Shenshi G, Zhiqian W, Jitai C (1999) The fractal research and predicating on the time series of sunspot relative number. Appl Math Mech 20(1):84–89. doi:10.1007/BF02459277

  • Takens F (1980) Detecting strange attractors in turbulence. In: Dynamical systems and turbulence, Springer, Berlin, pp 366–381. doi:10.1007/BFb0091924

  • Wan E (1997) Time series prediction by using a connectionist network with internal delay lines. In: Time series prediction, Addison-Wesley, Reading, pp 195–217

  • Wan EA, Van Der Merwe R (2000) The unscented Kalman filter for nonlinear estimation. In: The IEEE 2000 Adaptive systems for signal processing, communications, and control symposium 2000 (AS-SPCC), pp 153–158

  • Whitney H (1936) Differentiable manifolds. Ann Math 37(3):645–680

  • Waibel A, Hanazawa T, Hinton G, Shikano K, Lang K (1989) Phoneme recognition using time delay neural networks. IEEE Trans Acoust 37:328–339

  • Zeng M, Cheng Z (2009) A method using genetic algorithm to optimize neural networks applied in sustainable development ability appraisal. In: ICECT '09: Proceedings of the 2009 international conference on electronic computer technology. IEEE Computer Society, Washington, DC, USA, pp 519–523. doi:10.1109/ICECT.2009.158

  • Zeevi AJ, Meir R, Adler RJ (1998) Non-linear models for time series using mixtures of autoregressive models. Tech rep

  • Zemouri R, Racoceanu D, Zerhouni N (2003) Recurrent radial basis function network for time-series prediction. Eng Appl Artif Intell 16(5–6):453–463

Author information

Corresponding author

Correspondence to Rodrigo Fernandes de Mello.

Cite this article

de Mello, R.F. Improving the performance and accuracy of time series modeling based on autonomic computing systems. J Ambient Intell Human Comput 2, 11–33 (2011). https://doi.org/10.1007/s12652-010-0028-9
