
Time Series Linear and Nonlinear Models

  • Roberto Baragona
  • Francesco Battaglia
  • Irene Poli
Chapter
Part of the Statistics and Computing book series (SCO)

Abstract

Modeling time series includes the three steps of identification, parameter estimation, and diagnostic checking. As far as linear models are concerned, model building has been studied extensively, and well-established theory and practice allow the user to proceed along reliable guidelines. Ergodicity, stationarity, and Gaussianity are generally assumed to ensure that the structure of a stochastic process can be estimated safely enough from an observed time series. In this chapter we restrict attention to discrete-parameter stochastic processes, that is, collections of random variables indexed by integers interpreted as time points. Such a stochastic process may itself be called a time series, though we shall also use the term for a single finite realization of it.

Real time series data often fail to conform to these hypotheses. We then have to model non-stationary and non-Gaussian time series, which require special assumptions and procedures for identification and estimation, and special statistics for diagnostic checking. Several devices are available that allow such time series to be handled while remaining within the domain of linear models. However, some features prevent us from building linear models able to explain and predict the behavior of a time series correctly: examples are asymmetric limit cycles, jump phenomena, and dependence between amplitude and frequency, none of which can be captured accurately by linear models. Nonlinear models may account for such irregular behavior by allowing the parameters of the model to vary with time. This characteristic feature by itself implies that the stochastic process is not stationary and cannot be reduced to stationarity by any transform; the observed data must therefore be used to fit a model with varying parameters. These parameters may influence either the mean or the variance of the time series, and different classes of nonlinear models are characterized according to their specification. While linear models are defined by a single structure, nonlinear models may be specified by a multiplicity of different structures; accordingly, several classes of nonlinear models have been introduced, each of which may be applied successfully to real time series data commonly observed in well-delimited application fields.

In this chapter we review the contributions of evolutionary computing techniques to linear models, as regards the identification stage and subset models, and, to a rather larger extent, to some classes of nonlinear models, as regards both identification and parameter estimation. Beginning with the popular autoregressive moving-average linear models, we outline the relevant applications of evolutionary computing to threshold models, including piecewise linear, exponential, and autoregressive conditional heteroscedastic structures, as well as bilinear models and artificial neural networks.
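For concreteness, two of the nonlinear structures named above can be written down explicitly. The following display is our own illustration in standard textbook notation (not reproduced from the chapter) of a two-regime self-exciting threshold autoregression (SETAR) and an exponential autoregressive (EXPAR) model, where \( \varepsilon_t \) denotes a white noise sequence:

\[
X_t =
\begin{cases}
\phi_0^{(1)} + \sum_{i=1}^{p} \phi_i^{(1)} X_{t-i} + \varepsilon_t, & X_{t-d} \le r,\\
\phi_0^{(2)} + \sum_{i=1}^{p} \phi_i^{(2)} X_{t-i} + \varepsilon_t, & X_{t-d} > r,
\end{cases}
\qquad
X_t = \sum_{i=1}^{p} \bigl( \phi_i + \pi_i e^{-\gamma X_{t-1}^2} \bigr) X_{t-i} + \varepsilon_t .
\]

In both forms the autoregressive coefficients effectively change with the level of the series, which is the sense in which the parameters are said to vary with time; the delay d, the threshold r, and the scale \( \gamma \) are the nonlinear parameters that make classical estimation awkward and evolutionary search attractive.

To make the evolutionary computing side concrete as well, here is a minimal sketch, assuming NumPy, of how a genetic algorithm can identify a subset autoregressive model: a chromosome is a binary inclusion mask over candidate lags, and the fitness is the AIC of the least-squares fit on the selected lags. All names and settings (ga_subset_ar, population size, mutation rate) are ours for illustration; the chapter's own algorithms are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

def subset_ar_aic(x, lags):
    # Least-squares fit of x_t on x_{t-l}, l in lags; returns AIC up to a constant.
    p_max = max(lags)
    y = x[p_max:]
    X = np.column_stack([x[p_max - l:-l] for l in lags])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    n = len(y)
    return n * np.log(resid @ resid / n) + 2 * len(lags)

def fitness(x, mask):
    # Decode the 0/1 mask into a lag set; empty models are infeasible.
    lags = [l + 1 for l, bit in enumerate(mask) if bit]
    return subset_ar_aic(x, lags) if lags else np.inf

def ga_subset_ar(x, p_max=10, pop_size=30, n_gen=50, p_mut=0.05):
    # Chromosome: 0/1 inclusion mask over lags 1..p_max.
    pop = rng.integers(0, 2, size=(pop_size, p_max))
    for _ in range(n_gen):
        scores = np.array([fitness(x, ind) for ind in pop])
        elite = pop[np.argsort(scores)][: pop_size // 2]   # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, p_max)                   # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(p_max) < p_mut               # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([elite, np.array(children)])
    best = min(pop, key=lambda ind: fitness(x, ind))
    return [l + 1 for l, bit in enumerate(best) if bit]

A call such as ga_subset_ar(x) on an observed series x returns the selected lag set. Selection, crossover, and mutation are deliberately the simplest textbook operators, and any model-selection criterion could replace the AIC in the fitness function.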

Keywords

Root Mean Square Error · ARMA Model · Subset Model · EXPAR Model · Forecast Mean Square Error


Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Roberto Baragona (1)
  • Francesco Battaglia (2)
  • Irene Poli (3)

  1. Department of Communication and Social Research, Sapienza University of Rome, Rome, Italy
  2. Department of Statistical Sciences, Sapienza University of Rome, Rome, Italy
  3. Department of Statistics, Ca’ Foscari University of Venice, Venice, Italy
