
Prediction of Chaotic Time Series Based on Multi-scale Gaussian Processes

  • Yatong Zhou
  • Taiyi Zhang
  • Xiaohe Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4224)

Abstract

This paper considers the prediction of chaotic time series with the proposed multi-scale Gaussian process (MGP) model, an extension of the classical Gaussian process (GP) model. Whereas the GP spends much time searching for its optimal hyperparameters, the MGP employs a covariance function constructed from a scaling function together with its different dilations and translations, so that the optimal hyperparameter is easy to determine. Moreover, the scaling function with its different dilations and translations forms a set of complete bases, which enables the MGP to achieve better prediction performance than the GP. The effectiveness of the MGP is evaluated on the simulated Mackey-Glass series as well as a real-world electric load series. Results show that the proposed model outperforms the GP in prediction performance and takes much less time to determine its hyperparameter. Results also show that the performance of the MGP is competitive with the support vector machine (SVM), and that both give better performance than radial basis function (RBF) networks.
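The abstract does not spell out the exact form of the covariance; the following is a minimal sketch of the construction it describes, assuming a Gaussian-shaped scaling function, dyadic dilations 2^j with integer translations covering [0, 1], and a single variance hyperparameter. The names phi, mgp_covariance, and gp_predict are illustrative, not taken from the paper.

```python
import numpy as np

def phi(u):
    """Hypothetical scaling function; a Gaussian-shaped bump is assumed here."""
    return np.exp(-0.5 * u ** 2)

def mgp_covariance(x1, x2, scales=(0, 1, 2), sigma2=1.0):
    """Multi-scale covariance: k(x, x') = sigma2 * sum_{j,t} phi(2^j x - t) phi(2^j x' - t)."""
    k = np.zeros((len(x1), len(x2)))
    for j in scales:
        for t in range(2 ** j + 1):          # translations covering [0, 1] at scale j
            f1 = phi(2.0 ** j * x1 - t)      # basis responses of the two input sets
            f2 = phi(2.0 ** j * x2 - t)
            k += np.outer(f1, f2)
    return sigma2 * k

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Standard GP predictive mean under the multi-scale covariance with noise variance `noise`."""
    K = mgp_covariance(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = mgp_covariance(x_test, x_train)
    return Ks @ np.linalg.solve(K, y_train)

# Toy usage on a smooth signal.
x = np.linspace(0.0, 1.0, 50)
y = np.sin(6.0 * x)
print(gp_predict(x, y, np.array([0.25, 0.75])))
```

Because the multi-scale bases are fixed once the scaling function is chosen, only the variance and noise terms remain to be tuned, which is consistent with the abstract's claim that the MGP hyperparameter is easier to determine than the GP's.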

Keywords

Support Vector Machine · Mean Square Error · Radial Basis Function · Covariance Function · Gaussian Process



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yatong Zhou (1)
  • Taiyi Zhang (1)
  • Xiaohe Li (1)

  1. Dept. of Information and Communication Engineering, Xi’an Jiaotong University, Xi’an, P.R. China
