Predicting Nonstationary Time Series with Multi-scale Gaussian Processes Model

  • Yatong Zhou
  • Taiyi Zhang
  • Xiaohe Li
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4232)

Abstract

The Gaussian processes (GP) model has been successfully applied to the prediction of nonstationary time series. Because the model's covariance function contains undetermined hyperparameters, finding their maximum-likelihood values typically suffers from either sensitivity to initial conditions or a large computational cost. To overcome these pitfalls, and at the same time to achieve better prediction performance, a novel multi-scale Gaussian processes (MGP) model is proposed in this paper. In the MGP model, the covariance function is constructed from a scaling function together with its different dilations and translations, which makes the optimal value of the hyperparameter easy to determine. Although more time is spent computing the covariance function, MGP takes much less time to determine the hyperparameter, so its total training time is competitive with that of GP. Experiments demonstrate that the prediction performance of MGP is better than that of GP. The experiments also show that MGP and the support vector machine (SVM) perform comparably, and that both outperform radial basis function (RBF) networks.
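
The abstract does not spell out the covariance construction, but the idea can be illustrated. Below is a minimal sketch in Python, assuming a hat (linear B-spline) scaling function phi and a covariance of the form K(x, x') = sum_j 2^{-j} sum_k phi(2^{-j}x - k) phi(2^{-j}x' - k); the function names (scaling_phi, mgp_covariance, gp_predict) and all grid and noise parameters are hypothetical choices, not the authors' implementation.

```python
import numpy as np

def scaling_phi(u):
    # Hat (linear B-spline) scaling function -- a stand-in for whichever
    # scaling function the paper actually uses.
    return np.maximum(1.0 - np.abs(u), 0.0)

def mgp_covariance(x1, x2, levels=3, translations=range(-4, 12)):
    # Multi-scale covariance built from dilations (j) and translations (k)
    # of the scaling function:
    #   K(x, x') = sum_j 2^{-j} sum_k phi(2^{-j} x - k) phi(2^{-j} x' - k)
    # As a sum of outer products, K is positive semidefinite by construction.
    K = np.zeros((len(x1), len(x2)))
    for j in range(levels):
        s = 2.0 ** (-j)
        for k in translations:
            K += s * np.outer(scaling_phi(s * x1 - k), scaling_phi(s * x2 - k))
    return K

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    # Standard GP regression equations, using the multi-scale kernel above.
    K = mgp_covariance(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = mgp_covariance(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)            # K^{-1} y
    mean = K_star @ alpha                          # predictive mean
    v = np.linalg.solve(K, K_star.T)
    var = np.diag(mgp_covariance(x_test, x_test)) - np.sum(K_star * v.T, axis=1)
    return mean, var

# Toy usage: predicting the tail of a nonstationary (chirp-like) series.
t = np.linspace(0.0, 10.0, 200)
y = np.sin(t * t / 4.0)                            # frequency increases with t
mean, var = gp_predict(t[:150], y[:150], t[150:])
```

Because this kernel is a fixed sum of outer products, the only quantities left to tune are the noise level and the number of dilation levels, which is consistent with the abstract's claim that the hyperparameter is easy to determine.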

Keywords

Support Vector Machine · Mean Square Error · Covariance Function · Gaussian Process · Conjugate Gradient Method

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yatong Zhou ¹
  • Taiyi Zhang ¹
  • Xiaohe Li ¹

  1. Dept. of Information and Communication Engineering, Xi’an Jiaotong University, Xi’an, P.R. China
