Prediction of Chaotic Time Series Based on Multi-scale Gaussian Processes
This paper considers the prediction of chaotic time series using a proposed multi-scale Gaussian process (MGP) model, an extension of the classical Gaussian process (GP) model. Unlike GP, which spends much time searching for optimal hyperparameters, MGP employs a covariance function constructed from a scaling function with its different dilations and translations, so that the optimal hyperparameter is easy to determine. Moreover, the scaling function with its dilations and translations forms a complete set of bases, enabling MGP to achieve better prediction performance than GP. The effectiveness of MGP is evaluated on a simulated Mackey-Glass series as well as a real-world electric load series. Results show that the proposed model outperforms GP in prediction performance and takes much less time to determine the hyperparameter. Results also show that the performance of MGP is competitive with the support vector machine (SVM), and that both outperform radial basis function (RBF) networks.
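To illustrate the idea of a covariance built from a scaling function with different dilations and translations, the following is a minimal sketch, not the paper's actual construction: it assumes a hypothetical Gaussian-bump scaling function `phi`, and sums products of its dilated and translated copies, which yields a symmetric positive semidefinite kernel by construction. The scale and translation ranges are illustrative choices.

```python
import numpy as np

def phi(x):
    # Hypothetical scaling function: a simple Gaussian bump.
    return np.exp(-x**2)

def mgp_covariance(x1, x2, scales=(0, 1, 2), translations=range(-5, 6)):
    """Illustrative multi-scale covariance:
    k(x, x') = sum_j sum_n phi(2^j x - n) * phi(2^j x' - n).
    A sum of outer products of basis evaluations, hence PSD.
    """
    k = 0.0
    for j in scales:
        for n in translations:
            k += phi(2**j * x1 - n) * phi(2**j * x2 - n)
    return k

# Build the covariance (Gram) matrix over a few training inputs.
X = np.linspace(0.0, 1.0, 5)
K = np.array([[mgp_covariance(a, b) for b in X] for a in X])
```

Because each term is a product of the same basis evaluated at the two inputs, the resulting Gram matrix is symmetric and positive semidefinite, so it can be used directly in standard GP regression machinery.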
Keywords: Support Vector Machine, Mean Square Error, Radial Basis Function, Covariance Function, Gaussian Process