A Novel Multiple Support Vector Machines Architecture for Chaotic Time Series Prediction

  • Jian-sheng Qian
  • Jian Cheng
  • Yi-nan Guo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4221)


Inspired by the "divide-and-conquer" principle, which attacks a complex problem by dividing it into simpler subproblems, a two-stage multiple support vector machines (SVMs) architecture is proposed to improve prediction accuracy and generalization performance for chaotic time series prediction. In the first stage, the fuzzy C-means (FCM) clustering algorithm partitions the input dataset into several subsets. In the second stage, multiple SVMs, each with a Gaussian radial basis function kernel and optimized free parameters, are constructed to best fit the partitioned subsets. All models are evaluated on the Mackey-Glass chaotic time series and applied to coal mine gas concentration forecasting in the experiment. The simulation shows that the multiple-SVMs architecture achieves a significant improvement in generalization performance compared with a single SVM model; it also converges faster and uses fewer support vectors.
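The two-stage architecture described above can be sketched as follows. This is not the authors' code: it uses a minimal fuzzy C-means implementation for stage one and scikit-learn's RBF-kernel SVR as a stand-in for the per-subset SVMs in stage two; the cluster count, fuzzifier `m`, and SVR hyperparameters are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVR

def fcm(X, c=3, m=2.0, iters=100, seed=0):
    """Minimal fuzzy C-means: returns membership matrix U (n x c) and cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m                                   # fuzzified memberships
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))              # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return U, centers

def fit_multiple_svms(X, y, c=3):
    """Stage 1: partition inputs with FCM. Stage 2: fit one RBF-kernel SVM per subset."""
    U, centers = fcm(X, c=c)
    labels = U.argmax(axis=1)          # harden memberships into training subsets
    models = []
    for k in range(c):
        idx = labels == k
        if idx.sum() < 2:              # guard: fall back to all data for a tiny cluster
            idx = np.ones(len(X), dtype=bool)
        svm = SVR(kernel="rbf", C=10.0, gamma="scale", epsilon=0.01)  # hypothetical params
        svm.fit(X[idx], y[idx])
        models.append(svm)
    return models, centers

def predict(models, centers, X):
    """Route each query to the SVM of its nearest cluster center."""
    k = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2).argmin(axis=1)
    return np.array([models[ki].predict(x[None, :])[0] for ki, x in zip(k, X)])
```

Routing each test point to the SVM of its nearest FCM center is one plausible reading of the combination step; the paper itself may weight or combine the sub-models differently.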


Keywords: Support Vector Machine, Root Mean Square Error, Cluster Center, Chaotic Time Series, Gaussian Radial Basis Function Kernel





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jian-sheng Qian (1)
  • Jian Cheng (1)
  • Yi-nan Guo (1)

  1. School of Information and Electrical Engineering, China University of Mining and Technology, Xuzhou, China
