Neural Processing Letters

Volume 15, Issue 2, pp 179–195

ε-Descending Support Vector Machines for Financial Time Series Forecasting

  • Francis E. H. Tay
  • L. J. Cao

Abstract

This paper proposes a modified version of support vector machines (SVMs), called ε-descending support vector machines (ε-DSVMs), for modelling non-stationary financial time series. The ε-DSVMs are obtained by incorporating problem-domain knowledge, namely the non-stationarity of financial time series, into SVMs. Unlike standard SVMs, which apply a constant tube size to all training data points, the ε-DSVMs use an adaptive tube to deal with structural changes in the data. The experiments show that the ε-DSVMs generalize better than standard SVMs in forecasting non-stationary financial time series. Another advantage of this modification is that the ε-DSVMs converge to fewer support vectors, resulting in a sparser representation of the solution.

Keywords: non-stationary financial time series · support vector machines · tube size · structural risk minimization principle
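
To make the adaptive-tube idea concrete, the sketch below implements a linear ε-insensitive regressor in which each training point carries its own tube size, descending toward the most recent observations. Older points whose residuals fall inside their (wider) tube contribute no subgradient and drop out of the solution, which is consistent with the sparser representation noted in the abstract. This is a minimal illustration only: the decay schedule in descending_eps, all parameter values, and the subgradient training loop are assumptions, and the paper's actual kernel-based quadratic-programming formulation is not reproduced here.

```python
import numpy as np

def descending_eps(n, eps_max=0.05, eps_min=0.005):
    """Per-sample tube sizes that shrink toward the most recent point.
    The linear decay and the end-point values are illustrative
    assumptions, not the schedule used in the paper."""
    return np.linspace(eps_max, eps_min, n)

def fit_eps_dsvr(X, y, eps, C=10.0, lr=1e-3, epochs=2000):
    """Linear eps-insensitive regression trained by subgradient descent
    on the primal objective

        0.5*||w||^2 + (C/n) * sum_i max(0, |y_i - (w.x_i + b)| - eps_i),

    i.e. the standard SVR loss except that each sample i has its own
    tube size eps[i]. A kernelized QP solver, as in the paper, is not
    shown here."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        r = y - (X @ w + b)                   # residuals
        s = np.sign(r) * (np.abs(r) > eps)    # nonzero only outside own tube
        w -= lr * (w - (C / n) * (X.T @ s))
        b += lr * (C / n) * s.sum()
    return w, b

# Toy usage on a synthetic drifting series (hypothetical data):
rng = np.random.default_rng(0)
series = np.cumsum(0.01 + 0.1 * rng.standard_normal(205))
X = np.stack([series[i:i + 5] for i in range(200)])  # 5 lagged inputs
y = series[5:]
w, b = fit_eps_dsvr(X, y, descending_eps(200))
outside = np.abs(y - (X @ w + b)) >= descending_eps(200)
print("active (support-vector-like) points:", outside.sum(), "of", len(y))
```

In this toy setup, widening the tube on older data means only the more recent points remain on or outside their tube, mimicking how a descending ε concentrates the fitted solution on the latest regime of a non-stationary series.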



Copyright information

© Kluwer Academic Publishers 2002

Authors and Affiliations

  • Francis E. H. Tay (1)
  • L. J. Cao (2)

  1. Department of Mechanical & Production Engineering, National University of Singapore, Singapore
  2. Institute of High Performance Computing, Singapore
