Time Series Analysis Using Fractal Theory and Online Ensemble Classifiers

  • Dalton Lunga
  • Tshilidzi Marwala
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4304)

Abstract

Fractal analysis is proposed as a means of establishing the degree of persistence and self-similarity within stock market data. The analysis is carried out using the rescaled range (R/S) method. The outcome of the R/S analysis is then applied to an online incremental learning algorithm (Learn++) that is built to classify the direction of movement of the stock market. The use of fractal geometry in this study provides a way of quantifying the extent to which the time series data can be predicted. In an extensive test, it is demonstrated that R/S analysis is a very sensitive method for revealing hidden long-run and short-run memory trends within the sample data. The time series data measured to be persistent are used to train the neural network. The results from the Learn++ algorithm show that the neural network classifies the sample data accurately with a very high level of confidence.
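The core of the R/S method is to compare the range of cumulative deviations of a series against its standard deviation over increasing window lengths; the slope of log(R/S) versus log(window length) estimates the Hurst exponent H, with H > 0.5 indicating persistence. The sketch below is a minimal illustration of classical rescaled range analysis in Python, not the authors' implementation; the function names, window lengths, and the synthetic input are illustrative assumptions.

    import numpy as np

    def rescaled_range(series, window):
        # Average R/S statistic over non-overlapping windows of a fixed length.
        rs_values = []
        for start in range(0, len(series) - window + 1, window):
            chunk = series[start:start + window]
            dev = chunk - chunk.mean()      # deviations from the window mean
            cum = np.cumsum(dev)            # cumulative deviation profile
            r = cum.max() - cum.min()       # range of the cumulative deviations
            s = chunk.std()                 # standard deviation of the window
            if s > 0:
                rs_values.append(r / s)
        return np.mean(rs_values)

    def hurst_exponent(series, windows=(8, 16, 32, 64, 128, 256)):
        # Estimate H as the slope of log(R/S) against log(window length).
        series = np.asarray(series, dtype=float)
        usable = [w for w in windows if w <= len(series) // 2]
        log_n = np.log(usable)
        log_rs = np.log([rescaled_range(series, w) for w in usable])
        slope, _ = np.polyfit(log_n, log_rs, 1)
        return slope

    # H close to 0.5 suggests a memoryless random walk, H > 0.5 a persistent
    # (trending) series, and H < 0.5 an anti-persistent (mean-reverting) one.
    returns = np.random.randn(2048)   # illustrative white-noise stand-in for returns
    print(f"Estimated Hurst exponent: {hurst_exponent(returns):.3f}")

In practice, longer series and more window lengths give a more stable estimate of H, and it is this measured degree of persistence that determines which data are considered predictable enough to pass on to the classifier.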

Keywords

Fractal Dimension · Time Series Data · Hurst Exponent · Fractal Theory

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Dalton Lunga (1)
  • Tshilidzi Marwala (1)

  1. School of Electrical and Information Engineering, University of the Witwatersrand, Johannesburg, South Africa
