Neural Computing & Applications, Volume 14, Issue 3, pp 256–271

Level estimation, classification and probability distribution architectures for trading the EUR/USD exchange rate

  • Andreas Lindemann
  • Christian L. Dunis
  • Paulo Lisboa
Original Article

Abstract

Dunis and Williams (Derivatives: use, trading and regulation 8(3):211–239, 2002; Applied quantitative methods for trading and investment. Wiley, Chichester, 2003) have shown the superiority of a Multi-layer perceptron network (MLP), which outperformed benchmark models such as a moving average convergence divergence technical model (MACD), an autoregressive moving average model (ARMA) and a logistic regression model (LOGIT) on a Euro/Dollar (EUR/USD) time series. The motivation for this paper is to investigate the use of different neural network architectures. This is done by benchmarking three different neural network designs representing a level estimator, a classification model and a probability distribution predictor. More specifically, we present the Multi-layer perceptron network, the Softmax cross entropy model and the Gaussian mixture model and benchmark their respective performance on the Euro/Dollar (EUR/USD) time series as reported by Dunis and Williams. As it turns out, the Multi-layer perceptron does best when used without confirmation filters and leverage, while the Softmax cross entropy model and the Gaussian mixture model outperform the Multi-layer perceptron when more sophisticated trading strategies and leverage are used. This might be due to the ability of the two probability-distribution-based models to successfully identify trades with a high Sharpe ratio.
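
To make the contrast between the three architectures concrete, the sketch below (Python/NumPy, not the authors' implementation) shows how a level-estimation output, a Softmax classification output and a Gaussian mixture density output can be attached to the same hidden layer, together with a simple confirmation-style trading rule on the predicted density. All layer sizes, the two-component mixture and the signal thresholds are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of the three output heads
# discussed above: a level estimator (MLP-style), a Softmax cross entropy
# classifier and a Gaussian mixture density predictor (cf. Bishop 1994).
# Layer sizes, the two-component mixture and the thresholds are assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def level_head(h, w, b):
    """Level estimation: a single linear output for the next-period return."""
    return h @ w + b

def softmax_head(h, W, b):
    """Classification: softmax over {down, up}, trained with cross-entropy."""
    z = h @ W + b
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def gaussian_mixture_head(h, W_pi, W_mu, W_sigma):
    """Probability distribution prediction: mixture weights, means and spreads
    parameterise a conditional density of the next-period return."""
    z = h @ W_pi
    pi = np.exp(z - z.max(axis=-1, keepdims=True))
    pi /= pi.sum(axis=-1, keepdims=True)      # mixture weights sum to one
    mu = h @ W_mu                             # component means
    sigma = np.exp(h @ W_sigma)               # strictly positive spreads
    return pi, mu, sigma

# Toy hidden activations: 5 observations, 8 hidden units.
h = rng.standard_normal((5, 8))
level = level_head(h, rng.standard_normal(8), 0.0)
probs = softmax_head(h, rng.standard_normal((8, 2)), np.zeros(2))
pi, mu, sigma = gaussian_mixture_head(
    h, *(rng.standard_normal((8, 2)) for _ in range(3)))

# Confirmation-style rule: go long (short) only when the predicted density puts
# comfortably more (less) than half of its mass on a positive return.
p_up = (pi * (1.0 - norm.cdf(0.0, loc=mu, scale=sigma))).sum(axis=-1)
signal = np.where(p_up > 0.55, 1, np.where(p_up < 0.45, -1, 0))
print(level.round(3), probs.round(3), signal)
```

In the paper's terms, the classifier and the mixture model return more than a point estimate, which is what makes threshold-based confirmation filters and selective leverage straightforward to apply to their forecasts.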

Keywords

Confirmation filters · Gaussian mixture models · Leverage · Multi-layer perceptron networks · Probability distribution · Softmax cross entropy networks · Trading strategy

References

  1. Bishop CM (1994) Mixture density networks. Internal report NCRG/4288, Department of Computer Science and Applied Mathematics, Aston University, Birmingham
  2. Dempster AP, Laird NM, Rubin DB (1977) Maximum likelihood from incomplete data via the EM algorithm. J R Statist Soc B 39(1):1–38
  3. Dunis C, Williams M (2002) Modelling and trading the EUR/USD exchange rate: do neural network models perform better? Derivatives: Use, Trading and Regulation 8(3):211–239
  4. Dunis C, Williams M (2003) Applications of advanced regression analysis for trading and investment. In: Dunis C, Laws J, Naïm P (eds) Applied quantitative methods for trading and investment. Wiley, Chichester
  5. Dunne RA, Campbell NA (1997) On the pairing of the Softmax activation and cross-entropy penalty functions and the derivation of the Softmax activation function. In: Proceedings of the 8th Australian conference on neural networks, Melbourne, pp 181–185
  6. Husmeier D (1997) Modelling conditional probability densities with neural networks. PhD thesis, Department of Mathematics, King's College London, http://www.bioss.sari.ac.uk/~dirk/papers/phd_thesis.ps.gz
  7. Husmeier D (1999) Neural networks for conditional probability estimation—forecasting beyond point predictions (Perspectives in Neural Computing). Springer, London
  8. Igelnik B, Pao YH (1995) Stochastic choice of basis functions in adaptive functional approximation and the functional-link net. IEEE Trans Neural Networks 6:1320–1329
  9. Jordan MI, Jacobs RA (1994) Hierarchical mixtures of experts and the EM algorithm. Neural Comput 6:181–214
  10. Kaastra I, Boyd M (1996) Designing a neural network for forecasting financial and economic time series. Neurocomputing 10:215–236
  11. Lisboa PJG, Vellido A (2000) Business applications of neural networks. In: Lisboa PJG, Edisbury B, Vellido A (eds) Business applications of neural networks: the state-of-the-art of real-world applications. World Scientific, Singapore, pp vii–xxii
  12. MacKay DJC (1992) Bayesian interpolation. Neural Comput 4:415–447
  13. Neuneier R, Hergert F, Finnoff W, Ormoneit D (1994) Estimation of conditional densities: a comparison of neural network approaches. In: Marinaro M, Morasso P (eds) Springer, Berlin Heidelberg New York, pp 689–692
  14. Shapiro AF (2000) A Hitchhiker's guide to the techniques of adaptive nonlinear models. Insurance Math Econ 26:119–132
  15. Weigend A, Nix AN (1994) Predictions with confidence intervals (local error bars). In: Proceedings of the international conference on neural information processing, pp 847–852
  16. Weigend AS, Srivastava AN (1995) Predicting conditional probability distributions: a connectionist approach. Int J Neural Syst 6(2):109–118

Copyright information

© Springer-Verlag London Limited 2005

Authors and Affiliations

  • Andreas Lindemann (1, 3)
  • Christian L. Dunis (1, 3)
  • Paulo Lisboa (2)
  1. Liverpool School of Accounting, Finance & Economics, Liverpool John Moores University, Liverpool, UK
  2. School of Computing and Mathematical Sciences, Liverpool John Moores University, Liverpool, UK
  3. Centre for International Banking, Economics and Finance, JMU, Liverpool, UK