Noise Benefits in Feedback Machine Learning: Bidirectional Backpropagation

  • Bart Kosko
Conference paper
Part of the Understanding Complex Systems book series (UCS)


The new bidirectional backpropagation algorithm converts an ordinary feedforward neural network into a simple feedback dynamical system. The algorithm minimizes a joint performance measure so that training in one direction does not overwrite training in the reverse direction. This involves little extra computation. The forward direction gives the usual classification or regression network. The new backward pass approximates the centroids of the input pattern classes in a neural classifier. The bidirectional algorithm can also approximate inverse point mappings in the rare cases where such mappings exist. Carefully injected noise can speed the convergence of bidirectional backpropagation. This holds because backpropagation is a special case of the expectation-maximization (EM) algorithm for maximum likelihood and because such noise can always boost EM convergence. The noise also tends to improve accuracy in classification and regression.
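The core idea can be sketched in a toy example: one shared weight matrix serves both directions, and gradient descent on the sum of a forward classification error and a backward reconstruction error trains both passes at once, so neither overwrites the other. Everything below is an illustrative assumption, not the paper's implementation: the two-Gaussian data, the single linear layer, and the simple annealed Gaussian noise added to the gradient (the paper's noisy-EM injection obeys a likelihood-based positivity condition rather than this blind perturbation).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian pattern classes in 2-D (hypothetical example).
n = 200
X0 = rng.normal([-2.0, 0.0], 0.5, size=(n, 2))
X1 = rng.normal([2.0, 0.0], 0.5, size=(n, 2))
X = np.vstack([X0, X1])                                   # inputs
T = np.vstack([np.tile([1.0, 0.0], (n, 1)),
               np.tile([0.0, 1.0], (n, 1))])              # one-hot targets

# One weight matrix W serves both directions (bidirectional network):
#   forward  : y     = softmax(x W + b)   -- classification
#   backward : x_hat = t W^T + c          -- should approach class centroids
W = 0.1 * rng.standard_normal((2, 2))
b = np.zeros(2)
c = np.zeros(2)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

lr, noise_scale = 0.05, 0.01
for epoch in range(400):
    Y = softmax(X @ W + b)          # forward pass (cross-entropy error E_f)
    Xhat = T @ W.T + c              # backward pass (squared error E_b)
    # Gradients of the joint error E_f + E_b w.r.t. the SHARED weights:
    dY = (Y - T) / len(X)           # softmax cross-entropy gradient
    dXhat = (Xhat - X) / len(X)
    gW = X.T @ dY + dXhat.T @ T     # both directions update the same W
    gb = dY.sum(axis=0)
    gc = dXhat.sum(axis=0)
    # Simplified annealed noise injection on the gradient (illustrative only).
    gW += noise_scale / (1 + epoch) * rng.standard_normal(gW.shape)
    W -= lr * gW
    b -= lr * gb
    c -= lr * gc

# Forward direction: ordinary classification accuracy.
acc = (softmax(X @ W + b).argmax(1) == T.argmax(1)).mean()
# Backward direction: image of the class-0 label approximates that
# class's centroid (about [-2, 0] for this toy data).
centroid0 = np.array([1.0, 0.0]) @ W.T + c
```

Because the single matrix `W` carries both maps, minimizing only the forward error would let later updates erase the backward structure; summing the two errors is what makes the training bidirectional.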



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Electrical and Computer Engineering Department, Signal and Image Processing Institute, University of Southern California, Los Angeles, USA