
An Adaptive Neural Mechanism with a Lizard Ear Model for Binaural Acoustic Tracking

  • Danish Shaikh
  • Poramate Manoonpong
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9825)

Abstract

Acoustic tracking of a moving sound source is relevant in many domains, including robotic phonotaxis and human-robot interaction. Typical approaches rely on processing time-difference-of-arrival cues obtained via multi-microphone arrays with Kalman or particle filters, or other computationally expensive algorithms. We present a novel bio-inspired solution to acoustic tracking that uses only two microphones. The system is based on a neural mechanism coupled with a model of the peripheral auditory system of lizards. The peripheral auditory model provides sound direction information, which the neural mechanism uses to learn the target's velocity via fast correlation-based unsupervised learning. Simulation results for tracking a pure-tone acoustic target moving along a semi-circular trajectory validate our approach, using three separate trials with three different angular velocities. A comparison with a Braitenberg vehicle-like steering strategy shows the improved performance of our learning-based approach.
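
The abstract describes the approach only at a high level. Below is a minimal Python sketch of the general idea rather than the authors' exact mechanism: a stand-in for the lizard-ear direction cue (here simply the sine of the bearing) drives an immediate corrective turn, while a correlation-based weight update adapts a feed-forward term that comes to encode the target's angular velocity. All names, constants, and the simplified direction cue are illustrative assumptions.

```python
import math

# Toy sketch (not the authors' exact model) of correlation-based learning for
# binaural tracking: a fixed reflex gain turns the listener toward the sound,
# while a learned feed-forward term converges toward the target's angular
# velocity, so the bearing error settles near zero.

def direction_cue(bearing_rad):
    """Stand-in for the lizard peripheral auditory model: a signed cue that
    grows monotonically with the target's bearing relative to the midline
    (assumption; the real model yields a frequency-dependent binaural
    response difference)."""
    return math.sin(bearing_rad)

dt = 0.01           # simulation step [s]
mu = 0.5            # learning rate (assumed)
k_reflex = 1.0      # fixed reflex steering gain (assumed)
w_ff = 0.0          # learned feed-forward turning velocity [rad/s]

target_vel = 0.4    # target angular velocity along its arc [rad/s] (assumed)
target_angle = 0.3  # initial target bearing offset [rad]
heading = 0.0       # listener heading [rad]

for _ in range(5000):                        # 50 s of simulated tracking
    target_angle += target_vel * dt          # target moves along its trajectory
    reflex = direction_cue(target_angle - heading)

    # Correlation-based update: a tonically active predictive unit is
    # correlated with the reflex (direction) signal, so w_ff accumulates the
    # residual bearing error and drifts toward the target's velocity.
    w_ff += mu * reflex * dt

    # Steering command: immediate reflex correction + learned feed-forward term
    heading += (k_reflex * reflex + w_ff) * dt

print(f"learned feed-forward velocity ≈ {w_ff:.3f} rad/s (target: {target_vel})")
print(f"residual bearing ≈ {target_angle - heading:.4f} rad")
```

In this toy setup the learned term plays the role the abstract assigns to the neural mechanism: once it matches the target's velocity, the reflex signal stays near zero and the listener tracks the source without lag.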

Keywords

Binaural acoustic tracking · Correlation learning · Lizard peripheral auditory system

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Embodied AI and Neurorobotics Lab, Centre for BioRobotics, Maersk Mc-Kinney Moeller Institute, University of Southern Denmark, Odense M, Denmark