Journal of Comparative Physiology A, Volume 197, Issue 9, pp 915–921

Sound imaging of nocturnal animal calls in their natural habitat

  • Takeshi Mizumoto (corresponding author)
  • Ikkyu Aihara
  • Takuma Otsuka
  • Ryu Takeda
  • Kazuyuki Aihara
  • Hiroshi G. Okuno
Original Paper


We present a novel method for imaging acoustic communication among nocturnal animals. Investigating the spatio-temporal calling behavior of nocturnal animals, e.g., frogs and crickets, has been difficult because many individuals' calls must be distinguished in noisy environments without visual contact. Our method visualizes the spatial and temporal dynamics of calling using dozens of sound-to-light conversion devices (called "Fireflies") and an off-the-shelf video camera. Each Firefly, which consists of a microphone and a light-emitting diode, emits light when it captures nearby sound. By deploying dozens of Fireflies in a target area, we record the calls of multiple individuals through the video camera. We conducted two experiments with Japanese tree frogs (Hyla japonica), one indoors and the other in the field. The indoor experiment demonstrates that our method correctly visualizes the frogs' calling behavior, reproducing the known pattern in which two frogs call either synchronously or in anti-phase. The field experiment, conducted in a rice paddy where Japanese tree frogs live, likewise visualizes this calling behavior and confirms anti-phase synchronization under natural conditions. These results confirm that our method can visualize the calling behavior of nocturnal animals in their natural habitat.
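The abstract does not detail how the recorded video is turned into call timings, but the bibliography's inclusion of Otsu's thresholding method (Otsu 1979) suggests one plausible pipeline: threshold each Firefly's per-frame LED brightness into on/off states, extract call onsets from the rising edges, and compare the onset phases of neighboring frogs. The following Python sketch is a hypothetical illustration of that pipeline, not the authors' implementation; it assumes the per-frame brightness traces have already been extracted from the video, and the simulated signals and all function names are illustrative.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: choose the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)            # cumulative weight of the "LED off" class
    mu = np.cumsum(p * centers)  # cumulative first moment
    mu_t = mu[-1]
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    sigma_b = np.zeros_like(w0)
    sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(sigma_b)]

def call_onsets(brightness, fps):
    """Times (s) at which an LED switches from off to on."""
    on = brightness > otsu_threshold(brightness)
    rising = np.flatnonzero(~on[:-1] & on[1:]) + 1
    return rising / fps

def relative_phase(t_ref, t_other, period):
    """Circular mean phase (radians) of one frog's call onsets within the
    other's calling cycle; values near +/-pi indicate anti-phase calling."""
    phases = []
    for tb in t_other:
        prev = t_ref[t_ref <= tb]            # most recent reference onset
        if prev.size:
            phases.append(2 * np.pi * ((tb - prev[-1]) % period) / period)
    return np.angle(np.mean(np.exp(1j * np.array(phases))))

# Simulated example: two Fireflies blinking in anti-phase (period 1 s, 30 fps).
fps, period = 30, 1.0
t = np.arange(0, 10.0, 1.0 / fps)
led_a = np.where((t % period) < 0.2, 0.9, 0.1)          # frog A's LED brightness
led_b = np.where(((t - 0.5) % period) < 0.2, 0.9, 0.1)  # frog B, half a cycle later
phi = relative_phase(call_onsets(led_a, fps), call_onsets(led_b, fps), period)
print(f"mean relative phase: {phi:.2f} rad")  # near +/-pi for anti-phase calling
```

In a real recording, the brightness traces would come from tracking each LED's pixel region across video frames, and the call period would itself be estimated from the inter-onset intervals rather than assumed.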


Keywords: Sound imaging · Visualization · Acoustic communication · Nocturnal animal · Measurement method



This work was supported by the JSPS Grant-in-Aid for Exploratory Research (No. 21650043), the JSPS Grant-in-Aid for Scientific Research (S) (No. 19100003), the JSPS Grant-in-Aid for JSPS Fellows (No. 08J00608), the JSPS FIRST Program, and Honda Research Institute Japan, Co. Ltd. We would like to thank T. Kobayashi and H. Kitahata for their suggestions for analyzing the recorded emission data, H. Riquimaroux, P. M. Narins, K. Okanoya and A. Yamaguchi for their helpful advice, and A. Lim and L. K. Cahier for suggestions to improve the English of this manuscript. We performed all experiments in accordance with the guidelines of the Animal Research Committee of Kyoto University.

Supplementary material

359_2011_652_MOESM1_ESM.pdf (PDF, 152 kb)


  1. Aihara I (2009) Modeling synchronized calling behavior of Japanese tree frogs. Phys Rev E 80:011918. doi:10.1103/PhysRevE.80.011918
  2. Aihara I, Takeda R, Mizumoto T, Otsuka T, Takahashi T, Okuno HG, Aihara K (2011) Complex and transitive synchronization in a frustrated system of calling frogs. Phys Rev E 83(3):031913. doi:10.1103/PhysRevE.83.031913
  3. Asano F, Asoh H, Matsui T (1999) Sound source localization and signal separation for office robot "JiJo-2". In: Proceedings of Multisensor Fusion and Integration for Intelligent Systems, pp 243–248. doi:10.1109/MFI.1999.815997
  4. Clark CW, Ellison WT (2000) Calibration and comparison of the acoustic location methods used during the spring migration of the bowhead whale, Balaena mysticetus, off Pt. Barrow, Alaska, 1984–1993. J Acoust Soc Am 107(6):3509–3517. doi:10.1121/1.429421
  5. Farid H (2001) Blind inverse gamma correction. IEEE Trans Image Process 10(10):1428–1433. doi:10.1109/83.951529
  6. Feng A, Narins PM, Xu CH, Lin WY, Yu ZL, Qiu Q, Xu ZM, Shen JX (2006) Ultrasonic communication in frogs. Nature 440:333–336. doi:10.1038/nature04416
  7. Gerhardt H, Huber F (2002) Acoustic communication in insects and anurans. The University of Chicago Press, Chicago
  8. Grafe TU (1997) Costs and benefits of mate choice in the lek-breeding reed frog, Hyperolius marmoratus. Anim Behav 53:1103–1117. doi:10.1006/anbe.1996.0427
  9. Hedwig B, Poulet J (2004) Complex auditory behavior emerges from simple reactive steering. Nature 430:781–785. doi:10.1038/nature02787
  10. Hyvarinen A, Karhunen J, Oja E (2001) Independent component analysis. Wiley-Interscience, New York
  11. ITU-R (2007) Recommendation ITU-R BT.601-6: Studio encoding parameters of digital television for standard 4:3 and wide screen 16:9 aspect ratios. International Telecommunication Union Radiocommunication Sector, Geneva
  12. Jones DL, Ratnam R (2009) Blind location and separation of callers in a natural chorus using a microphone array. J Acoust Soc Am 126(2):895–910. doi:10.1121/1.3158924
  13. Liu C (2005) Foundations of MEMS. Prentice Hall, New Jersey
  14. MacCurdy R, Fristrup K (2009) Automatic animal tracking using matched filters and time difference of arrival. J Commun 4(7):487–495. doi:10.4304/jcm.4.7.487-495
  15. Maeda N, Matsui M (1999) Frogs and toads of Japan. Bun-ichi Sogo Shuppan Co. Ltd., Tokyo, pp 36–39
  16. Matsui M (1996) Natural history of the amphibia. University of Tokyo Press, Tokyo, pp 150–152
  17. Nakadai K, Okuno HG, Nakajima H, Hasegawa Y, Tsujino H (2009) Design and implementation of robot audition system "HARK". Adv Robotics 24:739–761. doi:10.1163/016918610X493561
  18. Nakajima H, Nakadai K, Hasegawa Y, Tsujino H (2010) Blind source separation with parameter-free adaptive step-size method for robot audition. IEEE Trans Audio Speech Lang Process 18(6):1476–1484. doi:10.1109/TASL.2009.2035219
  19. Narins PM, Capranica RR (1978) Communicative significance of the two-note call of the treefrog Eleutherodactylus coqui. J Comp Physiol A 127:1–9. doi:10.1007/BF00611921
  20. Otsu N (1979) A threshold selection method from gray-level histograms. IEEE Trans Syst Man Cybern SMC-9(1):62–66. doi:10.1109/TSMC.1979.4310076
  21. Riquimaroux H, Gaioni SJ, Suga N (1991) Cortical computational maps control auditory perception. Science 251:565–568. doi:10.1126/science.1990432
  22. Sawada H, Mukai R, Araki S (2003) Polar coordinate based nonlinear function for frequency-domain blind source separation. IEICE Trans Fundam Electron Commun Comput Sci 86(3):590–596
  23. Schwartz JJ (2001) Call monitoring and interactive playback systems in the study of acoustic interactions among male anurans. Smithsonian Institution Press, Washington, DC, pp 183–204
  24. Simmons AM (2004) Call recognition in the bullfrog, Rana catesbeiana: generalization along the duration continuum. J Acoust Soc Am 115(3):1345–1355. doi:10.1121/1.1643366
  25. Simmons AM, Simmons JA, Bates ME (2008) Analyzing acoustic interactions in natural bullfrog (Rana catesbeiana) choruses. J Comp Psychol 122(3):274–282. doi:10.1037/0735-7036.122.3.274
  26. Spiesberger JL (1999) Locating animals from their sounds and tomography of the atmosphere: experimental demonstration. J Acoust Soc Am 106(2):837–846. doi:10.1121/1.427100
  27. Spiesberger JL, Fristrup KM (1990) Passive localization of calling animals and sensing of their acoustic environment using acoustic tomography. Am Natur 135(1):107–153. doi:10.1086/285035
  28. Suggs DN, Simmons AM (2005) Information theory analysis of patterns of modulation in the advertisement call of the male bullfrog, Rana catesbeiana. J Acoust Soc Am 117(4):2330–2337. doi:10.1121/1.1863693
  29. Wells K (2007) The ecology and behavior of amphibians. The University of Chicago Press, Chicago

Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  • Takeshi Mizumoto (1, corresponding author)
  • Ikkyu Aihara (2)
  • Takuma Otsuka (1)
  • Ryu Takeda (1)
  • Kazuyuki Aihara (3)
  • Hiroshi G. Okuno (1)

  1. Graduate School of Informatics, Kyoto University, Kyoto, Japan
  2. Department of Physics, Graduate School of Sciences, Kyoto University, Kyoto, Japan
  3. Institute of Industrial Science, University of Tokyo, Tokyo, Japan
