Sound imaging of nocturnal animal calls in their natural habitat
We present a novel method for imaging acoustic communication between nocturnal animals. Investigating the spatio-temporal calling behavior of nocturnal animals, e.g., frogs and crickets, has been difficult because of the need to distinguish the calls of many individuals in noisy environments without being able to see them. Our method visualizes these spatial and temporal dynamics using dozens of sound-to-light conversion devices (called “Fireflies”) and an off-the-shelf video camera. Each Firefly consists of a microphone and a light-emitting diode, and emits light when it captures nearby sound. By deploying dozens of Fireflies in a target area, we record the calls of multiple individuals with the video camera. We conducted two experiments, one indoors and one in the field, using Japanese tree frogs (Hyla japonica). The indoor experiment demonstrates that our method correctly visualizes the calling behavior of Japanese tree frogs, confirming the known behavior that two frogs call either in synchrony or in anti-phase. The field experiment, conducted in a rice paddy where Japanese tree frogs live, visualizes the same calling behavior and confirms anti-phase synchronization in the field. These results confirm that our method can visualize the calling behavior of nocturnal animals in their natural habitat.
Keywords: Sound imaging, Visualization, Acoustic communication, Nocturnal animal, Measurement method
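The paper does not specify its analysis procedure here, but the in-phase vs. anti-phase distinction it reports can be quantified from call-onset timestamps (e.g., extracted from the Fireflies' light emissions in the video). The sketch below is an illustrative assumption, not the authors' method: it measures each of frog B's calls as a phase within frog A's concurrent call cycle, then takes the circular mean. A mean near 0 indicates synchronous calling; a mean near π indicates anti-phase synchronization.

```python
import math

def relative_phases(calls_a, calls_b):
    """Phase (radians, 0..2*pi) of each B call within A's enclosing call cycle."""
    phases = []
    for t in calls_b:
        # Locate the interval between consecutive A calls that contains t.
        for t0, t1 in zip(calls_a, calls_a[1:]):
            if t0 <= t < t1:
                phases.append(2 * math.pi * (t - t0) / (t1 - t0))
                break
    return phases

def mean_phase(phases):
    """Circular mean of phases: ~0 means in-phase, ~pi means anti-phase."""
    x = sum(math.cos(p) for p in phases)
    y = sum(math.sin(p) for p in phases)
    return math.atan2(y, x) % (2 * math.pi)

# Toy data: B calls exactly halfway between A's calls, i.e., anti-phase.
a = [0.0, 1.0, 2.0, 3.0, 4.0]
b = [0.5, 1.5, 2.5, 3.5]
print(mean_phase(relative_phases(a, b)))  # ≈ pi (anti-phase)
```

The circular mean is used instead of an ordinary average because phase wraps around: phases of 0.1 and 2π − 0.1 should average to 0, not π.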
This work was supported by the JSPS Grant-in-Aid for Exploratory Research (No. 21650043), the JSPS Grant-in-Aid for Scientific Research (S) (No. 19100003), the JSPS Grant-in-Aid for JSPS Fellows (No. 08J00608), the JSPS FIRST Program, and Honda Research Institute Japan Co., Ltd. We would like to thank T. Kobayashi and H. Kitahata for their suggestions for analyzing the recorded emission data, H. Riquimaroux, P. M. Narins, K. Okanoya, and A. Yamaguchi for their helpful advice, and A. Lim and L. K. Cahier for suggestions to improve the English of this manuscript. We performed all experiments in accordance with the guidelines of the Animal Research Committee of Kyoto University.