Combining audio and visual displays to highlight temporal and spatial seismic patterns

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

Data visualization, and to a lesser extent data sonification, are classic tools for the scientific community. However, these two approaches are very rarely combined, although they are highly complementary: our visual system is good at recognizing spatial patterns, whereas our auditory system is better tuned to temporal patterns. In this article, data representation methods are proposed that combine visualization, sonification, and spatial audio techniques, in order to optimize the user’s perception of spatial and temporal patterns in a single display, to increase the feeling of immersion, and to take advantage of multimodal integration mechanisms. Three seismic data sets are used to illustrate the methods, covering different physical phenomena, time scales, spatial distributions, and spatio-temporal dynamics. The methods are adapted to the specificities of each data set and to the amount of information that the designer wants to display. This leads to further developments, namely the use of audification with two time scales, the switch from pure audification to time-modulated noise, and the switch from pure audification to sonic icons. First user feedback from live demonstrations indicates that the methods presented in this article seem to enhance the perception of spatio-temporal patterns, which is key to the understanding of seismically active systems and a step towards apprehending the processes that drive this activity.

Data availability

All datasets used in this study are available on the web at the following addresses (see also the manuscript):

- https://www.usgs.gov/volcanoes/kilauea/monitoring

- https://www.usgs.gov/observatories/hawaiian-volcano-observatory

- https://pnsn.org/seismograms

- https://www.pnsn.org/tremor

- http://www.fdsn.org/networks/detail/Y4_2014/

Notes

  1. This process, known as audification, is perhaps the first sonification technique that was used: in 1878, the early technology of the telephone was used to listen to nervous impulses in muscles [5], cited by Dombois [6]; in 1924, bat scream recordings were slowed down and listened to [7]. Electromagnetic waves, on the contrary, have frequencies that are too high to be audible: they need to be slowed down until they reach our hearing range (a code sketch of audification is given after these notes).

  2. Wave Field Synthesis.

  3. Vector-Based Amplitude Panning.

  4. Distance-Based Amplitude Panning.

  5. Head-Related Transfer Functions.

  6. Description available at http://floriandombois.net/works/circum-pacific.html. See also Dombois’s “Surf” (2006) at http://floriandombois.net/works/acceleration.html and “Acceleration 2,200” (2003) at http://floriandombois.net/works/acceleration.html; all URLs last retrieved June 16, 2021.

  7. Part of this work was presented as a demo (no paper issued) at the Computer Music Multidisciplinary Research (CMMR) Symposium in Marseille, France, in October 2019.

  8. Available at https://volcanoes.usgs.gov/volcanoes/kilauea/monitoring_kilauea.html.

  9. Information available at https://volcanoes.usgs.gov/observatories/hvo/.

  10. Information at https://pnsn.org/seismograms.

  11. Freely available through an interactive web interface: https://www.pnsn.org/tremor.

  12. Details available at http://www.fdsn.org/networks/detail/Y4_2014/.

  13. Ardour is an open source digital audio workstation available at https://ardour.org/.

  14. See http://www.seismicsoundlab.org/?page_id=157.

  15. Words uttered by the listeners are in italic font.
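
For readers unfamiliar with audification (note 1), the following minimal sketch, not taken from the authors' implementation, shows the basic operation: the data samples are used directly as audio samples and only the playback rate is changed, so that the signal's frequencies are shifted into the hearing range. The sampling rates and the synthetic test signal are assumptions chosen for illustration.

```python
# Illustrative audification sketch (not the authors' code).
# A seismogram sampled at 100 Hz is written out as audio at 44.1 kHz,
# i.e. sped up by a factor of 441, so a 1 Hz tremor is heard at 441 Hz.
import numpy as np
from scipy.io import wavfile

FS_SEIS = 100      # assumed seismometer sampling rate (Hz)
FS_AUDIO = 44100   # audio playback rate (Hz); speed-up factor = FS_AUDIO / FS_SEIS

# Synthetic stand-in for a seismogram: a 1 Hz "tremor" plus noise, 10 minutes long.
t = np.arange(0, 600, 1 / FS_SEIS)
seis = np.sin(2 * np.pi * 1.0 * t) + 0.3 * np.random.randn(t.size)

# Audification proper: keep the samples unchanged, reinterpret the sampling rate.
audio = (seis / np.max(np.abs(seis))).astype(np.float32)  # normalize to [-1, 1]
wavfile.write("audified.wav", FS_AUDIO, audio)
```

The same operation with a playback rate lower than the recording rate corresponds to the slowing down mentioned in note 1 for bat screams and electromagnetic waves.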

References

  1. Kramer G, Walker B, Bonebright T, Cook P, Flowers JH, Miner N, Neuhoff J (1999) Sonification report: Status of the field and research agenda. Tech. rep., National Science Foundation

  2. Latour B (1986) Visualization and cognition: drawing things together. Knowl Soc 6(6):1

  3. Kramer G (1994) Auditory display: sonification, audification, and auditory interfaces. Avalon Publishing, New York

  4. Hermann T, Hunt A, Neuhoff JG (2011) The sonification handbook. Logos Verlag, Berlin

  5. Bernstein J, Schönlein C (1881) Telephonische Wahrnehmung der Schwankungen des Muskelstromes bei der Contraction. Sitzungsberichte der Naturforschenden Gesellschaft zu Halle, pp 18–27

  6. Dombois F (2008) The muscle telephone: the undiscovered start of audification in the 1870s. In: Kursell J (ed) Sounds of science–Schall im Labor. Max Planck Institute for the History of Science, Berlin, pp 41–45

  7. Dombois F, Eckel G (2011) Audification. In: Hermann T, Hunt A, Neuhoff JG (eds) The sonification handbook, chap. 12. Logos Verlag, Berlin, pp 301–324

  8. Speeth SD (1961) Seismometer sounds. J Acoust Soc Am 33(7):909. https://doi.org/10.1121/1.1908843

  9. Frantti GE, Levereault LA (1965) Auditory discrimination of seismic signals from earthquakes and explosions. Bull Seismol Soc Am 55(1):1

  10. Volmar A (2013) Listening to the cold war: the nuclear test ban negotiations, seismology, and psychoacoustics 1958–1963. Osiris 28(1):80. https://doi.org/10.1086/671364

  11. Hayward C (1994) Listening to the Earth sing. In: Kramer G (ed) Auditory display: sonification, audification, and auditory interfaces, chap. 15. Addison-Wesley, Reading, pp 369–404

  12. Dombois F (2001) Using audification in planetary seismology. In: Proceedings of the international conference on auditory display (ICAD)

  13. Dombois F (2002) Auditory seismology—on free oscillations, focal mechanisms, explosions and synthetic seismograms. In: Proceedings of the international conference on auditory display (ICAD)

  14. Kilb DL, Peng Z, Simpson D, Michael AJ, Fisher M, Rohrlick D (2012) Listen, watch, learn: SeisSound video products. Seismol Res Lett 83(2):281. https://doi.org/10.1785/gssrl.83.2.281

  15. Holtzman B, Candler J, Turk M, Peter D (2014) Seismic sound lab: sights, sounds and perception of the earth as an acoustic space. In: Aramaki M, Derrien O, Kronland-Martinet R, Ystad S (eds) Sound Music Motion. Springer International Publishing, New York, pp 161–174. https://doi.org/10.1007/978-3-319-12976-1_10

  16. Paté A, Boschi L, Le Carrou JL, Holtzman B (2016) Categorization of seismic sources by auditory display. Int J Human Comput Stud 85:57. https://doi.org/10.1016/j.ijhcs.2015.08.002

  17. Paté A, Boschi L, Dubois D, Le Carrou JL, Holtzman B (2017) Auditory display of seismic data: on the use of expert’s categorizations and verbal descriptions as heuristics for geoscience. J Acoust Soc Am. https://doi.org/10.1121/1.4978441

  18. Boschi L, Delcor L, Le Carrou JL, Fritz C, Paté A, Holtzman B (2017) On the perception of audified seismograms. Seismol Res Lett 88(5):1279. https://doi.org/10.1785/0220170077

  19. Gaver WW (1989) The SonicFinder: an interface that uses auditory icons. Human-Comput Interact 4:67. https://doi.org/10.1207/s15327051hci0401_3

  20. Neuhoff JG (2011) Perception, cognition and action in auditory displays. In: Hermann T, Hunt A, Neuhoff JG (eds) The sonification handbook, chap. 4. Logos Verlag, Berlin, pp 63–86

  21. Walker A, Brewster S, McGookin D, Ng A (2001) Diary in the sky: a spatial audio display for a mobile calendar. In: Proceedings of the 5th annual conference of the British HCI Group (Lille, France), pp 531–539. https://doi.org/10.1007/978-1-4471-0353-0_33

  22. Howard IP, Templeton WB (1966) Human spatial orientation. Wiley, New York

  23. Deutsch D (2013) Grouping mechanisms in music. In: Deutsch D (ed) The psychology of music, 3rd edn, chap. 6. Elsevier, Amsterdam. https://doi.org/10.1016/C2009-0-62532-0

  24. Bregman AS (1994) Auditory scene analysis: the perceptual organization of sound. MIT Press, Cambridge

  25. Féron F, Frissen I, Boissinot J, Guastavino C (2010) Upper limits of auditory rotational motion perception. J Acoust Soc Am 128(6):3703. https://doi.org/10.1121/1.3502456

  26. Blauert J (1983) Spatial hearing. MIT Press, Cambridge

  27. Middlebrooks JC, Green DM (1991) Sound localization by human listeners. Annu Rev Psychol 42(1):135. https://doi.org/10.1146/annurev.ps.42.020191.001031

  28. Dunai L, Lengua I, Peris-Fajarnés G, Brusola F (2015) Virtual sound localization by blind people. Arch Acoust 40(4):561. https://doi.org/10.1515/aoa-2015-0055

  29. Guillaume A, Rivenez M, Andéol G, Pellieux L (2007) Perception of urgency and spatialization of auditory alarms. In: Proceedings of the 13th international conference on auditory display (ICAD), Montréal, Canada

  30. Parente P, Bishop G (2003) BATS: The Blind Audio Tactile Mapping System. In: Proceedings of ACM south eastern conference

  31. Loeliger E, Stockman T (2012) Wayfinding without visual cues: evaluation of an interactive audio map system. Interact Comput 26(5):403. https://doi.org/10.1093/iwc/iwt042

  32. Brungart DS, Simpson BD (2008) Design, validation, and in-flight evaluation of an auditory attitude indicator based on pilot-selected music. In: Proceedings of the 14th international conference on auditory display (ICAD), Paris, France

  33. Kiefer P (ed) (2010) Klangräume der Kunst. Kehrer Verlag, Berlin

  34. Brown LM, Brewster SA, Ramloll R, Burton M, Riedel B (2003) Design guidelines for audio presentation of graphs and tables. In: Proceedings of the international conference on auditory display (ICAD), Boston, MA

  35. Roginska A, Childs E, Johnson MK (2006) Monitoring real-time data: a sonification approach. In: Proceedings of the 12th international conference on auditory display (ICAD), London, UK

  36. Bonebright TL, Nees MA, Connerley TT, McCain GR (2001) Testing the effectiveness of sonified graphs for education: a programmatic research project. In: Proceedings of the international conference on auditory display (ICAD), Espoo, Finland

  37. McGookin D, Brewster S (2011) Earcons. In: Hermann T, Hunt A, Neuhoff JG (eds) The sonification handbook, chap. 14. Logos Verlag, Berlin, pp 339–362

  38. Holtzman B, Candler J, Repetto D, Pratt M, Paté A, Turk M, Gualtieri L, Peter D, Trakinski V, Ebel D, Gossmann J, Lem N (2017) SeismoDome: sonic and visual representation of earthquakes and seismic waves in the planetarium. In: AGU fall meeting, New Orleans, LA

  39. Barth A, Karlstrom L, Holtzman B, Niyak A, Paté A (2020) Sonification and animation of multivariate data illuminates geyser eruption dynamics. Comput Music J (submitted)

  40. Peng Z, Aiken C, Kilb D, Shelly DR, Enescu B (2012) Listening to the 2011 Magnitude 9.0 Tohoku-Oki, Japan, earthquake. Seismol Res Lett 83(2):287. https://doi.org/10.1785/gssrl.83.2.287

  41. Dombois F, Brodwolf O, Friedli O, Rennert I, Koenig T (2008) Sonifyer—a concept, a software, a platform. In: Proceedings of the international conference on auditory display (ICAD)

  42. Spence C (2007) Audiovisual multisensory integration. Acoust Sci Technol 28(2):61. https://doi.org/10.1250/ast.28.61

  43. Hendrix C, Barfield W (1996) The sense of presence within auditory virtual environments. Presence 5(3):290. https://doi.org/10.1162/pres.1996.5.3.290

  44. Viaud-Delmon I, Warusfel O, Seguelas A, Rio E, Jouvent R (2006) High sensitivity to multisensory conflicts in agoraphobia exhibited by virtual reality. Eur Psychiat 21(7):501. https://doi.org/10.1016/j.eurpsy.2004.10.004

  45. Chouet BA, Matoza RS (2013) A multi-decadal view of seismic methods for detecting precursors of magma movement and eruption. J Volcanol Geoth Res 252:108. https://doi.org/10.1016/j.jvolgeores.2012.11.013

  46. Liang C, Crozier J, Karlstrom L, Dunham EM (2020) Magma oscillations in a conduit-reservoir system, application to very long period (VLP) seismicity at basaltic volcanoes: 2. Data inversion and interpretation at Kīlauea Volcano. J Geophys Res Solid Earth. https://doi.org/10.1029/2019JB017456

  47. Wu SM, Lin FC, Farrell J, Shiro B, Karlstrom L, Okubo P, Koper K (2020) Spatiotemporal seismic structure variations associated with the 2018 Kīlauea eruption based on temporary dense geophone arrays. Geophys Res Lett. https://doi.org/10.1029/2019GL086668

  48. White RS, Drew J, Martens HR, Key J, Soosalu H, Jakobsdóttir SS (2011) Dynamics of dyke intrusion in the mid-crust of Iceland. Earth Planet Sci Lett 304(3–4):300. https://doi.org/10.1016/j.epsl.2011.02.038

  49. Woods J, Winder T, White RS, Brandsdóttir B (2019) Evolution of a lateral dike intrusion revealed by relatively-relocated dike-induced earthquakes: The 2014–15 Bárðarbunga-Holuhraun rifting event, Iceland. Earth Planet Sci Lett 506:53. https://doi.org/10.1016/j.epsl.2018.10.032

  50. Rogers G, Dragert H (2003) Episodic tremor and slip on the Cascadia subduction zone: the chatter of silent slip. Science 300(5627):1942. https://doi.org/10.1126/science.1084783

  51. Frank WB, Shapiro NM, Husker AL, Kostoglodov V, Bhat HS, Campillo M (2015) Along-fault pore-pressure evolution during a slow-slip event in Guerrero, Mexico. Earth Planet Sci Lett 413:135. https://doi.org/10.1016/j.epsl.2014.12.051

  52. Wech AG, Creager KC (2008) Automated detection and location of Cascadia tremor. Geophys Res Lett. https://doi.org/10.1029/2008GL035458

  53. Roessler D, Passarelli L, Govoni A, Bautz R, Dahm T, Maccaferri F, Rivalta E, Schierjott J, Woith H (2014) Extended Pollino Seismic Experiment, 2014-2015, GFZ Potsdam (FEFI, Pompei, NERA projects). GFZ Data Services. https://doi.org/10.14470/L9180569

  54. Matsubara M, Morimoto Y, Uchide T (2016) Collaborative study of interactive seismic array sonification for data exploration and public outreach activities. In: Proceedings of ISon 2016, 5th interactive sonification workshop, Bielefeld, Germany

  55. McGee R, Rogers D (2016) Musification of seismic data. In: Proceedings of the international conference on auditory display (ICAD), Canberra, Australia

  56. Groß-Vogt K, Frank M, Höldrich R (2019) Focused Audification and the optimization of its parameters. J Multimodal User Interf. https://doi.org/10.1007/s12193-019-00317-8

  57. Hirschfelder JO, Curtiss CF, Bird RB (1964) Molecular theory of gases and liquids. Wiley, New York

  58. Lossius T, Baltazar P, de la Hogue T (2009) DBAP–distance-based amplitude panning. In: Proceedings of the international computer music conference (ICMC)

  59. Dubus G, Bresin R (2013) A systematic review of mapping strategies for the sonification of physical quantities. PLoS ONE 8(12):e82491. https://doi.org/10.1371/journal.pone.0082491

Acknowledgements

The authors would like to thank everyone who allowed them to make the audio-visual representations public: thanks to the CMMR organization team and attendees, and thanks to the organizers and audiences of the Open House days at LDEO/Columbia University, Junia, and Université Catholique de Lille, for giving helpful feedback.

Thanks also to Josh Crozier from the Department of Earth Sciences at the University of Oregon for generating the Kilauea waveform dataset from the catalog.

Funding

No funding was received to assist with the preparation of this manuscript.

Author information

Corresponding author

Correspondence to Arthur Paté.

Ethics declarations

Conflict of interest

The authors have no conflicts of interest to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Paté, A., Farge, G., Holtzman, B.K. et al. Combining audio and visual displays to highlight temporal and spatial seismic patterns. J Multimodal User Interfaces 16, 125–142 (2022). https://doi.org/10.1007/s12193-021-00378-8
