
Auditory Emoticons: Iterative Design and Acoustic Characteristics of Emotional Auditory Icons and Earcons

  • Conference paper

Part of the Lecture Notes in Computer Science book series (LNISA, volume 8511)

Abstract

In recent decades, interest in sonification research has grown steadily. Two widely used sonification techniques, auditory icons and earcons, have each received substantial study. Despite this, relatively little research has investigated the relationship between these techniques and emotion and affect. Moreover, despite their popularity, auditory icons and earcons are typically treated separately and are rarely compared directly within a single study. The current paper describes iterative design procedures for creating emotional auditory icons and earcons. The ultimate goal of the study is to compare auditory icons and earcons in their ability to represent emotional states. The results show strong user preferences both within and between sonification categories. Implications and extensions of this work are discussed.

Keywords

  • auditory icons
  • earcons
  • auditory emoticons
  • non-speech sounds
  • sonification



Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Sterkenburg, J., Jeon, M., Plummer, C. (2014). Auditory Emoticons: Iterative Design and Acoustic Characteristics of Emotional Auditory Icons and Earcons. In: Kurosu, M. (eds) Human-Computer Interaction. Advanced Interaction Modalities and Techniques. HCI 2014. Lecture Notes in Computer Science, vol 8511. Springer, Cham. https://doi.org/10.1007/978-3-319-07230-2_60


  • DOI: https://doi.org/10.1007/978-3-319-07230-2_60

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07229-6

  • Online ISBN: 978-3-319-07230-2

  • eBook Packages: Computer Science (R0)