Auditory Emoticons: Iterative Design and Acoustic Characteristics of Emotional Auditory Icons and Earcons

  • Jason Sterkenburg
  • Myounghoon Jeon
  • Christopher Plummer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8511)

Abstract

Recent decades have seen growing interest in sonification research, and two commonly used sonification techniques, auditory icons and earcons, have been studied extensively. Nevertheless, relatively little research has investigated the relationship between these techniques and emotion and affect. Moreover, despite their popularity, auditory icons and earcons are usually treated separately and are rarely compared directly in studies. The current paper describes iterative design procedures for creating emotional auditory icons and earcons. The ultimate goal of the study is to compare auditory icons and earcons in their ability to represent emotional states. The results show strong user preferences both within and between sonification categories. Implications and extensions of this work are discussed.
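For readers unfamiliar with the distinction, an earcon is an abstract, musical tone pattern, whereas an auditory icon borrows an everyday sound as a metaphor. The minimal sketch below, which is illustrative and not drawn from this paper's stimuli, synthesizes a short arpeggiated-triad earcon and maps major versus minor mode to positive versus negative valence, a convention from prior affective-earcon work. All function names and parameter values are assumptions for the example.

```python
# Illustrative sketch (not the paper's method): a minimal earcon generator.
# Synthesizes a three-note arpeggiated triad; a major third is used as a
# rough proxy for positive valence, a minor third for negative valence.
import wave

import numpy as np

SAMPLE_RATE = 44100  # samples per second


def tone(freq_hz: float, dur_s: float = 0.25) -> np.ndarray:
    """Sine tone with short linear fades to avoid clicks."""
    t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
    signal = np.sin(2.0 * np.pi * freq_hz * t)
    fade = min(len(t) // 10, 500)
    envelope = np.ones_like(signal)
    envelope[:fade] = np.linspace(0.0, 1.0, fade)
    envelope[-fade:] = np.linspace(1.0, 0.0, fade)
    return signal * envelope


def earcon(valence: str, root_hz: float = 261.63) -> np.ndarray:
    """Arpeggiated triad: 4 semitones (major third) for positive valence,
    3 semitones (minor third) for negative valence, plus a perfect fifth."""
    third = 4 if valence == "positive" else 3
    semitones = [0, third, 7]  # root, third, fifth
    notes = [tone(root_hz * 2 ** (s / 12)) for s in semitones]
    return np.concatenate(notes)


def write_wav(path: str, samples: np.ndarray) -> None:
    """Save mono 16-bit PCM audio."""
    pcm = (np.clip(samples, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(pcm.tobytes())


if __name__ == "__main__":
    write_wav("earcon_positive.wav", earcon("positive"))
    write_wav("earcon_negative.wav", earcon("negative"))
```

Running the script writes earcon_positive.wav and earcon_negative.wav; actual experimental stimuli would be refined through iterative listening tests such as those the paper describes.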

Keywords

auditory icons · earcons · auditory emoticons · non-speech sounds · sonification


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Jason Sterkenburg (1)
  • Myounghoon Jeon (1)
  • Christopher Plummer (2)
  1. Cognitive & Learning Sciences, Michigan Technological University, Houghton, USA
  2. Visual and Performing Arts, Michigan Technological University, Houghton, USA
