
Feeling what you hear: task-irrelevant sounds modulate tactile perception delivered via a touch screen

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

Several recent studies of crossmodal perception have demonstrated that the presentation of task-irrelevant auditory stimuli can modulate the number of tactile stimuli that a person perceives. In the present study, we attempted to extend these findings concerning audiotactile interactions in human information processing to a touch-screen device. Two experiments were conducted to address the following research questions: (1) Can the presentation of task-irrelevant sounds modify the perceived number of tactile pulses delivered via a touch-screen device? (2) Do task-irrelevant auditory stimuli have a more pronounced effect on the tactile perception of numerosity when the task becomes more attentionally demanding (i.e., under dual-task conditions)? The results of both experiments demonstrate that the presentation of task-irrelevant sounds can modulate the number of vibrotactile targets that a participant perceives. What is more, the task-irrelevant sounds had a larger effect on tactile perception when the participants had to perform a secondary attention-demanding task at the same time.
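The bias described above is often modeled as reliability-weighted cue combination, in which the count carried by the irrelevant auditory stream pulls the perceived tactile count toward it, and attentional load reduces the weight given to the tactile stream. The following is a minimal illustrative sketch of that idea only; the function name, the linear weighting, and the specific weight values are assumptions for illustration and are not taken from the paper.

```python
def perceived_tap_count(n_taps, n_beeps, w_audio=0.3):
    """Illustrative model: perceived tap count as a weighted average of
    the actual tactile count and the task-irrelevant auditory count.

    w_audio is the (hypothetical) influence of the auditory stream;
    0 means the sounds are fully ignored.
    """
    if n_beeps is None:  # no sounds presented on this trial
        return float(n_taps)
    return (1 - w_audio) * n_taps + w_audio * n_beeps

# Dual-task load is modeled here as a larger auditory weight,
# so the same 2-tap, 3-beep trial is biased further from 2.
single_task = perceived_tap_count(2, 3, w_audio=0.3)
dual_task = perceived_tap_count(2, 3, w_audio=0.5)
```

Under this toy model, more beeps than taps inflate the perceived count, and the inflation grows when attention is divided, which mirrors the qualitative pattern the abstract reports.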



Author information

Correspondence to Ju-Hwan Lee.


Cite this article

Lee, JH., Spence, C. Feeling what you hear: task-irrelevant sounds modulate tactile perception delivered via a touch screen. J Multimodal User Interfaces 2, 145 (2008). https://doi.org/10.1007/s12193-009-0014-8
