Evaluating Somatosensory Interactions: Designing a Handheld Tactile Acoustic Device for Mobile Phones

  • Maria Karam
  • Patrick M. Langdon
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9738)


We present preliminary development efforts to create a tactile acoustic device (TAD) for the hands. The Model Human Cochlea (MHC) is a method for conveying sound to the body, originally implemented in a chair form factor and developed as a sensory substitution system to give deaf and hard of hearing people some access to the sounds of movies and music. We present initial design research on adapting the MHC from a chair to a mobile handheld tactile device, with the aim of improving mobile-phone speech comprehension in noisy environments. Scaling the MHC from the back, which has the least sensitive skin on the body, to the highly sensitive skin of the hands requires an understanding of the physiology, psychology, electronics, and software relevant to this kind of sensation. This research addresses factors critical to extending the design of tactile acoustic devices for somatosensory interaction to different areas of the body.
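The core idea behind the MHC is to split an audio signal into frequency bands, each routed to a separate vibrotactile actuator on the skin. A minimal sketch of that band-splitting step is shown below; the channel count, band edges, and function names are illustrative assumptions for this abstract, not the authors' published parameters.

```python
# Hypothetical sketch of the MHC-style audio-to-tactile mapping:
# band-pass the input signal into one stream per actuator channel.
# Band edges and channel count are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfilt

FS = 8000  # sample rate in Hz (assumed)

def split_into_tactile_channels(audio, band_edges, fs=FS):
    """Return one band-limited stream per actuator channel."""
    channels = []
    for lo, hi in zip(band_edges[:-1], band_edges[1:]):
        # 4th-order Butterworth band-pass in second-order sections
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        channels.append(sosfilt(sos, audio))
    return channels

# One second of a two-tone test signal (220 Hz + 1200 Hz).
t = np.arange(FS) / FS
audio = np.sin(2 * np.pi * 220 * t) + np.sin(2 * np.pi * 1200 * t)

# Four illustrative bands covering a roughly vibrotactile-relevant range.
edges = [50, 300, 700, 1500, 3000]
chans = split_into_tactile_channels(audio, edges)
print(len(chans))  # four actuator streams
```

In a chair-sized MHC each stream would drive a voice coil at a different body location; the handheld design question raised above is how such channels should be re-mapped onto the much smaller, more sensitive surface of the hand.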


Keywords: Interactive displays · Somatosensory systems · Crossmodal systems · Mobile phone interaction · Multimodal HCI systems · Sensory substitution · Tactile acoustic devices



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. King's College London, London, UK
  2. Department of Engineering, University of Cambridge, Cambridge, UK
