Influence of the Auditory Localization Direction on the Haptic Estimation of Virtual Length

  • Maik Stamm
  • M. Ercan Altinsoy
  • Sebastian Merchel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6851)

Abstract

Haptic feedback can be used to solve a variety of tasks in virtual worlds; the identification of virtual shapes and objects is a particularly important one. In a previous study, Stamm et al. conducted haptic identification experiments with numerous virtual models in order to uncover the basic principles of shape and object identification in virtual worlds. During the exploration and recognition process, subjects experienced several difficulties that point directly to these basic principles; one of them is subjects’ insufficient spatial orientation in the virtual scene. A promising remedy is the use of auditory localization cues. However, the possible interaction effects of such a multimodal reproduction need to be investigated. This work examines whether the haptic recognition of geometrical characteristics is influenced by simultaneously reproduced localization cues, and specifically whether the auditory localization direction influences the haptic length estimation of virtual objects.
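The abstract does not describe the reproduction setup in detail; the cited documentation for the PHANTOM OMNI [16], CHAI 3D [17], and Pure Data [18] indicates the kind of haptic and audio chain involved. As a rough illustration only, the sketch below shows how a stereophonic localization cue could be steered to a given direction with constant-power (tangent-law) amplitude panning in the spirit of Pulkki [15]. The function name, its parameters, and the ±30° loudspeaker base angle are illustrative assumptions, not the authors' implementation.

```python
import math

def stereo_pan_gains(pan_deg, base_deg=30.0):
    """Tangent-law amplitude panning for a standard two-channel stereo setup.

    pan_deg  -- desired auditory localization direction in degrees
                (negative = left, 0 = centre, positive = right),
                clamped to +/- base_deg
    base_deg -- half the loudspeaker base angle (30 deg for a +/-30 deg setup)

    Returns (g_left, g_right), normalized so g_left**2 + g_right**2 == 1,
    i.e. constant perceived power regardless of panning direction.
    """
    pan = max(-base_deg, min(base_deg, pan_deg))
    ratio = math.tan(math.radians(pan)) / math.tan(math.radians(base_deg))
    # Tangent law: ratio = (g_r - g_l) / (g_r + g_l); solve with g_l + g_r = 1,
    # then rescale both gains to unit power.
    g_right = (1.0 + ratio) / 2.0
    g_left = 1.0 - g_right
    norm = math.hypot(g_left, g_right)
    return g_left / norm, g_right / norm

# Example: place the localization cue 15 degrees to the right of centre.
g_l, g_r = stereo_pan_gains(15.0)
```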

Keywords

haptic virtual objects · length estimation · force-feedback · auditory localization direction · stereophony

References

  1. Jansson, G., Bergamasco, M., Frisoli, A.: A new option for the visually impaired to experience 3D art at museums: Manual exploration of virtual copies. Visual Impairment Research 5(1), 1–12 (2003)
  2. Van Scoy, F.L., Kawai, T., Darrah, M., Rash, C.: Haptic display of mathematical functions for teaching mathematics to students with vision disabilities: Design and proof of concept. In: Brewster, S., Murray-Smith, R. (eds.) Haptic HCI 2000. LNCS, vol. 2058, pp. 31–40. Springer, Heidelberg (2001)
  3. Sepulveda-Cervantes, G., Parra-Vega, V., Dominguez-Ramirez, O.: Haptic cues for effective learning in 3D maze navigation. In: IEEE International Workshop on Haptic Audio Visual Environments and Games, HAVE 2008, pp. 93–98 (2008)
  4. Keehner, M., Lowe, R.K.: Seeing with the hands and with the eyes: The contributions of haptic cues to anatomical shape recognition in surgery. Association for the Advancement of Artificial Intelligence (2009)
  5. Holland, K.L., Williams II, R.L., Conatser Jr., R.R., Howell, J.N., Cade, D.L.: The implementation and evaluation of a virtual haptic back. Virtual Reality Society 7, 94–102 (2004)
  6. Faeth, A., Oren, M., Harding, C.: Combining 3-D geovisualization with force feedback driven user interaction. In: ACM SIGSPATIAL International Conference on Advances in Geographic Information Systems, Irvine, CA, USA (2008)
  7. Qi, W.: Geometry based haptic interaction with scientific data. In: ACM International Conference on Virtual Reality Continuum and Its Applications, Hong Kong (2006)
  8. Stamm, M., Altinsoy, M.E., Merchel, S.: Identification accuracy and efficiency of haptic virtual objects using force-feedback. In: 3rd International Workshop on Perceptual Quality of Systems, Bautzen, Germany (2010)
  9. Lederman, S.J., Klatzky, R.L.: Haptic identification of common objects: Effects of constraining the manual exploration process. Perception & Psychophysics 66, 618–628 (2004)
  10. Colwell, C., Petrie, H., Kornbrot, D.: Use of a haptic device by blind and sighted people: Perception of virtual textures and objects. In: Placencia, I., Porrero, E. (eds.) Improving the Quality of Life for the European Citizen: Technology for Inclusive Design and Equality. IOS Press, Amsterdam (1998)
  11. Magnusson, C., Rassmus-Gröhn, K.: A virtual traffic environment for people with visual impairment. Visual Impairment Research 7(1), 1–12 (2005)
  12. Magnusson, C., Rassmus-Gröhn, K.: Audio haptic tools for navigation in non-visual environments. In: 2nd International Conference on Enactive Interfaces, ENACTIVE 2005, Genoa, Italy, pp. 17–18 (2005)
  13. Murphy, E., Moussette, C., Verron, C., Guastavino, C.: Design and evaluation of an audio-haptic interface. In: eNTERFACE, Orsay-Paris, France (2008)
  14. Wood, J., Magennis, M., Francisca, E., Arias, C., Gutierrez, T., Bergamasco, M.: The design and evaluation of a computer game for the blind in the GRAB haptic audio virtual environment. In: EuroHaptics (2003)
  15. Pulkki, V.: Spatial Sound Generation and Perception by Amplitude Panning Techniques. Ph.D. thesis, Helsinki University of Technology (2001)
  16. SensAble Technologies: PHANTOM OMNI Technical Specifications (2010), http://www.sensable.com
  17. Conti, F., Barbagli, F., Morris, D., Sewell, C.: Chai 3D - Documentation (2009), http://www.chai3d.org
  18. Open Source Project: Pure Data - Documentation (2011), http://puredata.info/
  19. Gepshtein, S., Banks, M.S.: Viewing geometry determines how vision and haptics combine in size perception. Current Biology 13, 483–488 (2003)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Maik Stamm¹
  • M. Ercan Altinsoy¹
  • Sebastian Merchel¹
  1. Chair of Communication Acoustics, Dresden University of Technology, Germany
