Designing and Evaluating a Wearable Device for Accessing Gaze Signals from the Sighted

  • Shi Qiu
  • Matthias Rauterberg
  • Jun Hu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9737)

Abstract

Gaze signals, frequently used by the sighted as visual cues in social interactions, are hardly accessible to low-vision and blind people. In this paper, we propose a prototype, the Tactile Band, to test the hypothesis that tactile feedback can enable a blind person to feel attention (gaze signals) from a sighted partner, enhancing the level of engagement in face-to-face communication. We tested this hypothesis with 30 participants in a face-to-face conversation scenario, in which a blindfolded participant and a sighted participant talked about a given daily topic. Comments from the participants and reflection on the experiment provided useful insights for improvements and further research.
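The abstract describes tactile feedback driven by the sighted partner's gaze. A minimal sketch of that mapping follows, assuming a hypothetical eye tracker that reports 2-D gaze points and a face bounding box; all function names, coordinates, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: turn an eye tracker's gaze point into a vibration
# level for a tactile headband. Coordinates, names, and the 1-second
# saturation threshold are illustrative assumptions.

def gaze_on_face(gaze_point, face_box):
    """True if the gaze point falls inside the listener's face region.

    face_box is (left, top, right, bottom) in tracker pixel coordinates.
    """
    x, y = gaze_point
    left, top, right, bottom = face_box
    return left <= x <= right and top <= y <= bottom

def vibration_level(dwell_ms, max_level=255):
    """Ramp vibration intensity with gaze dwell time, saturating at 1 s."""
    return min(max_level, int(max_level * dwell_ms / 1000))

# Example: the sighted partner's gaze has rested on the face for 400 ms.
face = (100, 80, 220, 200)
if gaze_on_face((160, 120), face):
    level = vibration_level(400)  # partial intensity, below saturation
```

In a real device the `level` value would be sent to a vibration motor driver; ramping with dwell time (rather than a binary on/off) is one plausible way to distinguish a brief glance from sustained attention.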

Keywords

Wearable device · Gaze signals · Eye-tracking · Accessibility · Engagement · Tactile feedback

Acknowledgments

We thank our participants from the Hong Kong Blind Union and the Yangzhou Special Education School in China. This research is supported by the China Scholarship Council and facilitated by Eindhoven University of Technology.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Eindhoven University of Technology, Eindhoven, The Netherlands
