
Facial Tracking-Assisted Hand Pointing Technique for Wall-Sized Displays

  • Haokan Cheng
  • Shin Takahashi
  • Jiro Tanaka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9749)

Abstract

In this paper, we propose a novel pointing technique that leverages the user’s body motion to achieve smooth, efficient interaction on wall-sized displays. Our proposal consists of two parts: a graphical cursor controlled by the user’s hand motions, and mechanisms that assist cursor manipulation by tracking the orientation of the user’s face. By associating the user’s face and hand motions with different aspects of the cursor’s movement, we aim to make pointing in large-display environments swift while preserving the necessary precision. We built a prototype to instantiate the concept and conducted two comparative experiments to evaluate the effectiveness of the proposal.
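The abstract does not specify the control logic, but the division of labor it describes (face orientation for coarse cursor placement, hand motion for fine adjustment) is reminiscent of MAGIC-style cascaded pointing. The following Python sketch illustrates one plausible reading of that design; the projection model, thresholds, and all identifiers are assumptions for illustration, not the authors’ implementation.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch: coarse cursor placement follows the tracked face
# orientation, fine positioning follows relative hand motion. The thresholds,
# the ray-projection model, and all names are illustrative assumptions.

WARP_THRESHOLD_PX = 300.0  # assumed: warp only when the face-gaze point is far from the cursor
HAND_GAIN = 2.5            # assumed: gain applied to relative hand displacement

@dataclass
class Display:
    width: float
    height: float

@dataclass
class FaceSample:
    yaw: float       # radians, left/right head rotation
    pitch: float     # radians, up/down head rotation
    head_pos: tuple  # (x, y, z) in display coordinates; z = distance to the wall

def clamp(x, y, display):
    """Keep a point inside the display bounds."""
    return (min(max(x, 0.0), display.width), min(max(y, 0.0), display.height))

def face_point(sample, display):
    """Intersect the face-orientation ray with the display plane (assumed model)."""
    hx, hy, hz = sample.head_pos
    return clamp(hx + hz * math.tan(sample.yaw),
                 hy + hz * math.tan(sample.pitch), display)

def update_cursor(cursor, sample, hand_delta, display):
    """One frame of the combined face/hand cursor update."""
    gaze = face_point(sample, display)
    if math.hypot(gaze[0] - cursor[0], gaze[1] - cursor[1]) > WARP_THRESHOLD_PX:
        cursor = gaze  # coarse phase: face orientation warps the cursor near the target
    # Fine phase: hand motion refines the position with a relative mapping.
    return clamp(cursor[0] + HAND_GAIN * hand_delta[0],
                 cursor[1] + HAND_GAIN * hand_delta[1], display)

# Example: a 5000x2000 px wall, user standing centered, some distance from it.
display = Display(5000, 2000)
cursor = (2500.0, 1000.0)
sample = FaceSample(yaw=0.3, pitch=-0.1, head_pos=(2500.0, 1000.0, 2000.0))
cursor = update_cursor(cursor, sample, hand_delta=(4.0, -2.0), display=display)
print(cursor)
```

Under this reading, the warp threshold prevents small, involuntary head movements from disturbing the cursor while the user is fine-tuning with the hand; only a deliberate shift of face orientation triggers a coarse jump.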

Keywords

Facial tracking · Input method · Pointing technique · Wall-sized display

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. University of Tsukuba, Tsukuba, Japan
