FlowMouse: A Computer Vision-Based Pointing and Gesture Input Device

  • Andrew D. Wilson
  • Edward Cutrell
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3585)

Abstract

We introduce FlowMouse, a computer vision-based pointing device and gesture input system. FlowMouse uses optical flow techniques to model the motion of the hand and a capacitive touch sensor to enable and disable interaction. By using optical flow rather than a more traditional tracking-based method, FlowMouse is exceptionally robust, simple in design, and offers opportunities for fluid gesture-based interaction that go well beyond merely emulating pointing devices such as the mouse. We present a Fitts' law study examining pointing performance, and discuss applications of the optical flow field for gesture input.
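The abstract's core idea, summarizing an aggregate flow field as relative pointer motion rather than tracking a specific feature, gated by a touch-sensor clutch, can be sketched in a few lines. This pure-NumPy integer translation search is our own simplification for illustration, not the paper's actual flow algorithm; the function names (`mean_motion`, `pointer_delta`) and the synthetic frames are hypothetical.

```python
import numpy as np

def mean_motion(prev, curr, max_shift=5):
    """Brute-force estimate of the dominant integer translation between
    two grayscale frames: a crude stand-in for summarizing a dense
    optical flow field by its mean vector."""
    prev = prev.astype(float)
    curr = curr.astype(float)
    m = max_shift
    best, best_err = (0, 0), np.inf
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            # Undo a candidate (dx, dy) shift and score the alignment on
            # the interior, ignoring the wrapped border introduced by roll.
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            err = np.abs(prev[m:-m, m:-m] - shifted[m:-m, m:-m]).sum()
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best

def pointer_delta(prev, curr, clutch_engaged):
    """Relative pointer motion; the clutch models the capacitive touch
    sensor that enables and disables interaction."""
    if not clutch_engaged:
        return (0, 0)
    return mean_motion(prev, curr)

# Synthetic demo: a bright square moves 3 px to the right between frames.
a = np.zeros((64, 64), np.uint8)
a[20:40, 20:40] = 255
b = np.roll(a, 3, axis=1)
print(pointer_delta(a, b, clutch_engaged=True))   # (3, 0)
print(pointer_delta(a, b, clutch_engaged=False))  # (0, 0)
```

A real system would use a proper dense flow estimator per frame; the point here is only the structure: motion is read off the flow field as a whole, and a separate hardware signal, not the vision pipeline, decides when that motion drives the cursor.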

Copyright information

© IFIP International Federation for Information Processing 2005

Authors and Affiliations

  • Andrew D. Wilson (1)
  • Edward Cutrell (1)
  1. Microsoft Research, Redmond, USA