
A Customizable Camera-Based Human Computer Interaction System Allowing People with Disabilities Autonomous Hands-Free Navigation of Multiple Computing Tasks

  • Wajeeha Akram
  • Laura Tiberii
  • Margrit Betke
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4397)

Abstract

Many people suffer from conditions that cause deterioration of motor control, making access to the computer with traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware, for example wearable devices, to track and translate a user’s movement into pointer movement. These approaches may be perceived as intrusive. Camera-based assistive systems that visually track features on the user’s body often require cumbersome manual adjustment. This paper introduces an enhanced computer-vision-based strategy in which features, for example on a user’s face, viewed through an inexpensive USB camera, are tracked and translated into pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement that allows users to reach all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human-computer interaction (HCI). This framework makes the vision-based interface effective for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its use within the application framework as a practical tool for desktop users with disabilities.
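
The following is a minimal illustrative sketch of such a gain-amplified feature-to-pointer mapping, not the authors’ implementation. It assumes OpenCV’s pyramidal Lucas–Kanade tracker as the feature tracker; the GAIN constant, the screen dimensions, and the move_pointer stub are placeholders introduced here for illustration. The tracked feature’s frame-to-frame displacement is amplified and accumulated into a pointer position, so that small physical movements can cover the whole screen.

    # Sketch: map tracked facial-feature motion to pointer motion with a gain.
    import cv2
    import numpy as np

    GAIN = 8.0                        # assumed per-user amplification factor
    SCREEN_W, SCREEN_H = 1920, 1080   # assumed screen resolution

    def move_pointer(x, y):
        # Placeholder for a platform pointer call (e.g., via pyautogui).
        print(f"pointer -> ({x:.0f}, {y:.0f})")

    cap = cv2.VideoCapture(0)         # inexpensive USB camera
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("USB camera not available")
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Seed one trackable corner feature; in practice it would be
    # selected on the user's face (e.g., the tip of the nose).
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=1,
                                  qualityLevel=0.01, minDistance=10)

    px, py = SCREEN_W / 2.0, SCREEN_H / 2.0   # pointer starts at screen center
    while ok and pts is not None:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        if status[0][0] == 1:         # feature found in the new frame
            dx, dy = new_pts[0][0] - pts[0][0]
            # Relative mapping: amplify the small feature displacement
            # and clamp the accumulated position to the screen bounds.
            px = float(np.clip(px + GAIN * dx, 0, SCREEN_W - 1))
            py = float(np.clip(py + GAIN * dy, 0, SCREEN_H - 1))
            move_pointer(px, py)
        prev_gray, pts = gray, new_pts
    cap.release()

Because the mapping is relative and gain-amplified rather than an absolute image-to-screen mapping, the reachable screen area is decoupled from the user’s range of motion; in a system like the one described, the gain would be one of the per-user customization parameters.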

Keywords

Computer-vision assistive technology, alternative input devices, video-based human-computer interfaces, autonomous navigation

Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Wajeeha Akram (1)
  • Laura Tiberii (1)
  • Margrit Betke (1)

  1. Department of Computer Science, Boston University, 111 Cummington Street, Boston, MA 02215, USA