Intelligent Interfaces to Empower People with Disabilities

  • Chapter
Handbook of Ambient Intelligence and Smart Environments

Abstract

Severe motion impairments can result from non-progressive disorders, such as cerebral palsy, or from degenerative neurological diseases, such as amyotrophic lateral sclerosis (ALS), multiple sclerosis (MS), or muscular dystrophy (MD). They can also be caused by traumatic brain injury, for example after a traffic accident, or by brainstem stroke [9, 84]. Worldwide, these disorders affect millions of individuals of all races and ethnic backgrounds [4, 75, 52]. Because the onset of MS and ALS typically occurs in adulthood, affected people are usually computer literate. Intelligent interfaces can immensely improve their daily lives by allowing them to communicate and participate in the information society, for example, by browsing the web, posting messages, or emailing friends. However, people with advanced ALS, MS, or MD may reach a point at which they can no longer control a keyboard and mouse, and they may also be unable to rely on automated speech recognition because their speech has become slurred.
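The camera-based mouse-replacement interfaces surveyed in this chapter track a selected body feature, such as the tip of the nose, from video frame to frame and map its position to pointer coordinates [13, 22]. As an illustration only, not the authors' implementation, the sketch below locates a feature by normalized cross-correlation template matching over a small search window; the function name and parameters are hypothetical.

```python
import numpy as np

def track_feature(prev_frame, cur_frame, center, tpl=8, search=16):
    """Find the new (row, col) position of the feature centered at
    `center` in prev_frame by searching cur_frame for the patch with
    the highest normalized cross-correlation score."""
    y, x = center
    template = prev_frame[y - tpl:y + tpl, x - tpl:x + tpl].astype(float)
    template = template - template.mean()
    t_norm = np.linalg.norm(template)
    best_score, best_pos = -np.inf, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy - tpl < 0 or xx - tpl < 0:
                continue  # candidate window would leave the image
            cand = cur_frame[yy - tpl:yy + tpl, xx - tpl:xx + tpl].astype(float)
            if cand.shape != template.shape:
                continue  # clipped at the right/bottom border
            cand = cand - cand.mean()
            denom = t_norm * np.linalg.norm(cand)
            if denom == 0:
                continue  # flat region, correlation undefined
            score = float((template * cand).sum() / denom)
            if score > best_score:
                best_score, best_pos = score, (yy, xx)
    return best_pos
```

In a full system, the best-matching position would be smoothed and mapped to screen coordinates, and mouse clicks would be issued by dwell time, as in the Camera Mouse [13].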

References

  1. AbleNet Switches (2008) Roseville, MN, USA. http://www.ablenetinc.com

  2. Akram W, Tiberii L, Betke M (2006) A customizable camera-based human computer interaction system allowing people with disabilities autonomous hands free navigation of multiple computing tasks. In: Stephanidis C, Pieper M(eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 28–42

  3. Akram W, Tiberii L, Betke M (2008) Designing and evaluating video-based interfaces for users with motion impairments. Universal Access in the Information Society, in review

  4. ALS Association (2008) http://www.alsa.org

  5. Animate! (2008) Software for camera mouse users to create video animations of an anthropomorphic figure. http://csr.bu.edu/visiongraphics/CameraMouse/animate.html

  6. Athitsos V, Wang J, Sclaroff S, Betke M (2006) Detecting instances of shape classes that exhibit variable structure. In: Computer Vision – ECCV 2006, 9th European Conference on Computer Vision, Graz, Austria, May 7-13, 2006, Proceedings, Part 1, LNCS, Vol. 3951, Springer Verlag, pp 121–134

  7. Barea R, Boquete L, Mazo M, López E (2002) System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(4):209–218

  8. Bates R, Istance H (2002) Zooming interfaces! Enhancing the performance of eye controlled pointing devices. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 119–126

  9. Bauby JD (1997) The Diving Bell and the Butterfly. Vintage Books

  10. Betke M (2008) Camera-based interfaces and assistive software for people with severe motion impairments. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag

  11. Betke M, Kawai J (1999) Gaze detection via self-organizing gray-scale units. In: Proceedings of the International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, IEEE, Kerkyra, Greece, pp 70–76

  12. Betke M, Mullally WJ, Magee J (2000) Active detection of eye scleras in real time. In: Proceedings of the IEEE Workshop on Human Modeling, Analysis and Synthesis, Hilton Head Island, SC

  13. Betke M, Gips J, Fleming P (2002) The Camera Mouse: Visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(1):1–10

  14. Betke M, Gusyatin O, Urinson M (2006) SymbolDesign: A user-centered method to design pen-based interfaces and extend the functionality of pointer input devices. Universal Access in the Information Society 4(3):223–236

  15. Beymer D, Flickner M (2003) Eye gaze tracking using an active stereo head. In: Proceedings of the 2003 Conference on Computer Vision and Pattern Recognition (CVPR’03) – Volume II, Madison, Wisconsin, pp 451–458

  16. Biswas P, Samanta D (2007) Designing computer interface for physically challenged persons. In: Proceedings of the International Information Technology Conference (ICIT 2007), Rourkella, India, pp 161–166

  17. Boston College Campus School (2008) http://www.bc.edu/schools/lsoe/campsch

  18. Boston Home (2008) The Boston Home is a specialized care residence for adults with advanced multiple sclerosis and other progressive neurological diseases. http://thebostonhome.org

  19. Bouchard B, Roy P, Bouzouane A, Giroux S, Mihailidis A (2008) Towards an extension of the COACH task guidance system: Activity recognition of Alzheimer’s patients. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag

  20. Bowyer K, Phillips PJ (1998) Empirical Evaluation Techniques in Computer Vision. IEEE Computer Society Press, Los Alamitos, CA, USA

  21. Brown C (1992) Assistive technology computers and persons with disabilities. Communications of the ACM 35(5):36–45

  22. Camera Mouse (2008) A video-based mouse-replacement interface for people with severe motion impairments. http://www.cameramouse.org

  23. Center of Communication Disorders (2008) Children’s Hospital, Boston, USA. http://www.childrenshospital.org/clinicalservices/Site2016/mainpage/S2016P0.html

  24. Chau M, Betke M (2005) Real time eye tracking and blink detection with USB cameras. Tech. Rep. 2005-012, Computer Science Department, Boston University, http://www.cs.bu.edu/techreports/pdf/2005-012-blink-detection.pdf

  25. Cloud RL, Betke M, Gips J (2002) Experiments with a camera-based human-computer interface system. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 103–110

  26. Connor C, Yu E, Magee J, Cansizoglu E, Epstein S, Betke M (2009) Movement and recovery analysis of a mouse-replacement interface for users with severe disabilities. In: Proceedings of the 13th International Conference on Human-Computer Interaction (HCI International 2009), San Diego, CA, in press.

  27. Crampton S, Betke M (2003) Counting fingers in real time: A webcam-based human-computer interface with game applications. In: Proceedings of the Conference on Universal Access in Human-Computer Interaction (UA-HCI), Crete, Greece, pp 1357–1361

  28. Darrell T, Essa IA, Pentland A (1996) Task-specific gesture analysis in real time using interpolated views. IEEE Transactions on Pattern Analysis and Machine Intelligence 18(12):1236–1242

  29. DiMattia P, Curran FX, Gips J (2001) An Eye Control Teaching Device for Students without Language Expressive Capacity – EagleEyes. The Edwin Mellen Press, see also http://www.bc.edu/eagleeyes

  30. Don Johnston Switches (2008) Volo, IL, USA. http://www.donjohnston.com

  31. Evans DG, Drew R, Blenkhorn P (2000) Controlling mouse pointer position using an infrared head-operated joystick. IEEE Transactions on Rehabilitation Engineering 8(1):107–117

  32. Eye Tracking System, Applied Science Laboratories (2008) Bedford, MA, USA. http://www.a-s-l.com

  33. Fagiani C, Betke M, Gips J (2002) Evaluation of tracking methods for human-computer interaction. In: IEEE Workshop on Applications in Computer Vision, Orlando, Florida, pp 121–126

  34. Forrester Research (2003) The Aging of the US Population and Its Impact on Computer Use. A Research Report commissioned by Microsoft Corporation. http://www.microsoft.com/enable/research/computerusers.aspx

  35. Frankish C, Hull R, Morgan P (1995) Recognition accuracy and user acceptance of pen interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’95), pp 503–510

  36. Freeman WT, Beardsley PA, Kage H, Tanaka K, Kyuman C, Weissman C (2000) Computer vision for computer interaction. SIGGRAPH Computer Graphics 33(4):65–68

  37. Gandy M, Starner T, Auxier J, Ashbrook D (2000) The Gesture Pendant: A self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In: Fourth International Symposium on Wearable Computers (ISWC’00), p 87

  38. Gips J, Gips J (2000) A computer program based on Rick Hoyt’s spelling method for people with profound special needs. In: Proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP), Karlsruhe, Germany, pp 245–250

  39. Gips J, Betke M, DiMattia PA (2001) Early experiences using visual tracking for computer access by people with profound physical disabilities. In: Stephanidis C (ed) Universal Acccess In HCI: Towards an Information Society for All, Volume 3, Proceedings of the 1st International Conference on Universal Access in Human-Computer Interaction (UA-HCI), Lawrence Erlbaum Associates, Mahwah, NJ, pp 914–918

  40. Goldberg D, Richardson C (1993) Touch-typing with a stylus. In: Proceedings of the INTERCHI ’93 Conference on Human Factors in Computing Systems, IOS Press, Amsterdam, The Netherlands, pp 80–87

  41. Gorman M, Lahav A, Saltzman E, Betke M (2007) A camera-based music making tool for physical rehabilitation. Computer Music Journal 31(2):39–53

  42. Gorodnichy DO, Roth G (2004) Nouse ‘use your nose as a mouse’ perceptual vision technology for hands-free games and interfaces. Image and Vision Computing 22(12):931–942

  43. Grauman K, Betke M, Gips J, Bradski GR (2001) Communication via eye blinks – detection and duration analysis in real time. In: Proceedings of the IEEE Computer Vision and Pattern Recognition Conference (CVPR), Kauai, Hawaii, vol 2, pp 1010–1017

  44. Grauman K, Betke M, Lombardi J, Gips J, Bradski GR (2003) Communication via eye blinks and eyebrow raises: Video-based human-computer interfaces. International Journal Universal Access in the Information Society 2(4):359–373

  45. Guan H, Chang J, Chen L, Feris R, Turk M (2006) Multi-view appearance-based 3D hand pose estimation. In: IEEE Workshop on Vision for Human Computer Interaction, New York, NY, pp 1–6

  46. Hansen DW, Pece AEC (2005) Eye tracking in the wild. Computer Vision and Image Understanding 98(1):155–181

  47. Hansen JP, Tørning K, Johansen AS, Itoh K, Aoki H (2004) Gaze typing compared with input by head and hand. In: ETRA ’04: Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, ACM, New York, NY, USA, pp 131–138

  48. Harbusch K, Kühn M (2003) Towards an adaptive communication aid with text input from ambiguous keyboards. In: 11th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2003)

  49. HeadWay (2008) Infrared head-mounted mouse alternative, Penny & Giles, Don Johnston, Inc. http://www.synapseadaptive.com/donjohnston/pengild.htm

  50. Hutchinson TE, White KP Jr, Martin WN, Reichert KC, Frey LA (1989) Human-computer interaction using eye-gaze input. IEEE Transactions on Systems, Man and Cybernetics 19(6):1527–1533

  51. Hwang F, Keates S, Langdon P, Clarkson J (2004) Mouse movements of motion-impaired users: a submovement analysis. In: Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility (Assets ’04), pp 102–109

  52. International Myotonic Dystrophy Organization (2008) http://www.myotonicdystrophy.org

  53. Ivins JP, Porril J (1998) A deformable model of the human iris for measuring small three-dimensional eye movements. Machine Vision and Applications 11(1):42–51

  54. Jaimes A, Sebe N (2005) Multimodal human computer interaction: A survey. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. Lecture Notes in Computer Science, Volume 3766, Springer-Verlag, Berlin Heidelberg, pp 1–15

  55. Ji Q, Zhu Z (2004) Eye and gaze tracking for interactive graphic display. Machine Vision and Applications 15(3):139–148

  56. Kaplan (2008) World Institute on Disability. http://www.accessiblesociety.org

  57. Kapoor A, Picard RW (2002) Real-time, fully automatic upper facial feature tracking. In: Proceedings of the Fifth IEEE International Conference on Automatic Face Gesture Recognition, Washington, D.C., pp 10–15

  58. Kawato S, Tetsutani N (2004) Detection and tracking of eyes for gaze-camera control. Image and Vision Computing 22(12):1031–1038

  59. Kim KN, Ramakrishna RS (1999) Vision-based eye-gaze tracking for human computer interface. In: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Vol. 2, Tokyo, Japan, pp 324–329

  60. Kim WB, Kwan C, Fedyuk I, Betke M (2008) Camera canvas: Image editor for people with severe disabilities. Tech. Rep. 2008-010, Computer Science Department, Boston University, http://www.cs.bu.edu/techreports/pdf/2008-010-camera-canvas.pdf

  61. Kollios G, Sclaroff S, Betke M (2001) Motion mining: Discovering spatio-temporal patterns in databases of human motion. In: Proceedings of the 2001 ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery (DMKD 2001), Santa Barbara, CA, pp 25–32

  62. Kristensson PO, Zhai S (2004) SHARK2: a large vocabulary shorthand writing system for pen-based computers. In: Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST ’04), ACM, New York, NY, USA, pp 43–52

  63. LaCascia M, Sclaroff S, Athitsos V (2000) Fast, reliable head tracking under varying illumination: An approach based on robust registration of texture-mapped 3D models. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(4):322–336

  64. Lankenau A, Röfer T (2000) Smart wheelchairs - state of the art in an emerging market. Künstliche Intelligenz, Schwerpunkt Autonome Mobile Systeme 4:37–39

  65. LC Technologies Eyegaze System (2008) http://www.lctinc.com

  66. Lombardi J, Betke M (2002) A camera-based eyebrow tracker for hands-free computer control via a binary switch. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 199–200

  67. Long AC, Landay JA, Rowe LA, Michiels J (2000) Visual similarity of pen gestures. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’00), ACM, New York, NY, USA, pp 360–367

  68. MacKenzie IS, Zhang SX (1997) The immediate usability of graffiti. In: Proceedings of the Conference on Graphics Interface ’97, Canadian Information Processing Society, Toronto, Ont., Canada, Canada, pp 129–137

  69. MacKenzie IS, Zhang X (2008) Eye typing using word and letter prediction and a fixation algorithm. In: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA ’08), ACM, New York, NY, USA, pp 55–58

  70. Magee JJ, Betke M, Gips J, Scott MR, Waber BN (2008) A human-computer interface using symmetry between eyes to detect gaze direction. Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans 38(6):1–15

  71. Meyer A (1995) Pen computing: a technology overview and a vision. ACM SIGCHI Bulletin 27(3):46–90

  72. Morimoto CH, Koons D, Amir A, Flickner M (2000) Pupil detection and tracking using multiple light sources. Image and Vision Computing 18(4):331–335

  73. Myers BA, Wobbrock JO, Yang S, Yeung B, Nichols J, Miller R (2002) Using handhelds to help people with motor impairments. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 89–96

  74. Myers GA, Sherman KR, Stark L (1991) Eye monitor: microcomputer-based instrument uses an internal mode to track the eye. Computer 24(3):14–21

  75. National Multiple Sclerosis Society (2008) http://www.nationalmssociety.org

  76. Nonaka H (2003) Communication interface with eye-gaze and head gesture using successive DP matching and fuzzy inference. Journal of Intelligent Information Systems 21(2):105–112

  77. NSF HCC (2008) Human-Centered Computing, Division of Information & Intelligent Systems, National Science Foundation. http://www.nsf.gov

  78. Pacchetti C, Mancini F, Aglieri R, Fundaro C, Martignoni E, Nappi G (2000) Active music therapy in Parkinson’s disease: an integrative method for motor and emotional rehabilitation. Psychosomatic Medicine 62(3):386–393

  79. Paquette M (2005) IWeb Explorer, project report, Computer Science Department, Boston University

  80. Park KR (2007) A real-time gaze position estimation method based on a 3-d eye model. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 37(1):199–212

  81. Paul S, Ramsey D (2000) Music therapy in physical medicine and rehabilitation. Australian Occupational Therapy Journal 47:111–118

  82. Ponweiser W, Vincze M (2007) Task and context aware performance evaluation of computer vision algorithms. In: International Conference on Computer Vision Systems: Vision Systems in the Real World: Adaptation, Learning, Evaluation, Bielefeld, Germany (ICVS 2007)

  83. Poulson D, Nicolle C (2004) Making the internet accessible for people with cognitive and communication impairments. Universal Access in the Information Society 3(1):48–56

  84. Schnabel J (2007) Director of the film “The Diving Bell and the Butterfly,” France: Pathé Renn Productions

  85. Schwerdt K, Crowley JL (2000) Robust face tracking using color. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France

  86. Sclaroff S, Betke M, Kollios G, Alon J, Athitsos V, Li R, Magee J, Tian T (2005) Tracking, analysis, recognition of human gestures in video. In: Proceedings of the 8th International Conference on Document Analysis and Recognition, Seoul, Korea, pp 806–810

  87. Shih SW, Liu J (2004) A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 34(1):234–245

  88. Shneiderman B (1971) Computer science education and social relevance. ACM SIGCSE Bulletin 3(1):21–24

  89. Shugrina M, Betke M, Collomosse J (2006) Empathic painting: Interactive stylization through observed emotional state. In: Proceedings of the 4th International Symposium on Non-Photorealistic Animation and Rendering (NPAR 2006), Annecy, France, 8 pp.

  90. SIGACCESS (2008) ACM special interest group on accessible computing. http://www.sigaccess.org

  91. Sirohey S, Rosenfeld A, Duric Z (2002) A method of detecting and tracking irises and eyelids in video. Pattern Recognition 35(5):1389–1401

  92. Sirovich L, Kirby M (1987) Low-dimensional procedure for the characterization of human faces. Journal of the Optical Society of America A 4(3):519-523

  93. Smart Nav Head Tracker (2008) Natural Point, Eye Control Technologies, Inc., Corvallis, OR, USA. http://www.naturalpoint.com/smartnav

  94. Soukoreff RW, MacKenzie IS (2004) Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal on Human-Computer Studies 61(6):751–789

  95. Stary C (2006) Special UAIS issue on user-centered interaction paradigms for universal access in the information society. Universal Access in the Information Society 4(3):175–176

  96. Steriadis CE, Constantinou P (2003) Designing human-computer interfaces for quadriplegic people. ACM Transactions on Computer-Human Interaction 10(2):87–118

  97. StrokeIt (2008) A mouse gesture recognition engine and command processor, software by Jeff Doozan. http://tcbmi.com/strokeit

  98. Takami O, Morimoto K, Ochiai T, Ishimatsu T (1995) Computer interface to use head and eyeball movement for handicapped people. In: IEEE International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, vol 2, pp 1119–1123

  99. Team Hoyt website (2008) Racing towards inclusion. http://www.teamhoyt.com

  100. Tian Y, Kanade T, Cohn J (2000) Dual-state parametric eye tracking. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France, pp 110–115

  101. Tracker Pro (2008) Infrared-based head tracker, Madentec, Ltd., Edmonton, Alberta, Canada. http://www.madentec.com

  102. Tsugawa S, Aoki M, Hosaka A, Seki K (1994) Recent Japanese projects of AVCS-related systems. In: Proceedings of the Symposium on Intelligent Vehicles, pp 125–130

  103. Turk M (2005) RTV4HCI: A historical overview. In: Kisacanin B, Pavlovic V, Huang T (eds) Real-Time Vision for Human-Computer Interaction, Springer-Verlag

  104. ERCIM working group “User Interfaces for All”. http://www.ui4all.gr/index.html

  105. Vaidyanathan R, Chung B, Gupta L, Kook H, Kota S, West JD (2007) Tongue-movement communication and control concept for hands-free human-machine interfaces. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans 37(4):533–546

  106. Viewpoint (2008) Eye tracking system. Arrington Research. http://www.arringtonresearch.com

  107. Waber B, Magee JJ, Betke M (2006) Web mediators for accessible browsing. In: Stephanidis C, Pieper M (eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 447–466

  108. Waber BN, Magee JJ, Betke M (2005) Fast head tilt detection for human-computer interaction. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. LNCS, Vol. 3766, Springer-Verlag, pp 90–99

  109. Wang J, Athitsos V, Sclaroff S, Betke M (2008) Detecting objects of variable shape structure with hidden state shape models. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(3):477–492

  110. Wang JG, Sung E (2002) Study on eye gaze estimation. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 32(3):332–350

  111. Ward DJ, Blackwell AF, MacKay DJC (2000) Dasher - a data entry interface using continuous gestures and language models. In: Proceedings UIST 2000: The 13th Annual ACM Symposium on User Interface Software and Technology, http://www.inference.phy.cam.ac.uk/dasher

  112. Wobbrock JO, Gajos KZ (2008) Goal crossing with mice and trackballs for people with motor impairments: Performance, submovements, and design directions. ACM Transactions on Accessible Computing 1(1):1–37

  113. Wobbrock JO, Cutrell E, Harada S, MacKenzie IS (2008) An error model for pointing based on Fitts’ law. In: Proceeding of the Twenty-sixth Annual SIGCHI Conference on Human Factors in Computing Systems (CHI ’08), pp 1613–1622

  114. Wu C, Aghajan H (2008) Context-aware gesture analysis for speaker HCI. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag

  115. Wu TF, Chen MC (2007) Performance of different pointing devices on children with cerebral palsy. In: Stephanidis C (ed) Universal Access in Human-Computer Interaction: Applications and Services, Vol. 4556, Springer-Verlag, Berlin Heidelberg, pp 462–469

  116. Xie X, Sudhakar R, Zhuang H (1995) Real-time eye feature tracking from a video image sequence using Kalman filter. IEEE Transactions on Systems, Man, and Cybernetics 25(12):1568–1577

  117. Yanco HA, Gips J (1998) Driver performance using single switch scanning with a powered wheelchair: Robotic assisted control versus traditional control. In: Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Annual Conference (RESNA ’98), RESNA Press, pp 298–300

  118. Yang M, Kriegman D, Ahuja N (2002) Detecting faces in images: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence 24(1):34–58

  119. Yoo DH, Chung MJ (2004) Non-intrusive eye gaze estimation without knowledge of eye pose. In: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, pp 785–790

  120. Young L, Sheena D (1975) Survey of eye movement recording methods. Behavior Research Methods and Instrumentation 7(5):397–429

  121. ZAC Browser (2008) Web browser designed for autistic children. http://www.zacbrowser.com

Author information

Correspondence to Margrit Betke.

Copyright information

© 2010 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Betke, M. (2010). Intelligent Interfaces to Empower People with Disabilities. In: Nakashima, H., Aghajan, H., Augusto, J.C. (eds) Handbook of Ambient Intelligence and Smart Environments. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-93808-0_15

  • DOI: https://doi.org/10.1007/978-0-387-93808-0_15

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-0-387-93807-3

  • Online ISBN: 978-0-387-93808-0
