Intelligent Interfaces to Empower People with Disabilities

Abstract

Severe motion impairments can result from non-progressive disorders, such as cerebral palsy, or from degenerative neurological diseases, such as Amyotrophic Lateral Sclerosis (ALS), Multiple Sclerosis (MS), or muscular dystrophy (MD). They can also be caused by traumatic brain injuries, for example from a traffic accident, or by brainstem strokes [9, 84]. Worldwide, these disorders affect millions of individuals of all races and ethnic backgrounds [4, 75, 52]. Because disease onset of MS and ALS typically occurs in adulthood, afflicted people are usually computer literate. Intelligent interfaces can immensely improve their daily lives by allowing them to communicate and participate in the information society, for example, by browsing the web, posting messages, or emailing friends. However, people with advanced ALS, MS, or MD may reach a point at which they can no longer control the keyboard and mouse and also cannot rely on automated voice recognition because their speech has become slurred.

References

  1. AbleNet Switches (2008) Roseville, MN, USA. http://www.ablenetinc.com
  2. Akram W, Tiberii L, Betke M (2006) A customizable camera-based human computer interaction system allowing people with disabilities autonomous hands-free navigation of multiple computing tasks. In: Stephanidis C, Pieper M (eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 28–42
  3. Akram W, Tiberii L, Betke M (2008) Designing and evaluating video-based interfaces for users with motion impairments. Universal Access in the Information Society, in review
  4. ALS Association (2008) http://www.alsa.org
  5. Animate! (2008) Software for Camera Mouse users to create video animations of an anthropomorphic figure. http://csr.bu.edu/visiongraphics/CameraMouse/animate.html
  6. Athitsos V, Wang J, Sclaroff S, Betke M (2006) Detecting instances of shape classes that exhibit variable structure. In: Computer Vision – ECCV 2006, 9th European Conference on Computer Vision, Graz, Austria, May 7-13, 2006, Proceedings, Part 1, LNCS, Vol. 3951, Springer-Verlag, pp 121–134
  7. Barea R, Boquete L, Mazo M, López E (2002) System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(4):209–218
  8. Bates R, Istance H (2002) Zooming interfaces! Enhancing the performance of eye controlled pointing devices. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 119–126
  9. Bauby JD (1997) The Diving Bell and the Butterfly. Vintage Books
  10. Betke M (2008) Camera-based interfaces and assistive software for people with severe motion impairments. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece, 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
  11. Betke M, Kawai J (1999) Gaze detection via self-organizing gray-scale units. In: Proceedings of the International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, IEEE, Kerkyra, Greece, pp 70–76
  12. Betke M, Mullally WJ, Magee J (2000) Active detection of eye scleras in real time. In: Proceedings of the IEEE Workshop on Human Modeling, Analysis and Synthesis, Hilton Head Island, SC
  13. Betke M, Gips J, Fleming P (2002) The Camera Mouse: Visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(1):1–10
  14. Betke M, Gusyatin O, Urinson M (2006) SymbolDesign: A user-centered method to design pen-based interfaces and extend the functionality of pointer input devices. Universal Access in the Information Society 4(3):223–236
  15. Beymer D, Flickner M (2003) Eye gaze tracking using an active stereo head. In: Proceedings of the 2003 Conference on Computer Vision and Pattern Recognition (CVPR’03) – Volume II, Madison, Wisconsin, pp 451–458
  16. Biswas P, Samanta D (2007) Designing computer interface for physically challenged persons. In: Proceedings of the International Information Technology Conference (ICIT 2007), Rourkela, India, pp 161–166
  17. Boston College Campus School (2008) http://www.bc.edu/schools/lsoe/campsch
  18. Boston Home (2008) The Boston Home is a specialized care residence for adults with advanced multiple sclerosis and other progressive neurological diseases. http://thebostonhome.org
  19. Bouchard B, Roy P, Bouzouane A, Giroux S, Mihailidis A (2008) Towards an extension of the COACH task guidance system: Activity recognition of Alzheimer’s patients. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece, 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
  20. Bowyer K, Phillips PJ (1998) Empirical Evaluation Techniques in Computer Vision. IEEE Computer Society Press, Los Alamitos, CA, USA
  21. Brown C (1992) Assistive technology computers and persons with disabilities. Communications of the ACM 35(5):36–45
  22. Camera Mouse (2008) A video-based mouse-replacement interface for people with severe motion impairments. http://www.cameramouse.org
  23. Center of Communication Disorders (2008) Children’s Hospital, Boston, USA. http://www.childrenshospital.org/clinicalservices/Site2016/mainpage/S2016P0.html
  24. Chau M, Betke M (2005) Real time eye tracking and blink detection with USB cameras. Tech. Rep. 2005-012, Computer Science Department, Boston University. http://www.cs.bu.edu/techreports/pdf/2005-012-blink-detection.pdf
  25. Cloud RL, Betke M, Gips J (2002) Experiments with a camera-based human-computer interface system. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 103–110
  26. Connor C, Yu E, Magee J, Cansizoglu E, Epstein S, Betke M (2009) Movement and recovery analysis of a mouse-replacement interface for users with severe disabilities. In: Proceedings of the 13th International Conference on Human-Computer Interaction (HCI International 2009), San Diego, CA, in press
  27. Crampton S, Betke M (2003) Counting fingers in real time: A webcam-based human-computer interface with game applications. In: Proceedings of the Conference on Universal Access in Human-Computer Interaction (UA-HCI), Crete, Greece, pp 1357–1361
  28. Darrell T, Essa IA, Pentland A (1996) Task-specific gesture analysis in real time using interpolated views. IEEE Transactions on Pattern Analysis and Machine Intelligence 18(12):1236–1242
  29. DiMattia P, Curran FX, Gips J (2001) An Eye Control Teaching Device for Students without Language Expressive Capacity – EagleEyes. The Edwin Mellen Press. See also http://www.bc.edu/eagleeyes
  30. Don Johnston Switches (2008) Volo, IL, USA. http://www.donjohnston.com
  31. Evans DG, Drew R, Blenkhorn P (2000) Controlling mouse pointer position using an infrared head-operated joystick. IEEE Transactions on Rehabilitation Engineering 8(1):107–117
  32. Eye Tracking System, Applied Science Laboratories (2008) Bedford, MA, USA. http://www.a-s-l.com
  33. Fagiani C, Betke M, Gips J (2002) Evaluation of tracking methods for human-computer interaction. In: IEEE Workshop on Applications in Computer Vision, Orlando, Florida, pp 121–126
  34. Forrester Research (2003) The Aging of the US Population and Its Impact on Computer Use. A research report commissioned by Microsoft Corporation. http://www.microsoft.com/enable/research/computerusers.aspx
  35. Frankish C, Hull R, Morgan P (1995) Recognition accuracy and user acceptance of pen interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’95), pp 503–510
  36. Freeman WT, Beardsley PA, Kage H, Tanaka K, Kyuman C, Weissman C (2000) Computer vision for computer interaction. SIGGRAPH Computer Graphics 33(4):65–68
  37. Gandy M, Starner T, Auxier J, Ashbrook D (2000) The Gesture Pendant: A self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In: Fourth International Symposium on Wearable Computers (ISWC’00), p 87
  38. Gips J, Gips J (2000) A computer program based on Rick Hoyt’s spelling method for people with profound special needs. In: Proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP), Karlsruhe, Germany, pp 245–250
  39. Gips J, Betke M, DiMattia PA (2001) Early experiences using visual tracking for computer access by people with profound physical disabilities. In: Stephanidis C (ed) Universal Access In HCI: Towards an Information Society for All, Volume 3, Proceedings of the 1st International Conference on Universal Access in Human-Computer Interaction (UA-HCI), Lawrence Erlbaum Associates, Mahwah, NJ, pp 914–918
  40. Goldberg D, Richardson C (1993) Touch-typing with a stylus. In: Proceedings of the INTERCHI ’93 Conference on Human Factors in Computing Systems, IOS Press, Amsterdam, The Netherlands, pp 80–87
  41. Gorman M, Lahav A, Saltzman E, Betke M (2007) A camera-based music making tool for physical rehabilitation. Computer Music Journal 31(2):39–53
  42. Gorodnichy DO, Roth G (2004) Nouse ‘use your nose as a mouse’ perceptual vision technology for hands-free games and interfaces. Image and Vision Computing 22(12):931–942
  43. Grauman K, Betke M, Gips J, Bradski GR (2001) Communication via eye blinks – detection and duration analysis in real time. In: Proceedings of the IEEE Computer Vision and Pattern Recognition Conference (CVPR), Kauai, Hawaii, vol 2, pp 1010–1017
  44. Grauman K, Betke M, Lombardi J, Gips J, Bradski GR (2003) Communication via eye blinks and eyebrow raises: Video-based human-computer interfaces. Universal Access in the Information Society 2(4):359–373
  45. Guan H, Chang J, Chen L, Feris R, Turk M (2006) Multi-view appearance-based 3D hand pose estimation. In: IEEE Workshop on Vision for Human Computer Interaction, New York, NY, pp 1–6
  46. Hansen DW, Pece AEC (2005) Eye tracking in the wild. Computer Vision and Image Understanding 98(1):155–181
  47. Hansen JP, Tørning K, Johansen AS, Itoh K, Aoki H (2004) Gaze typing compared with input by head and hand. In: ETRA ’04: Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, ACM, New York, NY, USA, pp 131–138
  48. Harbusch K, Kühn M (2003) Towards an adaptive communication aid with text input from ambiguous keyboards. In: 11th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2003)
  49. HeadWay (2008) Infrared head-mounted mouse alternative, Penny & Giles, Don Johnston, Inc. http://www.synapseadaptive.com/donjohnston/pengild.htm
  50. Hutchinson TE, White KP Jr, Martin WN, Reichert KC, Frey LA (1989) Human-computer interaction using eye-gaze input. IEEE Transactions on Systems, Man and Cybernetics 19(6):1527–1533
  51. Hwang F, Keates S, Langdon P, Clarkson J (2004) Mouse movements of motion-impaired users: a submovement analysis. In: Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility (Assets ’04), pp 102–109
  52. International Myotonic Dystrophy Organization (2008) http://www.myotonicdystrophy.org
  53. Ivins JP, Porril J (1998) A deformable model of the human iris for measuring small three-dimensional eye movements. Machine Vision and Applications 11(1):42–51
  54. Jaimes A, Sebe N (2005) Multimodal human computer interaction: A survey. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. LNCS, Vol. 3766, Springer-Verlag, Berlin Heidelberg, pp 1–15
  55. Ji Q, Zhu Z (2004) Eye and gaze tracking for interactive graphic display. Machine Vision and Applications 15(3):139–148
  56. Kaplan (2008) World Institute on Disability. http://www.accessiblesociety.org
  57. Kapoor A, Picard RW (2002) Real-time, fully automatic upper facial feature tracking. In: Proceedings of the Fifth IEEE International Conference on Automatic Face Gesture Recognition, Washington, D.C., pp 10–15
  58. Kawato S, Tetsutani N (2004) Detection and tracking of eyes for gaze-camera control. Image and Vision Computing 22(12):1031–1038
  59. Kim KN, Ramakrishna RS (1999) Vision-based eye-gaze tracking for human computer interface. In: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Vol. 2, Tokyo, Japan, pp 324–329
  60. Kim WB, Kwan C, Fedyuk I, Betke M (2008) Camera Canvas: Image editor for people with severe disabilities. Tech. Rep. 2008-010, Computer Science Department, Boston University. http://www.cs.bu.edu/techreports/pdf/2008-010-camera-canvas.pdf
  61. Kollios G, Sclaroff S, Betke M (2001) Motion mining: Discovering spatio-temporal patterns in databases of human motion. In: Proceedings of the 2001 ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery (DMKD 2001), Santa Barbara, CA, pp 25–32
  62. Kristensson PO, Zhai S (2004) SHARK2: a large vocabulary shorthand writing system for pen-based computers. In: Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST ’04), ACM, New York, NY, USA, pp 43–52
  63. LaCascia M, Sclaroff S, Athitsos V (2000) Fast, reliable head tracking under varying illumination: An approach based on robust registration of texture-mapped 3D models. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(4):322–336
  64. Lankenau A, Röfer T (2000) Smart wheelchairs – state of the art in an emerging market. Künstliche Intelligenz, Schwerpunkt Autonome Mobile Systeme 4:37–39
  65. LC Technologies Eyegaze System (2008) http://www.lctinc.com
  66. Lombardi J, Betke M (2002) A camera-based eyebrow tracker for hands-free computer control via a binary switch. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 199–200
  67. Long AC, Landay JA, Rowe LA, Michiels J (2000) Visual similarity of pen gestures. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’00), ACM, New York, NY, USA, pp 360–367
  68. MacKenzie IS, Zhang SX (1997) The immediate usability of Graffiti. In: Proceedings of the Conference on Graphics Interface ’97, Canadian Information Processing Society, Toronto, Ontario, Canada, pp 129–137
  69. MacKenzie IS, Zhang X (2008) Eye typing using word and letter prediction and a fixation algorithm. In: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA ’08), ACM, New York, NY, USA, pp 55–58
  70. Magee JJ, Betke M, Gips J, Scott MR, Waber BN (2008) A human-computer interface using symmetry between eyes to detect gaze direction. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans 38(6):1–15
  71. Meyer A (1995) Pen computing: a technology overview and a vision. ACM SIGCHI Bulletin 27(3):46–90
  72. Morimoto CH, Koons D, Amir A, Flickner M (2000) Pupil detection and tracking using multiple light sources. Image and Vision Computing 18(4):331–335
  73. Myers BA, Wobbrock JO, Yang S, Yeung B, Nichols J, Miller R (2002) Using handhelds to help people with motor impairments. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 89–96
  74. Myers GA, Sherman KR, Stark L (1991) Eye monitor: microcomputer-based instrument uses an internal model to track the eye. Computer 24(3):14–21
  75. National Multiple Sclerosis Society (2008) http://www.nationalmssociety.org
  76. Nonaka H (2003) Communication interface with eye-gaze and head gesture using successive DP matching and fuzzy inference. Journal of Intelligent Information Systems 21(2):105–112
  77. NSF HCC (2008) Human-Centered Computing, Division of Information & Intelligent Systems, National Science Foundation. http://www.nsf.gov
  78. Pacchetti C, Mancini F, Aglieri R, Fundaro C, Martignoni E, Nappi G (2000) Active music therapy in Parkinson’s disease: an integrative method for motor and emotional rehabilitation. Psychosomatic Medicine 62(3):386–393
  79. Paquette M (2005) IWeb Explorer, project report, Computer Science Department, Boston University
  80. Park KR (2007) A real-time gaze position estimation method based on a 3-D eye model. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 37(1):199–212
  81. Paul S, Ramsey D (2000) Music therapy in physical medicine and rehabilitation. Australian Occupational Therapy Journal 47:111–118
  82. Ponweiser W, Vincze M (2007) Task and context aware performance evaluation of computer vision algorithms. In: International Conference on Computer Vision Systems: Vision Systems in the Real World: Adaptation, Learning, Evaluation (ICVS 2007), Bielefeld, Germany
  83. Poulson D, Nicolle C (2004) Making the internet accessible for people with cognitive and communication impairments. Universal Access in the Information Society 3(1):48–56
  84. Schnabel J (2007) Director of the film “The Diving Bell and the Butterfly”. France: Pathé Renn Productions
  85. Schwerdt K, Crowley JL (2000) Robust face tracking using color. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France
  86. Sclaroff S, Betke M, Kollios G, Alon J, Athitsos V, Li R, Magee J, Tian T (2005) Tracking, analysis, recognition of human gestures in video. In: Proceedings of the 8th International Conference on Document Analysis and Recognition, Seoul, Korea, pp 806–810
  87. Shih SW, Liu J (2004) A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 34(1):234–245
  88. Shneiderman B (1971) Computer science education and social relevance. ACM SIGCSE Bulletin 3(1):21–24
  89. Shugrina M, Betke M, Collomosse J (2006) Empathic painting: Interactive stylization through observed emotional state. In: Proceedings of the 4th International Symposium on Non-Photorealistic Animation and Rendering (NPAR 2006), Annecy, France, 8 pp
  90. SIGACCESS (2008) ACM Special Interest Group on Accessible Computing. http://www.sigaccess.org
  91. Sirohey S, Rosenfeld A, Duric Z (2002) A method of detecting and tracking irises and eyelids in video. Pattern Recognition 35(5):1389–1401
  92. Sirovich L, Kirby M (1987) Low-dimensional procedure for the characterization of human faces. Journal of the Optical Society of America A 4(3):519–523
  93. Smart Nav Head Tracker (2008) Natural Point, Eye Control Technologies, Inc., Corvallis, OR, USA. http://www.naturalpoint.com/smartnav
  94. Soukoreff RW, MacKenzie IS (2004) Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal of Human-Computer Studies 61(6):751–789
  95. Stary C (2006) Special UAIS issue on user-centered interaction paradigms for universal access in the information society. Universal Access in the Information Society 4(3):175–176
  96. Steriadis CE, Constantinou P (2003) Designing human-computer interfaces for quadriplegic people. ACM Transactions on Computer-Human Interaction 10(2):87–118
  97. StrokeIt (2008) A mouse gesture recognition engine and command processor, software by Jeff Doozan. http://tcbmi.com/strokeit
  98. Takami O, Morimoto K, Ochiai T, Ishimatsu T (1995) Computer interface to use head and eyeball movement for handicapped people. In: IEEE International Conference on Systems, Man and Cybernetics: Intelligent Systems for the 21st Century, vol 2, pp 1119–1123
  99. Team Hoyt Website (2008) Racing towards inclusion. http://www.teamhoyt.com
  100. Tian Y, Kanade T, Cohn J (2000) Dual-state parametric eye tracking. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France, pp 110–115
  101. Tracker Pro (2008) Infrared-based head tracker, Madentec, Ltd., Edmonton, Alberta, Canada. http://www.madentec.com
  102. Tsugawa S, Aoki M, Hosaka A, Seki K (1994) Recent Japanese projects of AVCS-related systems. In: Proceedings of the Symposium on Intelligent Vehicles, pp 125–130
  103. Turk M (2005) RTV4HCI: A historical overview. In: Kisacanin B, Pavlovic V, Huang T (eds) Real-Time Vision for Human-Computer Interaction, Springer-Verlag
  104. ERCIM Working Group “User Interfaces for All”. http://www.ui4all.gr/index.html
  105. Vaidyanathan R, Chung B, Gupta L, Kook H, Kota S, West JD (2007) Tongue-movement communication and control concept for hands-free human-machine interfaces. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans 37(4):533–546
  106. Viewpoint (2008) Eye tracking system, Arrington Research. http://www.arringtonresearch.com
  107. Waber B, Magee JJ, Betke M (2006) Web mediators for accessible browsing. In: Stephanidis C, Pieper M (eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 447–466
  108. Waber BN, Magee JJ, Betke M (2005) Fast head tilt detection for human-computer interaction. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. LNCS, Vol. 3766, Springer-Verlag, pp 90–99
  109. Wang J, Athitsos V, Sclaroff S, Betke M (2008) Detecting objects of variable shape structure with hidden state shape models. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(3):477–492
  110. Wang JG, Sung E (2002) Study on eye gaze estimation. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 32(3):332–350
  111. Ward DJ, Blackwell AF, MacKay DJC (2000) Dasher – a data entry interface using continuous gestures and language models. In: Proceedings of UIST 2000: The 13th Annual ACM Symposium on User Interface Software and Technology. http://www.inference.phy.cam.ac.uk/dasher
  112. Wobbrock JO, Gajos KZ (2008) Goal crossing with mice and trackballs for people with motor impairments: Performance, submovements, and design directions. ACM Transactions on Accessible Computing 1(1):1–37
  113. Wobbrock JO, Cutrell E, Harada S, MacKenzie IS (2008) An error model for pointing based on Fitts’ law. In: Proceedings of the Twenty-sixth Annual SIGCHI Conference on Human Factors in Computing Systems (CHI ’08), pp 1613–1622
  114. Wu C, Aghajan H (2008) Context-aware gesture analysis for speaker HCI. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece, 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
  115. Wu TF, Chen MC (2007) Performance of different pointing devices on children with cerebral palsy. In: Stephanidis C (ed) Universal Access in Human-Computer Interaction: Applications and Services, Vol. 4556, Springer-Verlag, Berlin Heidelberg, pp 462–469
  116. Xie X, Sudhakar R, Zhuang H (1995) Real-time eye feature tracking from a video image sequence using Kalman filter. IEEE Transactions on Systems, Man, and Cybernetics 25(12):1568–1577
  117. Yanco HA, Gips J (1998) Driver performance using single switch scanning with a powered wheelchair: Robotic assisted control versus traditional control. In: Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Annual Conference (RESNA ’98), RESNA Press, pp 298–300
  118. Yang M, Kriegman D, Ahuja N (2002) Detecting faces in images: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence 24(1):34–58
  119. Yoo DH, Chung MJ (2004) Non-intrusive eye gaze estimation without knowledge of eye pose. In: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, pp 785–790
  120. Young L, Sheena D (1975) Survey of eye movement recording methods. Behavior Research Methods and Instrumentation 7(5):397–429
  121. ZAC Browser (2008) Web browser designed for autistic children. http://www.zacbrowser.com

Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  Department of Computer Science, Boston University, Boston, USA
