Abstract
Severe motion impairments can result from non-progressive disorders, such as cerebral palsy, or from degenerative neurological diseases, such as Amyotrophic Lateral Sclerosis (ALS), Multiple Sclerosis (MS), or muscular dystrophy (MD). They can also result from traumatic brain injuries, for example after a traffic accident, or from brainstem strokes [9, 84]. Worldwide, these disorders affect millions of individuals of all races and ethnic backgrounds [4, 75, 52]. Because the onset of MS and ALS typically occurs in adulthood, affected people are usually computer literate. Intelligent interfaces can immensely improve their daily lives by allowing them to communicate and participate in the information society, for example, by browsing the web, posting messages, or emailing friends. However, people with advanced ALS, MS, or MD may reach a point at which they can no longer control a keyboard and mouse, and they also cannot rely on automated voice recognition because their speech has become slurred.
References
AbleNet Switches (2008) Roseville, MN, USA. http://www.ablenetinc.com
Akram W, Tiberii L, Betke M (2006) A customizable camera-based human computer interaction system allowing people with disabilities autonomous hands free navigation of multiple computing tasks. In: Stephanidis C, Pieper M (eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 28–42
Akram W, Tiberii L, Betke M (2008) Designing and evaluating video-based interfaces for users with motion impairments. Universal Access in the Information Society, in review
ALS Association (2008) http://www.alsa.org
Animate! (2008) Software for camera mouse users to create video animations of an anthropomorphic figure. http://csr.bu.edu/visiongraphics/CameraMouse/animate.html
Athitsos V, Wang J, Sclaroff S, Betke M (2006) Detecting instances of shape classes that exhibit variable structure. In: Computer Vision – ECCV 2006, 9th European Conference on Computer Vision, Graz, Austria, May 7-13, 2006, Proceedings, Part 1, LNCS, Vol. 3951, Springer Verlag, pp 121–134
Barea R, Boquete L, Mazo M, López E (2002) System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(4):209–218
Bates R, Istance H (2002) Zooming interfaces! Enhancing the performance of eye controlled pointing devices. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 119–126
Bauby JD (1997) The Diving Bell and the Butterfly. Vintage Books
Betke M (2008) Camera-based interfaces and assistive software for people with severe motion impairments. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
Betke M, Kawai J (1999) Gaze detection via self-organizing gray-scale units. In: Proceedings of the International Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, IEEE, Kerkyra, Greece, pp 70–76
Betke M, Mullally WJ, Magee J (2000) Active detection of eye scleras in real time. In: Proceedings of the IEEE Workshop on Human Modeling, Analysis and Synthesis, Hilton Head Island, SC
Betke M, Gips J, Fleming P (2002) The Camera Mouse: Visual tracking of body features to provide computer access for people with severe disabilities. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10(1):1–10
Betke M, Gusyatin O, Urinson M (2006) SymbolDesign: A user-centered method to design pen-based interfaces and extend the functionality of pointer input devices. Universal Access in the Information Society 4(3):223–236
Beymer D, Flickner M (2003) Eye gaze tracking using an active stereo head. In: Proceedings of the 2003 Conference on Computer Vision and Pattern Recognition (CVPR’03) – Volume II, Madison, Wisconsin, pp 451–458
Biswas P, Samanta D (2007) Designing computer interface for physically challenged persons. In: Proceedings of the International Information Technology Conference (ICIT 2007), Rourkela, India, pp 161–166
Boston College Campus School (2008) http://www.bc.edu/schools/lsoe/campsch
Boston Home (2008) The Boston Home is a specialized care residence for adults with advanced multiple sclerosis and other progressive neurological diseases. http://thebostonhome.org
Bouchard B, Roy P, Bouzouane A, Giroux S, Mihailidis A (2008) Towards an extension of the COACH task guidance system: Activity recognition of Alzheimer’s patients. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
Bowyer K, Phillips PJ (1998) Empirical Evaluation Techniques in Computer Vision. IEEE Computer Society Press, Los Alamitos, CA, USA
Brown C (1992) Assistive technology computers and persons with disabilities. Communications of the ACM 35(5):36–45
Camera Mouse (2008) A video-based mouse-replacement interface for people with severe motion impairments. http://www.cameramouse.org
Center of Communication Disorders (2008) Children’s Hospital, Boston, USA. http://www.childrenshospital.org/clinicalservices/Site2016/mainpage/S2016P0.html
Chau M, Betke M (2005) Real time eye tracking and blink detection with USB cameras. Tech. Rep. 2005-012, Computer Science Department, Boston University, http://www.cs.bu.edu/techreports/pdf/2005-012-blink-detection.pdf
Cloud RL, Betke M, Gips J (2002) Experiments with a camera-based human-computer interface system. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 103–110
Connor C, Yu E, Magee J, Cansizoglu E, Epstein S, Betke M (2009) Movement and recovery analysis of a mouse-replacement interface for users with severe disabilities. In: Proceedings of the 13th International Conference on Human-Computer Interaction (HCI International 2009), San Diego, CA, in press.
Crampton S, Betke M (2003) Counting fingers in real time: A webcam-based human-computer interface with game applications. In: Proceedings of the Conference on Universal Access in Human-Computer Interaction (UA-HCI), Crete, Greece, pp 1357–1361
Darrell T, Essa IA, Pentland A (1996) Task-specific gesture analysis in real time using interpolated views. IEEE Transactions on Pattern Analysis and Machine Intelligence 18(12):1236–1242
DiMattia P, Curran FX, Gips J (2001) An Eye Control Teaching Device for Students without Language Expressive Capacity – EagleEyes. The Edwin Mellen Press, see also http://www.bc.edu/eagleeyes
Don Johnston Switches (2008) Volo, IL, USA. http://www.donjohnston.com
Evans DG, Drew R, Blenkhorn P (2000) Controlling mouse pointer position using an infrared head-operated joystick. IEEE Transactions on Rehabilitation Engineering 8(1):107–117
Eye Tracking System, Applied Science Laboratories (2008) Bedford, MA, USA. http://www.a-s-l.com
Fagiani C, Betke M, Gips J (2002) Evaluation of tracking methods for human-computer interaction. In: IEEE Workshop on Applications in Computer Vision, Orlando, Florida, pp 121–126
Forrester Research (2003) The Aging of the US Population and Its Impact on Computer Use. A Research Report commissioned by Microsoft Corporation. http://www.microsoft.com/enable/research/computerusers.aspx
Frankish C, Hull R, Morgan P (1995) Recognition accuracy and user acceptance of pen interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’95), pp 503–510
Freeman WT, Beardsley PA, Kage H, Tanaka K, Kyuman C, Weissman C (2000) Computer vision for computer interaction. SIGGRAPH Computer Graphics 33(4):65–68
Gandy M, Starner T, Auxier J, Ashbrook D (2000) The Gesture Pendant: A self-illuminating, wearable, infrared computer vision system for home automation control and medical monitoring. In: Fourth International Symposium on Wearable Computers (ISWC’00), p 87
Gips J, Gips J (2000) A computer program based on Rick Hoyt’s spelling method for people with profound special needs. In: Proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP), Karlsruhe, Germany, pp 245–250
Gips J, Betke M, DiMattia PA (2001) Early experiences using visual tracking for computer access by people with profound physical disabilities. In: Stephanidis C (ed) Universal Access in HCI: Towards an Information Society for All, Volume 3, Proceedings of the 1st International Conference on Universal Access in Human-Computer Interaction (UA-HCI), Lawrence Erlbaum Associates, Mahwah, NJ, pp 914–918
Goldberg D, Richardson C (1993) Touch-typing with a stylus. In: Proceedings of the INTERCHI ’93 Conference on Human Factors in Computing Systems, IOS Press, Amsterdam, The Netherlands, pp 80–87
Gorman M, Lahav A, Saltzman E, Betke M (2007) A camera-based music making tool for physical rehabilitation. Computer Music Journal 31(2):39–53
Gorodnichy DO, Roth G (2004) Nouse ‘use your nose as a mouse’ perceptual vision technology for hands-free games and interfaces. Image and Vision Computing 22(12):931–942
Grauman K, Betke M, Gips J, Bradski GR (2001) Communication via eye blinks – detection and duration analysis in real time. In: Proceedings of the IEEE Computer Vision and Pattern Recognition Conference (CVPR), Kauai, Hawaii, vol 2, pp 1010–1017
Grauman K, Betke M, Lombardi J, Gips J, Bradski GR (2003) Communication via eye blinks and eyebrow raises: Video-based human-computer interfaces. International Journal Universal Access in the Information Society 2(4):359–373
Guan H, Chang J, Chen L, Feris R, Turk M (2006) Multi-view appearance-based 3D hand pose estimation. In: IEEE Workshop on Vision for Human Computer Interaction, New York, NY, pp 1–6
Hansen DW, Pece AEC (2005) Eye tracking in the wild. Computer Vision and Image Understanding 98(1):155–181
Hansen JP, Tørning K, Johansen AS, Itoh K, Aoki H (2004) Gaze typing compared with input by head and hand. In: ETRA ’04: Proceedings of the 2004 Symposium on Eye Tracking Research & Applications, ACM, New York, NY, USA, pp 131–138
Harbusch K, Kühn M (2003) Towards an adaptive communication aid with text input from ambiguous keyboards. In: 11th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2003)
HeadWay (2008) Infrared head-mounted mouse alternative, Penny & Giles, Don Johnston, Inc. http://www.synapseadaptive.com/donjohnston/pengild.htm
Hutchinson TE, White KP Jr, Martin WN, Reichert KC, Frey LA (1989) Human-computer interaction using eye-gaze input. IEEE Transactions on Systems, Man and Cybernetics 19(6):1527–1533
Hwang F, Keates S, Langdon P, Clarkson J (2004) Mouse movements of motion-impaired users: a submovement analysis. In: Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility (Assets ’04), pp 102–109
International Myotonic Dystrophy Organization (2008) http://www.myotonicdystrophy.org
Ivins JP, Porril J (1998) A deformable model of the human iris for measuring small three-dimensional eye movements. Machine Vision and Applications 11(1):42–51
Jaimes A, Sebe N (2005) Multimodal human computer interaction: A survey. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. Lecture Notes in Computer Science, Volume 3766, Springer-Verlag, Berlin Heidelberg, pp 1–15
Ji Q, Zhu Z (2004) Eye and gaze tracking for interactive graphic display. Machine Vision and Applications 15(3):139–148
Kaplan (2008) World Institute on Disability. http://www.accessiblesociety.org
Kapoor A, Picard RW (2002) Real-time, fully automatic upper facial feature tracking. In: Proceedings of the Fifth IEEE International Conference on Automatic Face Gesture Recognition, Washington, D.C., pp 10–15
Kawato S, Tetsutani N (2004) Detection and tracking of eyes for gaze-camera control. Image and Vision Computing 22(12):1031–1038
Kim KN, Ramakrishna RS (1999) Vision-based eye-gaze tracking for human computer interface. In: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Vol. 2, Tokyo, Japan, pp 324–329
Kim WB, Kwan C, Fedyuk I, Betke M (2008) Camera canvas: Image editor for people with severe disabilities. Tech. Rep. 2008-010, Computer Science Department, Boston University, http://www.cs.bu.edu/techreports/pdf/2008-010-camera-canvas.pdf
Kollios G, Sclaroff S, Betke M (2001) Motion mining: Discovering spatio-temporal patterns in databases of human motion. In: Proceedings of the 2001 ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery (DMKD 2001), Santa Barbara, CA, pp 25–32
Kristensson PO, Zhai S (2004) SHARK2: a large vocabulary shorthand writing system for pen-based computers. In: Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology (UIST ’04), ACM, New York, NY, USA, pp 43–52
LaCascia M, Sclaroff S, Athitsos V (2000) Fast, reliable head tracking under varying illumination: An approach based on robust registration of texture-mapped 3D models. IEEE Transactions on Pattern Analysis and Machine Intelligence 22(4):322–336
Lankenau A, Röfer T (2000) Smart wheelchairs - state of the art in an emerging market. Künstliche Intelligenz, Schwerpunkt Autonome Mobile Systeme 4:37–39
LC Technologies Eyegaze System (2008) http://www.lctinc.com
Lombardi J, Betke M (2002) A camera-based eyebrow tracker for hands-free computer control via a binary switch. In: 7th ERCIM Workshop on User Interfaces for All, Paris, France, pp 199–200
Long AC, Landay JA, Rowe LA, Michiels J (2000) Visual similarity of pen gestures. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’00), ACM, New York, NY, USA, pp 360–367
MacKenzie IS, Zhang SX (1997) The immediate usability of graffiti. In: Proceedings of the Conference on Graphics Interface ’97, Canadian Information Processing Society, Toronto, Ont., Canada, Canada, pp 129–137
MacKenzie IS, Zhang X (2008) Eye typing using word and letter prediction and a fixation algorithm. In: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA ’08), ACM, New York, NY, USA, pp 55–58
Magee JJ, Betke M, Gips J, Scott MR, Waber BN (2008) A human-computer interface using symmetry between eyes to detect gaze direction. Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans 38(6):1–15
Meyer A (1995) Pen computing: a technology overview and a vision. ACM SIGCHI Bulletin 27(3):46–90
Morimoto CH, Koons D, Amir A, Flickner M (2000) Pupil detection and tracking using multiple light sources. Image and Vision Computing 18(4):331–335
Myers BA, Wobbrock JO, Yang S, Yeung B, Nichols J, Miller R (2002) Using handhelds to help people with motor impairments. In: Proceedings of the Fifth International ACM Conference on Assistive Technologies (Assets ’02), ACM, New York, NY, USA, pp 89–96
Myers GA, Sherman KR, Stark L (1991) Eye monitor: microcomputer-based instrument uses an internal mode to track the eye. Computer 24(3):14–21
National Multiple Sclerosis Society (2008) http://www.nationalmssociety.org
Nonaka H (2003) Communication interface with eye-gaze and head gesture using successive DP matching and fuzzy inference. Journal of Intelligent Information Systems 21(2):105–112
NSF HCC (2008) Human-Centered Computing, Division of Information & Intelligent Systems, National Science Foundation. http://www.nsf.gov
Pacchetti C, Mancini F, Aglieri R, Fundaro C, Martignoni E, Nappi G (2000) Active music therapy in Parkinson’s disease: an integrative method for motor and emotional rehabilitation. Psychosomatic Medicine 62(3):386–393
Paquette M (2005) IWeb Explorer, project report, Computer Science Department, Boston University
Park KR (2007) A real-time gaze position estimation method based on a 3-d eye model. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 37(1):199–212
Paul S, Ramsey D (2000) Music therapy in physical medicine and rehabilitation. Australian Occupational Therapy Journal 47:111–118
Ponweiser W, Vincze M (2007) Task and context aware performance evaluation of computer vision algorithms. In: International Conference on Computer Vision Systems: Vision Systems in the Real World: Adaptation, Learning, Evaluation, Bielefeld, Germany (ICVS 2007)
Poulson D, Nicolle C (2004) Making the internet accessible for people with cognitive and communication impairments. Universal Access in the Information Society 3(1):48–56
Schnabel J (2007) Director of the film “The Diving Bell and the Butterfly,” France: Pathé Renn Productions
Schwerdt K, Crowley JL (2000) Robust face tracking using color. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France
Sclaroff S, Betke M, Kollios G, Alon J, Athitsos V, Li R, Magee J, Tian T (2005) Tracking, analysis, recognition of human gestures in video. In: Proceedings of the 8th International Conference on Document Analysis and Recognition, Seoul, Korea, pp 806–810
Shih SW, Liu J (2004) A novel approach to 3-D gaze tracking using stereo cameras. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 34(1):234–245
Shneiderman B (1971) Computer science education and social relevance. ACM SIGCSE Bulletin 3(1):21–24
Shugrina M, Betke M, Collomosse J (2006) Empathic painting: Interactive stylization through observed emotional state. In: Proceedings of the 4th International Symposium on Non-Photorealistic Animation and Rendering (NPAR 2006), Annecy, France, 8 pp.
SIGACCESS (2008) ACM special interest group on accessible computing. http://www.sigaccess.org
Sirohey S, Rosenfeld A, Duric Z (2002) A method of detecting and tracking irises and eyelids in video. Pattern Recognition 35(5):1389–1401
Sirovich L, Kirby M (1987) Low-dimensional procedure for the characterization of human faces. Journal of the Optical Society of America A 4(3):519–523
Smart Nav Head Tracker (2008) Natural Point, Eye Control Technologies, Inc., Corvallis, OR, USA. http://www.naturalpoint.com/smartnav
Soukoreff RW, MacKenzie IS (2004) Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal on Human-Computer Studies 61(6):751–789
Stary C (2006) Special UAIS issue on user-centered interaction paradigms for universal access in the information society. Universal Access in the Information Society 4(3):175–176
Steriadis CE, Constantinou P (2003) Designing human-computer interfaces for quadriplegic people. ACM Transactions on Computer-Human Interaction 10(2):87–118
StrokeIt (2008) A mouse gesture recognition engine and command processor, software by Jeff Doozan. http://tcbmi.com/strokeit
Takami O, Morimoto K, Ochiai T, Ishimatsu T (1995) Computer interface to use head and eyeball movement for handicapped people. In: IEEE International Conference on Systems, Man and Cybernetics. Intelligent Systems for the 21st Century, vol 2, pp 1119–1123
Team Hoyt Website (2008) Racing towards inclusion. http://www.teamhoyt.com
Tian Y, Kanade T, Cohn J (2000) Dual-state parametric eye tracking. In: Proceedings of the Fourth IEEE International Conference on Automatic Face and Gesture Recognition, Grenoble, France, pp 110–115
Tracker Pro (2008) Infrared-based head tracker, Madentec, Ltd., Edmonton, Alberta, Canada. http://www.madentec.com
Tsugawa S, Aoki M, Hosaka A, Seki K (1994) Recent Japanese projects of AVCS-related systems. In: Proceedings of the Symposium on Intelligent Vehicles, pp 125–130
Turk M (2005) RTV4HCI: A historical overview. In: Kisacanin B, Pavlovic V, Huang T (eds) Real-Time Vision for Human-Computer Interaction, Springer-Verlag
ERCIM working group “User Interfaces for All”. http://www.ui4all.gr/index.html
Vaidyanathan R, Chung B, Gupta L, Kook H, Kota S, West JD (2007) Tongue-movement communication and control concept for hands-free human-machine interfaces. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans 37(4):533–546
Viewpoint (2008) Eye tracking system. Arrington Research. http://www.arringtonresearch.com
Waber B, Magee JJ, Betke M (2006) Web mediators for accessible browsing. In: Stephanidis C, Pieper M (eds) Universal Access in Ambient Intelligence Environments – 9th International ERCIM Workshop “User Interfaces For All” UI4ALL 2006, Königswinter, Germany, September 2006, Revised Papers. LNCS 4397, Springer-Verlag, pp 447–466
Waber BN, Magee JJ, Betke M (2005) Fast head tilt detection for human-computer interaction. In: Sebe N, Lew M, Huang T (eds) Computer Vision in Human-Computer Interaction, ICCV 2005 Workshop on HCI, Beijing, China, October 21, 2005, Proceedings. LNCS, Vol. 3766, Springer-Verlag, pp 90–99
Wang J, Athitsos V, Sclaroff S, Betke M (2008) Detecting objects of variable shape structure with hidden state shape models. IEEE Transactions on Pattern Analysis and Machine Intelligence 30(3):477–492
Wang JG, Sung E (2002) Study on eye gaze estimation. IEEE Transactions on Systems, Man, and Cybernetics – Part B: Cybernetics 32(3):332–350
Ward DJ, Blackwell AF, MacKay DJC (2000) Dasher - a data entry interface using continuous gestures and language models. In: Proceedings UIST 2000: The 13th Annual ACM Symposium on User Interface Software and Technology, http://www.inference.phy.cam.ac.uk/dasher
Wobbrock JO, Gajos KZ (2008) Goal crossing with mice and trackballs for people with motor impairments: Performance, submovements, and design directions. ACM Transactions on Accessible Computing 1(1):1–37
Wobbrock JO, Cutrell E, Harada S, MacKenzie IS (2008) An error model for pointing based on Fitts’ law. In: Proceeding of the Twenty-sixth Annual SIGCHI Conference on Human Factors in Computing Systems (CHI ’08), pp 1613–1622
Wu C, Aghajan H (2008) Context-aware gesture analysis for speaker HCI. In: Augusto J, Shapiro D, Aghajan H (eds) Proceedings of the 3rd Workshop on “Artificial Intelligence Techniques for Ambient Intelligence” (AITAmI’08), Patras, Greece. 21st-22nd of July 2008. Co-located event of ECAI 2008, Springer-Verlag
Wu TF, Chen MC (2007) Performance of different pointing devices on children with cerebral palsy. In: Stephanidis C (ed) Universal Access in Human-Computer Interaction: Applications and Services, Vol. 4556, Springer-Verlag, Berlin Heidelberg, pp 462–469
Xie X, Sudhakar R, Zhuang H (1995) Real-time eye feature tracking from a video image sequence using Kalman filter. IEEE Transactions on Systems, Man, and Cybernetics 25(12):1568–1577
Yanco HA, Gips J (1998) Driver performance using single switch scanning with a powered wheelchair: Robotic assisted control versus traditional control. In: Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Annual Conference (RESNA ’98), RESNA Press, pp 298–300
Yang M, Kriegman D, Ahuja N (2002) Detecting faces in images: A survey. IEEE Transactions on Pattern Analysis and Machine Intelligence 24(1):34–58
Yoo DH, Chung MJ (2004) Non-intrusive eye gaze estimation without knowledge of eye pose. In: Proceedings of the Sixth IEEE International Conference on Automatic Face and Gesture Recognition, Seoul, Korea, pp 785–790
Young L, Sheena D (1975) Survey of eye movement recording methods. Behavior Research Methods and Instrumentation 7(5):397–429
ZAC Browser (2008) Web browser designed for autistic children. http://www.zacbrowser.com
Copyright information
© 2010 Springer Science+Business Media, LLC
Cite this chapter
Betke, M. (2010). Intelligent Interfaces to Empower People with Disabilities. In: Nakashima, H., Aghajan, H., Augusto, J.C. (eds) Handbook of Ambient Intelligence and Smart Environments. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-93808-0_15
Print ISBN: 978-0-387-93807-3
Online ISBN: 978-0-387-93808-0