Abstract
EagleEyes is a system that allows the user to control the computer through electrodes placed on the head. For people without disabilities, it takes 15 to 30 minutes to learn to control the cursor well enough to spell out a message with an onscreen keyboard. We are currently working with two dozen children with profound disabilities to teach them to use EagleEyes to control computer software for entertainment, communication, and education. We have had some dramatic successes.
© 1998 Springer-Verlag Berlin Heidelberg
Gips, J. (1998). On Building Intelligence into EagleEyes. In: Mittal, V.O., Yanco, H.A., Aronis, J., Simpson, R. (eds) Assistive Technology and Artificial Intelligence. Lecture Notes in Computer Science, vol 1458. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0055969
Print ISBN: 978-3-540-64790-4
Online ISBN: 978-3-540-68678-1