Abstract
We have developed an eye-gaze input system for people with severe physical disabilities. The system utilizes a personal computer and a home video camera to detect eye-gaze under natural light, and users can easily move the mouse cursor to any point on the screen to which they direct their gaze. In a preliminary experiment, we confirmed a large difference between the durations of voluntary (conscious) and involuntary (unconscious) blinks. On the basis of these results, we developed an eye-gaze input interface that uses the information obtained from voluntary blinks: users can confirm their input by performing voluntary blinks as substitutes for mouse clicks. In this paper, we describe the developed eye-gaze and blink information input interface and the results of the evaluations conducted.
1 Introduction
Eye-gaze input systems have recently been proposed as novel human-machine interfaces [1] by which users can input characters or commands to personal computers. We have developed an eye-gaze input system that utilizes a personal computer and a home video camera [2]. The system estimates the user’s gazing point with high accuracy, and allows the user to easily move the mouse cursor to the point at which they are gazing on the screen. The system can be used under natural light as well as artificial light sources such as fluorescent and LED lamps.
Our system incorporates a method for automatically detecting the feature parameters of eye blinks. Using this method, we confirmed that there is a large difference between the durations of voluntary and involuntary blinks, and that blink duration varies widely among individuals [3]. Methods for classifying voluntary blinks on the basis of duration have been proposed; however, they all use a fixed threshold [4, 5]. In contrast, our method is calibrated to each user's individual characteristics, which enhances its operability [3]. Using this method, we developed an eye-gaze input system that uses information from voluntary blinks; that is, users can confirm their input by performing voluntary blinks that serve as mouse clicks.
2 Input Interface Using Eye-Gaze and Blink Information
2.1 Eye-Gaze and Blink Detection via Image Analysis
An eye-gaze input interface needs to detect both the location of the user’s gaze and the user’s selection command. Selection can be performed with an eye blink. In our system, eye-gaze and blink are detected via image analysis [2].
The system incorporates a horizontal eye-gaze detection method in which the difference in the reflectance between the iris and the sclera is used to determine the horizontal eye-gaze (Fig. 1(a)). In other words, the horizontal eye-gaze is estimated using the difference between the integral values of the light intensity on areas A and B in Fig. 1(a). We define this differential value as the horizontal eye-gaze value. The corresponding vertical eye-gaze detection method (which is also incorporated in the system) is shown in Fig. 1(b). We estimate the vertical eye-gaze by using the integral value of the light intensity on area C in Fig. 1(b). We define this integral value as the vertical eye-gaze value. By calibrating the eye-gaze input system using the relations between these eye-gaze values and the angle of sight, we can estimate the horizontal and vertical eye-gaze of the user.
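As an illustration of these definitions, the following minimal sketch computes both eye-gaze values from a grayscale eye-region image. The area boundaries (`split_col`, `band_rows`) are hypothetical placeholders for the areas A, B, and C of Fig. 1; how the system actually locates these areas is not reproduced here.

```python
import numpy as np

def gaze_values(eye_img, split_col, band_rows):
    """Compute horizontal and vertical eye-gaze values from a
    grayscale eye-region image (2-D uint8 array).

    split_col and band_rows are illustrative assumptions standing
    in for the areas A, B, and C of Fig. 1.
    """
    img = eye_img.astype(np.float64)
    # Areas A and B: regions on either side of the dividing column.
    area_a = img[:, :split_col]
    area_b = img[:, split_col:]
    # Horizontal eye-gaze value: difference between the integral
    # values of the light intensity on areas A and B.
    horizontal_value = area_a.sum() - area_b.sum()
    # Area C: a horizontal band over the iris.
    top, bottom = band_rows
    area_c = img[top:bottom, :]
    # Vertical eye-gaze value: integral value of the light
    # intensity on area C.
    vertical_value = area_c.sum()
    return horizontal_value, vertical_value
```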
In our method, the user's head movement induces large measurement errors, so the system must compensate for such movement. We compensate for the user's head movement by tracking the location of the inner corner of the eye, which we estimate from the contour of the open-eye area (the eye shape). The wave pattern produced by an eye blink is obtained by counting the pixels of the open-eye area in every frame. From this wave pattern, we measure the feature parameters of each eye blink, such as its duration and maximum amplitude.
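The sketch below shows how such a blink wave pattern could be assembled frame by frame. The intensity threshold used to segment the open-eye area is an assumption for illustration only; the paper's actual segmentation procedure is not reproduced here.

```python
import numpy as np

def blink_wave(frames, eye_threshold=60):
    """Build a blink wave pattern: one open-eye-area sample
    (pixel count) per frame.

    frames is an iterable of grayscale eye-region images; the
    fixed intensity threshold separating the dark open-eye area
    from the surrounding skin is an illustrative assumption.
    """
    wave = []
    for frame in frames:
        open_eye_pixels = int((frame < eye_threshold).sum())
        wave.append(open_eye_pixels)
    return np.asarray(wave)
```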
2.2 Automatic Detection of Voluntary Blinks
Our interface detects and uses voluntary blinks [3]. It prepares to receive an input command when the user firmly closes his or her eyes; voluntary blinks are then detected using the duration of each eye blink as a feature parameter. Among the many possible measures of blink duration, we use the half-value width of the amplitude of an eye blink's wave pattern. Because there is a large difference between the durations of voluntary and involuntary blinks, voluntary blinks can be identified from their duration. The outline of an eye blink wave pattern and its feature parameters are shown in Fig. 2. In the figure, Fph indicates the half value of the maximum amplitude of the eye blink (Am), and Dh indicates the duration of the eye blink.
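A minimal sketch of this feature extraction follows, assuming the wave pattern is sampled once per frame: Am is the maximum amplitude, Fph its half value, and Dh is measured as the number of frames whose amplitude exceeds Fph. The per-user threshold rule is hypothetical; the paper states only that the threshold is calibrated from individual blink characteristics.

```python
import numpy as np

def blink_features(wave, baseline):
    """Extract feature parameters from one blink's wave pattern.

    wave holds the open-eye area per frame during the blink;
    baseline is the open-eye area with the eye fully open.
    """
    # Amplitude: drop of the open-eye area below the baseline.
    amplitude = baseline - np.asarray(wave, dtype=np.float64)
    am = amplitude.max()   # maximum amplitude Am
    fph = am / 2.0         # half value Fph
    # Duration Dh: the half-value width, i.e. the number of
    # frames whose amplitude is at least Fph.
    dh_frames = int((amplitude >= fph).sum())
    return am, dh_frames

def is_voluntary(dh_frames, threshold_frames):
    """Classify a blink as voluntary when its duration exceeds a
    per-user threshold (assumed here to be derived during
    calibration from one deliberate blink)."""
    return dh_frames > threshold_frames
```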
3 Evaluation Experiment
3.1 Overview of the Experiment
We evaluated our system in an experiment involving eight subjects. The experimental environment comprised a home video camera (Sony HDR-HC9) and a PC (OS: Microsoft Windows 7; CPU: Intel Core i7, 2.8 GHz). The PC analyzed the sequential eye images captured by the video camera. The system was calibrated for each subject prior to the evaluation experiment. The calibration indicators were displayed on the PC screen, as shown in Fig. 3(a), and the subjects gazed at each indicator while the calibration was in progress. Each subject was asked to gaze at the central indicator at the start of the calibration process, which the PC signaled with a beep. To calibrate the system for voluntary blink detection, the subject was asked to perform one voluntary blink. Following the calibration process, the subject selected the circle indicator (diameter: 4° of visual angle) on the PC screen by gazing at it, as shown in Fig. 3(b). After selecting the indicator, the subject confirmed the input with a voluntary blink. The location of the indicator was changed randomly ten times per experiment.
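For illustration, the gaze calibration step could be modeled as fitting a mapping from the measured eye-gaze values to the known angles of sight of the calibration indicators. The first-order least-squares fit below is an assumption; the paper does not specify the form of this mapping.

```python
import numpy as np

def fit_gaze_calibration(gaze_values, angles):
    """Fit a linear mapping from measured eye-gaze values to the
    known angles of sight of the calibration indicators.

    gaze_values and angles are equal-length 1-D sequences gathered
    while the subject gazes at each indicator; a first-order fit
    is an illustrative assumption.
    """
    slope, intercept = np.polyfit(gaze_values, angles, 1)
    return slope, intercept

def gaze_to_angle(value, slope, intercept):
    """Map a newly measured eye-gaze value to an estimated angle
    of sight using the fitted calibration."""
    return slope * value + intercept
```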
3.2 Evaluation of the Proposed Interface
Table 1 shows the results of the evaluation experiment conducted with the eight subjects. In Table 1, the indicator selection and input decision time is the total time taken by the subject to move the mouse cursor onto the circle indicator by eye-gaze and to confirm the input with a voluntary blink.
The results shown in Table 1 indicate that the average time for indicator selection and input decision was 4.77 s. The results also show that all subjects were able to operate our proposed system. The average time for indicator selection and input decision in this proposed system is twice that of the eye-gaze input system developed by Hansen et al. [1]. However, their eye-gaze input system is not operated via voluntary blinks. Instead, the system decides on an input action according to eye fixations (measuring how long the eye fixates on a target such as an indicator). By contrast, our proposed interface has the advantage that it can operate general Microsoft Windows software via eye-gaze and voluntary blinks.
The results in Table 1 also show that subjects C, E, F, and G took more than ten seconds to make one selection and input decision. This indicates that some subjects experienced difficulty selecting the indicator by eye-gaze. We believe that these subjects were unable to select the indicator stably because of gazing-point detection errors caused by involuntary eye movements.
4 Conclusion
We have developed a new input interface that enables users to move a mouse cursor via eye-gaze and confirm their input via voluntary blinks. The results of evaluation experiments conducted with eight subjects indicate that all subjects were able to operate our system. We also confirmed that the average time for indicator selection and input decision was 4.77 s. In the future, we plan to develop a more user-friendly input interface by improving the accuracy of eye-gaze detection.
References
Hansen, J.P., Torning, K., Johansen, A.S., Itoh, K., Aoki, H.: Gaze typing compared with input by head and hand. In: Proceedings of the Eye Tracking Research and Applications Symposium on Eye Tracking Research and Applications, San Antonio, Texas, USA, pp. 131–138 (2004)
Abe, K., Ohi, S., Ohyama, M.: Eye-gaze detection by image analysis under natural light. In: Jacko, J.A. (ed.) Human-Computer Interaction, Part II, HCII 2011. LNCS, vol. 6762, pp. 176–184. Springer, Heidelberg (2011)
Abe, K., Sato, H., Matsuno, S., Ohi, S., Ohyama, M.: Automatic classification of eye blink types using a frame-splitting method. In: Harris, D. (ed.) EPCE 2013, Part I. LNCS, vol. 8019, pp. 117–124. Springer, Heidelberg (2013)
Krolak, A., Strumillo, P.: Vision-based eye blink monitoring system for human-computer interfacing. In: Proceedings of the Human System Interaction Conference, Krakow, Poland, pp. 994–998 (2008)
MacKenzie, I.S., Ashitani, B.: BlinkWrite: efficient text entry using eye blinks. Univ. Access Inf. Soc. 10, 69–80 (2011)
Acknowledgment
This work was supported by JSPS KAKENHI Grant Number 24700598.