Abstract
This paper proposes an eye-gaze-based dashboard control interface for the automotive environment, so that drivers can control the dashboard simply by looking at it, without taking their hands off the steering wheel. With the help of our smoothing and target prediction technology, we found that first-time users could select an on-screen dashboard item using eye gaze in approximately 2.5 s under different road conditions. As part of the study, we also found that the average amplitude of saccadic intrusions is a good indicator of drivers’ perceived cognitive load.
1 Introduction
This paper reports a user trial exploring the possibility of a gaze-controlled interface for operating a dashboard in an automotive environment. In particular, we evaluated the effect of two different track conditions on drivers’ performance with an eye-gaze tracking interface. Previous work [3] has already compared an eye-gaze tracking interface with touch-screen control. We took that work forward with a low-cost eye-gaze tracker and an intelligent target prediction algorithm [2] that can reduce pointing time.
A second aim of the study was to compute and compare Saccadic Intrusions (SI [1]) in a gaze-controlled interface. A saccadic intrusion is a particular type of eye movement that has already been classified and related to mental workload [4]. SI is more robust than pupillometry-based measurement of cognitive load, as SIs are less sensitive to ambient light conditions than pupil dilation. Previous work has investigated SI in the automotive environment, but investigating SI parameters in a gaze-controlled interface is a new contribution. We examined whether SI parameters such as amplitude and duration can still be related to mental workload even when users were deliberately manipulating their eye gaze to operate different screen elements.
We describe the user study in the following sections.
2 Participants
We collected data from 12 participants (aged 19 to 29 years; 10 males, 2 females; one was pursuing a graduate degree, while all others were pursuing undergraduate degrees at the Indian Institute of Technology, Mandi). Of these 12 participants, 7 possessed a driving license (one had held the license for about 4 years, while the others had obtained theirs in the last 1 to 3 years), although the quality of the driving tests they had passed varied considerably. Of these 7 licensed participants, 4 drove a 4-wheeler and the rest drove 2-wheelers. According to self-reports, none of the drivers had driven a 2- or 4-wheeler in the mountains before (all reported having driven vehicles in the plains). However, all participants self-reported being expert users of the driving simulator and were used to driving cars in it.
3 Design
We designed the test to evaluate the effect of an eye-gaze-controlled secondary task on the primary task with participants of varying driving skill. The primary task involved driving a car in the left lane without veering off the lane. We used two different track conditions: a simple track consisting of only two turns and a complex track consisting of 20 turns. There was no other traffic on the road, and drivers were instructed to drive safely without veering off the driving lane while simultaneously operating the car dashboard using their eye gaze. The secondary task was initiated through an auditory cue. It mimicked a car dashboard (Fig. 1), and participants were instructed to press a button on it after hearing the cue. The auditory cue appeared at random intervals of 5 to 7 s, and the target button was chosen at random on the dashboard. Pointing was undertaken through the user’s eye gaze using an intelligent eye-gaze tracking algorithm [2], and selection was made through a hardware button on the steering wheel. The secondary task was presented at the centre of a laptop screen (Fig. 2).
The study was a 2 × 2 factorial design where the independent variables were

- Track Condition
  - Simple
  - Complex
- Presence of Secondary Task
  - Driving without Secondary Task
  - Driving with Secondary Task

The dependent variables were

- Task completion time
- Average deviation from centre of road
- Number of correct selections in the gaze-controlled interface
We also measured drivers’ cognitive load in terms of pulse rate using an Oximeter and NASA TLX scores.
4 Material
We used Logitech driving simulator hardware and the Torque car simulation software. The hardware was set up as an automatic-transmission car. We used a Tobii EyeX eye-gaze tracker and the EyeX SDK for the gaze-controlled interface. The primary task ran on a Linux desktop, while the secondary task was conducted on a Windows 8 laptop. The laptop screen measured 34.5 cm × 19.5 cm with a screen resolution of 1368 × 800.
5 Procedure
Initially, participants were briefed about the procedure and trained to use the driving simulator and the gaze-controlled interface. They then undertook the trials in a random order of track conditions. After completing each condition, they filled in the TLX sheet based on their toughest experience during the trial.
We used logging software that recorded the trajectory of the car with timestamps from the driving simulator, and the cursor and eye-gaze movements from the secondary task. We also recorded participants’ pulse rate from the Oximeter with timestamps.
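As a sketch, the logging described above might look like the following. The single-CSV layout and the field names are our assumptions for illustration; the paper does not specify the actual log format.

```python
import csv
import time

# Hypothetical record layout: one row per timestamped sample, with a "source"
# tag distinguishing car trajectory, cursor, gaze, and pulse-rate samples.
FIELDS = ["timestamp", "source", "x", "y"]

class TrialLogger:
    """Write timestamped samples from all devices into one CSV file."""

    def __init__(self, path):
        self._file = open(path, "w", newline="")
        self._writer = csv.DictWriter(self._file, fieldnames=FIELDS)
        self._writer.writeheader()

    def log(self, source, x, y, timestamp=None):
        # Default to wall-clock time so streams from different devices
        # can be aligned afterwards.
        self._writer.writerow({
            "timestamp": timestamp if timestamp is not None else time.time(),
            "source": source,  # e.g. "car", "cursor", "gaze", "pulse"
            "x": x,
            "y": y,
        })

    def close(self):
        self._file.close()
```

A logger like this keeps all streams in one file, so post-hoc alignment of gaze samples with car position reduces to sorting by timestamp.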
6 Results
We found a statistically significant correlation between the number of correct selections in the secondary task and the average velocity of the car (Fig. 3, ρ = –0.46, p < 0.05). Drivers made a significantly higher number [t(1, 21) = –2.2, p < 0.05] of correct selections using eye-gaze control while driving on the complex track than on the simple track (Fig. 4). In a repeated-measures ANOVA, we found
- a significant main effect of Track Condition on
  - Task completion time: F(1, 11) = 88.24, p < 0.01, η² = 0.89
  - Deviation from driving lane: F(1, 11) = 6.51, p < 0.05, η² = 0.37
  - TLX score: F(1, 11) = 14.58, p < 0.01, η² = 0.57
- a significant main effect of Presence of Secondary Task on
  - Task completion time: F(1, 11) = 22.07, p < 0.01, η² = 0.67
  - Deviation from driving lane: F(1, 11) = 13.69, p < 0.01, η² = 0.55
  - TLX score: F(1, 11) = 23.01, p < 0.01, η² = 0.68
The interaction effects were not significant for any variable at p < 0.05. It may be noted that the presence of the secondary task had a bigger effect on deviation from the driving lane and on TLX scores than the track condition did, while the track condition had a bigger effect on task completion time than the presence of the secondary task. The result indicates that users adjusted their driving speed to the road condition and drove more slowly on the complex track. Because they drove slowly, they could undertake more pointing and selection tasks on the complex track than on the simple track. However, when engaged in the secondary task, they tended to deviate from the driving lane more often than without it.
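The ρ symbol reported above suggests a rank correlation; a dependency-free sketch of computing Spearman’s ρ is shown below. The per-participant values fed to it would be the number of correct selections and the average car velocity; the data used in the test is purely illustrative.

```python
def rank(values):
    """Assign average 1-based ranks; tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman's rho = Pearson correlation of the rank-transformed data."""
    rx, ry = rank(xs), rank(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone increasing relation yields ρ = 1 and a decreasing one ρ = −1; the study’s ρ = −0.46 sits between, indicating a moderate negative association between selection count and velocity.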
We measured the time difference between the instant of an auditory cue and the selection of a target button in the gaze-controlled secondary-task interface. This time difference equals the pointing and selection time of the target button using eye gaze. Use of the intelligent eye-gaze tracking reduced the pointing and selection time to 2.5 s on average, even for novice users who had not used a gaze-controlled interface before (Fig. 5). The difference in selection times between the two track conditions was not significant at p < 0.05.
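The latency measurement described above can be sketched as follows, assuming a simple event log of (timestamp, kind) tuples; the "cue"/"select" tags are hypothetical names, not the study’s actual log format.

```python
def selection_times(events):
    """Return cue-to-selection latencies in seconds.

    events: iterable of (time_s, kind) tuples in chronological order,
    where kind is "cue" (auditory cue onset) or "select" (button press).
    """
    times = []
    cue_time = None
    for t, kind in events:
        if kind == "cue":
            cue_time = t
        elif kind == "select" and cue_time is not None:
            times.append(t - cue_time)
            cue_time = None  # at most one selection counted per cue
    return times
```

Averaging the returned latencies per participant gives the per-user pointing-and-selection time reported in Fig. 5.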
In summary, we concluded that

- track complexity and the presence of a dual task significantly increase cognitive load and task completion times;
- performance on the secondary task is significantly related to the velocity of the car: on the complex road, users drove more slowly and performed better on the secondary task than in the simple road condition;
- with the present state of the eye-gaze tracker, users needed approximately 2.5 s for pointing and selection.
7 Saccadic Intrusion (SI) Analysis
We developed an algorithm that analyses eye-gaze movements and detects an SI based on the following two criteria:

1. Eye gaze returned to the same position within a 60 to 870 ms interval.
2. The maximum deviation of eye gaze within that interval exceeded 0.4° along the X-axis.
Figure 6 shows an example of eye-gaze movement and corresponding saccadic intrusion. The pink line is the eye gaze movement and the blue line signifies SI.
We calculated the amplitude and duration of all SIs for all participants; Figs. 7 and 8 below show histograms of SIs for the simple and complex track conditions. It may be noted that the largest number of SIs had an amplitude between 0.4° and 0.6° and a duration of 80 to 100 ms. The number of SIs decreased beyond 360 ms of duration and 3.5° of visual angle. The result is similar to Abadi and Gowen’s study [1].
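The binning behind histograms like Figs. 7 and 8 can be sketched as follows; the 0.2° bin width matches the 0.4°–0.6° granularity quoted above, and the amplitude values in the test are illustrative.

```python
def histogram(values, bin_width):
    """Count values per bin; keys are integer bin indices.

    A value v falls in bin int(v / bin_width), i.e. the half-open
    interval [idx * bin_width, (idx + 1) * bin_width).
    """
    counts = {}
    for v in values:
        idx = int(v / bin_width)  # integer index avoids float-keyed bins
        counts[idx] = counts.get(idx, 0) + 1
    return counts
```

Applying this with a 0.2° width to SI amplitudes (or a 20 ms width to durations) reproduces the kind of distribution shown in the figures.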
We compared the number, average amplitude and average duration of SIs between simple and complex track conditions and matched them with users’ perceived cognitive load in terms of TLX scores (Table 1). It may be noted that there is a significant difference in a paired t-test between simple and complex conditions for the TLX scores and number of occurrences of SI.
We matched the TLX scores with the SI parameters. A match occurs when a participant rated (in TLX) one track condition higher than the other and the SI parameter was also higher for that condition. We found that for 10 out of 12 participants the number of SIs matched their TLX scores, while for 8 out of 12 participants the average SI amplitude matched their TLX scores.
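The match rule above can be sketched as a simple per-participant comparison; the dictionary field names and the sample data are illustrative assumptions.

```python
def count_matches(participants, si_key):
    """Count participants whose harder-rated (TLX) condition also shows
    the higher value of the given SI parameter.

    participants: list of dicts with per-condition readings, e.g.
      {"tlx": {"simple": ..., "complex": ...},
       "si_count": {"simple": ..., "complex": ...}}
    si_key: which SI parameter to compare, e.g. "si_count".
    """
    matches = 0
    for p in participants:
        tlx_harder = ("complex"
                      if p["tlx"]["complex"] > p["tlx"]["simple"]
                      else "simple")
        si_higher = ("complex"
                     if p[si_key]["complex"] > p[si_key]["simple"]
                     else "simple")
        if tlx_harder == si_higher:
            matches += 1
    return matches
```

Running this once per SI parameter yields counts of the form "10 out of 12" reported above.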
However, the maximum and average pulse rates (Table 2) were neither significantly different between the simple and complex road conditions nor did they match the TLX scores.
8 Conclusions
Researchers have already investigated eye-gaze-controlled interfaces in the automotive environment, for operating dashboard controls and even for driving the car itself. Our study further demonstrates that

- eye gaze can be used to operate controls inside the car;
- saccadic intrusions can be used to detect drivers’ mental workload simultaneously with a gaze-controlled interface.
Eye gaze is advantageous over existing touch-based car interfaces in that users need not take their hands off the steering wheel or the gearstick. On a complex track such as a mountainous road, drivers may find it particularly advantageous to keep their hands on the steering wheel. However, in the present study drivers took 2.5 s on average to make a selection on the car dashboard, which is slightly longer than the safe interval for which drivers are allowed to take their eyes off the road. It may be noted that the present study involved drivers who had never used a gaze-controlled interface before, and our previous studies demonstrated that, after two to three training sessions, users can undertake pointing and selection tasks in less than 2 s with a gaze-controlled interface.
Regarding cognitive load measurement, previous work on saccadic intrusions considered controlled tasks or free viewing. Our study considered the more realistic task of operating a car dashboard while driving. We found that even when drivers needed to manipulate their eye gaze to operate an interface, the number of saccadic intrusions and their mean amplitude were indicative of their mental workload in most cases. However, our study did not find any significant difference in pulse rate across the driving conditions.
References
Abadi, R.V., Gowen, E.: Characteristics of saccadic intrusions. Vis. Res. 44(23), 2675–2690 (2004)
Biswas, P., Langdon, P.: Multimodal intelligent eye-gaze tracking system. Int. J. Hum. Comput. Interact. 31(4), 277–294 (2015)
Poitschke, T., Laquai, F., Stamboliev, S., Rigoll, G.: Gaze-based interaction on multiple displays in an automotive environment. In: IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 543–548 (2011)
Tokuda, S., Obinata, G., Palmer, E., Chaparro, A.: Estimation of mental workload using saccadic eye movements in a free-viewing task. In: 23rd International Conference of the IEEE EMBS, pp. 4523–4529 (2011)
© 2015 Springer International Publishing Switzerland
Biswas, P., Dutt, V. (2015). Effect of Road Conditions on Gaze-Control Interface in an Automotive Environment. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Access to the Human Environment and Culture. UAHCI 2015. Lecture Notes in Computer Science(), vol 9178. Springer, Cham. https://doi.org/10.1007/978-3-319-20687-5_11