Gender, Age, Colour, Position and Stress: How They Influence Attention at Workplace?

  • Vidas Raudonis
  • Rytis Maskeliūnas
  • Karolis Stankevičius
  • Robertas Damaševičius
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10408)

Abstract

We explore the relationship between attention and action, focusing on human reaction to stress in a Supervisory Control and Data Acquisition (SCADA) based Human-Computer Interface (HCI) environment, with the aim of measuring reaction time and warning against attention deficit. To provoke a human reaction, we simulate several provocative situations mimicking real-world accidents on an industrial production line. During the simulated control of the industrial line, subjects are presented on screen with affective visual stimuli imitating a possible accident, and their reaction is tracked with a gaze tracker. We measure each subject's response time from stimulus onset to eye fixation (gaze time) and to the pressing of the "line stop" button (press time). The reaction-time patterns are analysed with respect to the subject's gender and age and the colour and position of the stop sign. The results confirm the significance of the gender, age, sign colour and sign position factors.
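The abstract does not give implementation details; as an illustrative sketch only, the two reaction-time measures (gaze time and press time) could be derived from logged event timestamps roughly as follows. All field names, the trial record layout, and the dwell-based fixation criterion are assumptions for illustration, not the authors' actual processing pipeline.

```python
# Illustrative sketch: computing gaze time and press time from logged timestamps.
# The data layout, field names and fixation criterion are assumed, not taken
# from the paper.
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Trial:
    stimulus_onset: float                      # time the affective stimulus appeared (s)
    gaze_samples: List[Tuple[float, float, float]]  # (timestamp, x, y) from the gaze tracker
    button_press: Optional[float]              # time the "line stop" button was pressed (s)

def gaze_time(trial: Trial, roi: Tuple[float, float, float, float],
              min_dwell: float = 0.1) -> Optional[float]:
    """Latency from stimulus onset to the first fixation inside the stimulus
    region of interest (x0, y0, x1, y1), using a simple dwell criterion."""
    x0, y0, x1, y1 = roi
    entry = None
    for t, x, y in trial.gaze_samples:
        if t < trial.stimulus_onset:
            continue
        if x0 <= x <= x1 and y0 <= y <= y1:
            if entry is None:
                entry = t                       # gaze entered the ROI
            elif t - entry >= min_dwell:        # dwelled long enough to count as a fixation
                return entry - trial.stimulus_onset
        else:
            entry = None                        # gaze left the ROI, reset the dwell timer
    return None                                 # no qualifying fixation in this trial

def press_time(trial: Trial) -> Optional[float]:
    """Latency from stimulus onset to the 'line stop' button press."""
    if trial.button_press is None:
        return None
    return trial.button_press - trial.stimulus_onset
```

Per-trial values computed this way could then be grouped by gender, age, sign colour and sign position for the kind of factor-wise significance analysis the abstract describes.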

Keywords

Attention focus · Stress · Gaze-tracking · Cognitive SCADA · HCI


Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Vidas Raudonis (1)
  • Rytis Maskeliūnas (2)
  • Karolis Stankevičius (1)
  • Robertas Damaševičius (3)
  1. Department of Automation, Faculty of Electrical and Electronics Engineering, Kaunas University of Technology, Kaunas, Lithuania
  2. Department of Multimedia Engineering, Faculty of Informatics, Kaunas University of Technology, Kaunas, Lithuania
  3. Department of Software Engineering, Faculty of Informatics, Kaunas University of Technology, Kaunas, Lithuania