
Recognizing Visual Attention Allocation Patterns Using Gaze Behaviors

  • Cheng Wu
  • Feng Xie
  • Changsheng Yan
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 623)

Abstract

The Human Attention Allocation Strategy (HAAS) is closely related to operating performance when a person interacts with a machine through a human-machine interface. Gaze behaviors, which can be acquired with eye-tracking technology, can be used to observe attention allocation, but the performance-sensitive attention allocation strategy remains hard to measure from gaze cues alone. In this paper, we attempt to understand visual attention allocation behavior and to reveal, in a quantitative manner, the relationship between attention allocation strategy and interactive performance. Using a novel Multiple-Level Clustering approach, we present probabilistic analyses of the interactive performance of HAAS patterns on a simulation platform of a thermal-hydraulic process plant. We observe that these patterns are sensitive to interactive performance, and we conclude that our Multiple-Level Clustering approach can efficiently extract human attention allocation patterns and evaluate interactive performance from gaze movements.
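The abstract does not detail the Multiple-Level Clustering algorithm itself, but the general idea of clustering gaze data at more than one level can be illustrated. Below is a minimal two-level sketch, assuming (hypothetically) that level one groups fixation locations into areas of interest (AOIs) and level two groups operators by their dwell-time distribution over those AOIs; the synthetic data, the choice of k-means, and all variable names are illustrative assumptions, not the authors' method.

```python
# Illustrative two-level clustering of gaze data (a sketch only; the paper's
# actual Multiple-Level Clustering algorithm is not specified in the abstract).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic fixations: (x, y) screen coordinates and dwell times (ms)
# for 30 operators with 200 fixations each.
n_operators, n_fixations = 30, 200
xy = rng.uniform(0, 1, size=(n_operators, n_fixations, 2))
dwell = rng.exponential(250, size=(n_operators, n_fixations))

# Level 1: cluster all fixation locations into areas of interest (AOIs).
n_aois = 5
aoi_model = KMeans(n_clusters=n_aois, n_init=10, random_state=0)
aoi_labels = aoi_model.fit_predict(xy.reshape(-1, 2)).reshape(n_operators, n_fixations)

# Level 2: describe each operator by the share of dwell time per AOI,
# then cluster these profiles into attention-allocation patterns.
profiles = np.zeros((n_operators, n_aois))
for op in range(n_operators):
    for aoi in range(n_aois):
        profiles[op, aoi] = dwell[op][aoi_labels[op] == aoi].sum()
profiles /= profiles.sum(axis=1, keepdims=True)  # dwell-time proportions

pattern_model = KMeans(n_clusters=3, n_init=10, random_state=0)
patterns = pattern_model.fit_predict(profiles)
print("Attention-allocation pattern per operator:", patterns)
```

In a study like the one described, the resulting pattern labels could then be compared against operators' interactive-performance scores to test whether the extracted patterns are performance-sensitive.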

Keywords

Bioinformatic · Human Attention Allocation Strategy (HAAS) · Multiple-Level Clustering Algorithm · Human-Machine Interface


Copyright information

© Springer Science+Business Media Singapore 2016

Authors and Affiliations

  1. School of Urban Rail Transportation, Soochow University, Suzhou, China
