
Intelligent Service Robotics, Volume 11, Issue 3, pp 279–288

SoTCM: a scene-oriented task complexity metric for gaze-supported teleoperation tasks

  • Haitham El-Hussieny
  • Samy F. M. Assal
  • Jee-Hwan Ryu
Original Research Paper

Abstract

Recent developments in human–robot interaction (HRI) research have heightened the need to incorporate indirect human signals to implicitly facilitate intuitive human-guided interactions. Eye gaze is now widely used as an input interface in multi-modal teleoperation scenarios because of its advantage in revealing human intentions and forthcoming actions. However, to date there has been no discussion of how the structure of the environment the human is interacting with could affect the complexity of the teleoperation task. In this paper, a new metric named the “Scene-oriented Task Complexity Metric” (SoTCM) is proposed to estimate the complexity of a given scene involved in eye-gaze-supported teleoperation tasks. SoTCM objectively estimates the effort the human operator can be expected to exert, expressed as the expected time required to point at all the informative locations retrieved from the scene under discussion. SoTCM depends on both the density and the distribution of the informative locations in the scene, while incorporating the eye-movement behavior reported in the psychology literature. SoTCM is validated subjectively using the time-to-complete index together with the standard NASA-TLX workload measure across eight scenes of varying structure. The results confirm a significant relation between SoTCM and the measured task workload, which supports the applicability of SoTCM for predicting scene complexity, and hence task workload, in advance.
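For intuition only, the sketch below illustrates the flavour of such a metric; it is not the paper's published formulation. It scores a scene by summing a Fitts'-law estimate of pointing time (cf. Fitts 1954; MacKenzie 1992) over a greedy nearest-neighbour gaze tour of the informative locations, so the score grows with both the number of points and the distances between them. The constants a and b, the target width, the tour order, and all function names are illustrative assumptions.

    # Illustrative sketch only; the constants and the greedy tour order are
    # assumptions, not the paper's published SoTCM formulation.
    import numpy as np

    def fitts_time(distance, width, a=0.05, b=0.1):
        # Expected pointing time (s) for one gaze shift, Shannon form of
        # Fitts' law: MT = a + b * log2(D / W + 1).
        return a + b * np.log2(distance / width + 1.0)

    def scene_complexity(points, width=20.0):
        # Sum expected gaze-pointing times along a greedy nearest-neighbour
        # tour of the scene's informative locations (pixel coordinates).
        points = np.asarray(points, dtype=float)
        unvisited = list(range(1, len(points)))
        current, total = 0, 0.0
        while unvisited:
            dists = [np.linalg.norm(points[j] - points[current]) for j in unvisited]
            k = int(np.argmin(dists))
            total += fitts_time(dists[k], width)
            current = unvisited.pop(k)
        return total

    # Four informative locations: a sparse scene yields a lower score than
    # a dense, widely spread one.
    pts = [(100, 120), (400, 130), (420, 380), (90, 400)]
    print(f"expected total pointing time: {scene_complexity(pts):.3f} s")

The actual SoTCM additionally incorporates the eye-movement behavior reported in the psychology literature, which this sketch does not attempt to model.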

Keywords

Human–robot interaction · Teleoperation · Eye-gaze tracking · Task workload · Complexity metrics


Acknowledgements

This research was partially supported by the Civil-Military Technology Cooperation Program (15-CM-RB-09) and by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. NRF-2016R1E1A1A02921594). The first author is also supported by a post-doctoral fellowship from Korea University of Technology and Education (KOREATECH), which is gratefully acknowledged.

References

  1. Anderson RJ (1996) Autonomous, teleoperated, and shared control of robot systems. In: Proceedings of the 1996 IEEE international conference on robotics and automation, IEEE, vol 3, pp 2025–2032
  2. Aronson RM, Santini T, Kübler TC, Kasneci E, Srinivasa S, Admoni H (2018) Eye-hand behavior in human-robot shared manipulation. In: Proceedings of the 2018 ACM/IEEE international conference on human-robot interaction, ACM, pp 4–13
  3. Bonev B, Chuang L, Escolano F (2013) How do image complexity, task demands and looking biases influence human gaze behavior? Pattern Recogn Lett 34(7):723–730
  4. Bruce V, Green PR, Georgeson MA (2003) Visual perception: physiology, psychology, and ecology. Psychology Press, London
  5. Carpenter RH (1988) Movements of the eyes, 2nd edn. Pion Limited, London
  6. Castaldo R, Montesinos L, Wan TS, Serban A, Massaro S, Pecchia L (2017) Heart rate variability analysis and performance during a repeated mental workload task. In: EMBEC & NBC 2017, Springer, Berlin, pp 69–72
  7. Cheng M, Mitra NJ, Huang X, Torr PH, Hu S (2015) Global contrast based salient region detection. IEEE Trans Pattern Anal Mach Intell 37(3):569–582
  8. Dautenhahn K (2007) Methodology & themes of human-robot interaction: a growing research field. Int J Adv Robot Syst 4(1):15
  9. De Waard D (1996) The measurement of drivers' mental workload. Groningen University, Traffic Research Center, Groningen
  10. Donderi DC (2006) Visual complexity: a review. Psychol Bull 132(1):73
  11. Dragan AD, Srinivasa SS (2013) A policy-blending formalism for shared control. Int J Robot Res 32(7):790–805
  12. Drewes H (2010) Eye gaze tracking for human computer interaction. PhD thesis, Ludwig-Maximilians-Universität München (LMU)
  13. El-Hussieny H, Assal SF, Abouelsoud A, Megahed SM (2015) A novel intention prediction strategy for a shared control tele-manipulation system in unknown environments. In: 2015 IEEE international conference on mechatronics (ICM), IEEE, pp 204–209
  14. Fitts PM (1954) The information capacity of the human motor system in controlling the amplitude of movement. J Exp Psychol 47(6):381
  15. Freixenet J, Muñoz X, Raba D, Martí J, Cufí X (2002) Yet another survey on image segmentation: region and boundary information integration. In: Computer vision – ECCV 2002, Springer, Berlin, pp 408–422
  16. Fu KS, Mui J (1981) A survey on image segmentation. Pattern Recogn 13(1):3–16
  17. Gomes J, Marques F, Lourenço A, Mendonça R, Santana P, Barata J (2016) Gaze-directed telemetry in high latency wireless communications: the case of robot teleoperation. In: IECON 2016 – 42nd annual conference of the IEEE industrial electronics society, IEEE, pp 704–709
  18. Hansen JP, Alapetite A, MacKenzie IS, Møllenbach E (2014) The use of gaze to control drones. In: Proceedings of the symposium on eye tracking research and applications, ACM, pp 27–34
  19. Hart SG, Staveland LE (1988) Development of NASA-TLX (Task Load Index): results of empirical and theoretical research. Adv Psychol 52:139–183
  20. Hayhoe M, Ballard D (2005) Eye movements in natural behavior. Trends Cognit Sci 9(4):188–194
  21. Heyer C (2010) Human-robot interaction and future industrial robotics applications. In: 2010 IEEE/RSJ international conference on intelligent robots and systems (IROS), IEEE, pp 4749–4754
  22. Hou X, Zhang L (2007) Saliency detection: a spectral residual approach. In: 2007 IEEE conference on computer vision and pattern recognition (CVPR'07), IEEE, pp 1–8
  23. Jacob RJ (1995) Eye tracking in advanced interface design. In: Virtual environments and advanced interface design, Oxford University Press, New York, pp 258–288
  24. Jorna PG (1992) Spectral analysis of heart rate and psychological state: a review of its validity as a workload index. Biol Psychol 34(2):237–257
  25. Kim DJ, Hazlett-Knudsen R, Culver-Godfrey H, Rucks G, Cunningham T, Portee D, Bricout J, Wang Z, Behal A (2012) How autonomy impacts performance and satisfaction: results from a study with spinal cord injured subjects using an assistive robot. IEEE Trans Syst Man Cybern Part A Syst Hum 42(1):2–14
  26. Kramer J, Burrus N, Echtler F, Herrera CD, Parker M (2012) Object modeling and detection. In: Hacking the Kinect, Springer, Berlin, pp 173–206
  27. Latif HO, Sherkat N, Lotfi A (2009) Teleoperation through eye gaze (TeleGaze): a multimodal approach. In: 2009 IEEE international conference on robotics and biomimetics (ROBIO), IEEE, pp 711–716
  28. Liu P, Li Z (2011) Toward understanding the relationship between task complexity and task performance. In: Internationalization, design and global development, Springer, Berlin, pp 192–200
  29. MacKenzie IS (1992) Fitts' law as a research and design tool in human–computer interaction. Hum–Comput Interact 7(1):91–139
  30. Marquart G, Cabrall C, de Winter J (2015) Review of eye-related measures of drivers' mental workload. Procedia Manuf 3:2854–2861
  31. Niemeyer G, Preusche C, Hirzinger G (2008) Telerobotics. In: Springer handbook of robotics, Springer, Berlin, pp 741–757
  32. Rouse WB, Rouse SH (1979) Measures of complexity of fault diagnosis tasks. IEEE Trans Syst Man Cybern 9(11):720–727
  33. Rubio S, Díaz E, Martín J, Puente JM (2004) Evaluation of subjective mental workload: a comparison of SWAT, NASA-TLX, and workload profile methods. Appl Psychol 53(1):61–86
  34. Rusu RB, Cousins S (2011) 3D is here: Point Cloud Library (PCL). In: 2011 IEEE international conference on robotics and automation (ICRA), IEEE, pp 1–4
  35. Saeb S, Weber C, Triesch J (2011) Learning the optimal control of coordinated eye and head movements. PLoS Comput Biol 7(11):e1002253
  36. Schwab DP, Cummings L (1976) A theoretical analysis of the impact of task scope on employee performance. Acad Manag Rev 1(2):23–35
  37. Sibert LE, Jacob RJ (2000) Evaluation of eye gaze interaction. In: Proceedings of the SIGCHI conference on human factors in computing systems, ACM, New York, pp 281–288
  38. Vertegaal R (2008) A Fitts' law comparison of eye tracking and manual input in the selection of visual targets. In: Proceedings of the 10th international conference on multimodal interfaces, ACM, New York, pp 241–248
  39. Wolfe JM, Horowitz TS (2004) What attributes guide the deployment of visual attention and how do they do it? Nat Rev Neurosci 5(6):495–501
  40. Xiao L, Hung E (2007) An efficient distance calculation method for uncertain objects. In: 2007 IEEE symposium on computational intelligence and data mining (CIDM 2007), IEEE, pp 10–17
  41. You E, Hauser K (2012) Assisted teleoperation strategies for aggressively controlling a robot arm with 2D input. In: Robotics: science and systems, vol 7, p 354
  42. Zhang L, Tong MH, Marks TK, Shan H, Cottrell GW (2008) SUN: a Bayesian framework for saliency using natural statistics. J Vis 8(7):32
  43. Zhang X, MacKenzie IS (2007) Evaluating eye tracking with ISO 9241 part 9. In: Human–computer interaction. HCI intelligent multimodal interaction environments, Springer, Berlin, pp 779–788
  44. Zijlstra FRH (1993) Efficiency in work behaviour: a design approach for modern tools. Delft University of Technology, Delft

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

  1. School of Mechanical Engineering, Korea University of Technology and Education, Cheonan-City, South Korea
  2. Electrical Engineering Department, Faculty of Engineering (at Shoubra), Benha University, Benha, Egypt
  3. Mechatronics and Robotics Engineering Department, School of Innovative Design Engineering, Egypt-Japan University of Science and Technology (E-JUST), New Borg El Arab, Egypt
  4. Faculty of Engineering, Department of Production Engineering and Mechanical Design, Tanta University, Tanta, Egypt
