Attention Sharing Handling Through Projection Capability Within Human–Robot Collaboration

Published in: International Journal of Social Robotics

Abstract

The link between situation awareness (SA) and the distribution of human attention has been explored within a human–robot collaboration framework. According to Endsley (1995), SA is divided into three levels: perception, comprehension, and projection. It is involved in the process of making decisions and carrying out actions in a dynamic environment. This work investigates three hypotheses: first, that the ability to project a robot's future actions improves performance in a collaborative task; second, that the more participants are involved in the tasks of a collaborative environment, the better their SA will be; and third, that a robot's non-verbal communication motions attract a participant's attention more promptly than a motionless robot does. A within-participants study was designed to investigate these hypotheses. Participants were asked to perform a collaborative task with a robot, which required them to assist the robot at different moments while they were engaged in a distracting task that captured their attention (the Tower of Hanoi puzzle). These moments could either be anticipated and taken into account in the human decision-making and action loop, or not. In addition, the robot could either use non-verbal communication gestures to draw human attention, or not. The results demonstrate the importance of taking into account the human capability to project a robot's next actions when managing one's own attention. Moreover, the subjective measures showed no difference in the assessment of SA, in contrast to the objective measures, which are in line with our second hypothesis. Finally, it appears that remaining stationary can itself act as a gesture of non-verbal communication: in the present work, the robot's waiting was more salient in capturing human attention when the robot remained motionless than when it made a signaling motion.


Data Availability

The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.

Notes

  1. https://pupil-labs.com/products/core/.

References

  1. Alberto NT, Skuric A, Joseph L, Padois V, Daney D (2023) Model predictive control for robots adapting their task space motion online

  2. Alonso V, De La Puente P (2018) System transparency in shared autonomy: a mini review. Front Neurorobot 12:83

  3. Bartlett CE, Cooke NJ (2015) Human-robot teaming in urban search and rescue. In: Proceedings of the human factors and ergonomics society annual meeting, SAGE Publications, Los Angeles, vol 59-1, pp 250–254

  4. Bracken B, Tobyne S, Winder A, Shamsi N, Endsley MR (2021) Can situation awareness be measured physiologically? In: Advances in neuroergonomics and cognitive engineering: proceedings of the AHFE 2021 virtual conferences on neuroergonomics and cognitive engineering, industrial cognitive ergonomics and engineering psychology, and cognitive computing and internet of things, Springer, pp 31–38

  5. Breazeal C, Kidd CD, Thomaz AL, Hoffman G, Berlin M (2005) Effects of nonverbal communication on efficiency and robustness in human-robot teamwork. In: 2005 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 708–713

  6. Camblor B, Benhabib N, Daney D, Padois V, Salotti J-M (2022a) Task-consistent signaling motions for improved understanding in human-robot interaction and workspace sharing. In: 2022 17th ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 275–283

  7. Camblor B, Salotti J-M, Fage C, Daney D (2022b) Degraded situation awareness in a robotic workspace: accident report analysis. Theor Issues Ergonom Sci 23(1):60–79

  8. Chen JY, Procci K, Boyce M, Wright J, Garcia A, Barnes M (2014) Situation awareness-based agent transparency. US Army Research Laboratory, pp 1–29

  9. Chen JYC, Lakhmani SG, Stowers K, Selkowitz AR, Wright JL, Barnes M (2018a) Situation awareness-based agent transparency and human-autonomy teaming effectiveness. Theor Issues Ergonom Sci 19(3):259–282

  10. Chen M, Nikolaidis S, Soh H, Hsu D, Srinivasa S (2018b) Planning with trust for human-robot collaboration. In: Proceedings of the 2018 ACM/IEEE international conference on human-robot interaction, pp 307–315

  11. Clodic A, Pacherie E, Alami R, Chatila R (2017) Key elements for human-robot joint action. In: Sociality and normativity for robots: philosophical inquiries into human-robot interactions, pp 159–177

  12. Cohen PR, Levesque HJ (1991) Teamwork. Nous 25(4):487–512

  13. Colgate JE, Wannasuphoprasit W, Peshkin MA (1996) Cobots: robots for collaboration with human operators. In: Proceedings of the 1996 ASME international mechanical engineering congress and exposition

  14. Crossman J, Marinier R, Olson EB (2012) A hands-off, multi-robot display for communicating situation awareness to operators. In: 2012 international conference on collaboration technologies and systems (CTS). IEEE, pp 109–116

  15. de Winter JCF, Eisma YB, Cabrall CDD, Hancock PA, Stanton NA (2019) Situation awareness based on eye movements in relation to the task environment. Cogn Technol Work 21(1):99–111

  16. Devin S, Alami R (2016) An implemented theory of mind to improve human-robot shared plans execution. In: 2016 11th ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 319–326

  17. Dini A, Murko C, Yahyanejad S, Augsdörfer U, Hofbaur M, Paletta L (2017) Measurement and prediction of situation awareness in human-robot interaction based on a framework of probabilistic attention. In: 2017 IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 4354–4361

  18. Endsley MR (1988) Situation awareness global assessment technique (SAGAT). In: Proceedings of the IEEE 1988 national aerospace and electronics conference. IEEE, pp 789–795

  19. Endsley MR (1995) Toward a theory of situation awareness in dynamic systems. Hum Factors 37(1):32–64

  20. Endsley MR (2016) Designing for situation awareness: an approach to user-centered design. CRC Press

  21. Endsley MR (2017) From here to autonomy: lessons learned from human-automation research. Hum Factors 59(1):5–27

  22. Endsley MR (2020) The divergence of objective and subjective situation awareness: a meta-analysis. J Cogn Eng Decis Mak 14(1):34–53

  23. Endsley MR, Selcon SJ, Hardiman TD, Croft DG (1998) A comparative analysis of SAGAT and SART for evaluations of situation awareness. In: Proceedings of the human factors and ergonomics society annual meeting, SAGE Publications, Los Angeles, vol 42-1, pp 82–86

  24. Ezenyilimba A, Wong M, Hehr A, Demir M, Wolff A, Chiou E, Cooke N (2023) Impact of transparency and explanations on trust and situation awareness in human-robot teams. J Cogn Eng Decis Mak 17(1):75–93

  25. Fournier É, Kilgus D, Landry A, Hmedan B, Pellier D, Fiorino H, Jeoffrion C (2022) The impacts of human-cobot collaboration on perceived cognitive load and usability during an industrial task: an exploratory experiment. IISE Trans Occup Ergonom Hum Factors 10(2):83–90

  26. Gateau T, Chanel CPC, Le M-H, Dehais F (2016) Considering human’s non-deterministic behavior and his availability state when designing a collaborative human-robots system. In: 2016 IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 4391–4397

  27. Gombolay M, Bair A, Huang C, Shah J (2017) Computational design of mixed-initiative human-robot teaming that considers human factors: situational awareness, workload, and workflow preferences. Int J Robot Res 36(5–7):597–617

  28. Haddadin S, Croft E (2016) Physical human–robot interaction. Springer handbook of robotics, pp 1835–1874

  29. Hart SG (2006) NASA-task load index (NASA-TLX); 20 years later. In: Proceedings of the human factors and ergonomics society annual meeting, SAGE Publications, Los Angeles, vol 50-9, pp 904–908

  30. Hasanzadeh S, Esmaeili B, Dodd MD (2018) Examining the relationship between construction workers' visual attention and situation awareness under fall and tripping hazard conditions: using mobile eye tracking. J Constr Eng Manag 144(7):04018060

  31. Hoffman G (2019) Evaluating fluency in human-robot collaboration. IEEE Trans Hum Mach Syst 49(3):209–218

  32. Hoffman G, Breazeal C (2007) Effects of anticipatory action on human-robot teamwork efficiency, fluency, and perception of team. In: Proceedings of the ACM/IEEE international conference on Human-robot interaction, pp 1–8

  33. Hoffman G, Ju W (2014) Designing robots with movement in mind. J Hum Robot Interact 3(1):91–122

  34. Hopko SK, Khurana R, Mehta RK, Pagilla PR (2021) Effect of cognitive fatigue, operator sex, and robot assistance on task performance metrics, workload, and situation awareness in human-robot collaboration. IEEE Robot Autom Lett 6(2):3049–3056

  35. Jones DG, Endsley MR (2004) Use of real-time probes for measuring situation awareness. Int J Aviat Psychol 14(4):343–367

  36. Kaber DB, Endsley MR (2004) The effects of level of automation and adaptive automation on human performance, situation awareness and workload in a dynamic control task. Theor Issues Ergonom Sci 5(2):113–153

  37. Knepper RA, Mavrogiannis CI, Proft J, Liang C (2017) Implicit communication in a joint action. In: Proceedings of the 2017 ACM/IEEE international conference on human-robot interaction, pp 283–292

  38. Koppenborg M, Nickel P, Naber B, Lungfiel A, Huelke M (2017) Effects of movement speed and predictability in human-robot collaboration. Hum Factors Ergonom Manuf Serv Ind 27(4):197–209

  39. Kulic D, Croft E (2005) Anxiety detection during human-robot interaction. In: 2005 IEEE/RSJ international conference on intelligent robots and systems. IEEE, pp 616–621

  40. Lasota PA, Fong T, Shah JA et al (2017) A survey of methods for safe human-robot interaction. Found Trends® Robot 5(4):261–349

  41. Lyons JB (2013) Being transparent about transparency: a model for human-robot interaction. In: 2013 AAAI spring symposium series

  42. McKerral A, Pammer K (2021) Identifying objective behavioural measures of expert driver situation awareness. Accid Anal Prev 163:106465

  43. Merchant S, Kwon Y, Schnell T, Etherington T, Vogl T (2001) Evaluation of synthetic vision information system (SVIS) displays based on pilot performance. In: 20th DASC. 20th digital avionics systems conference (Cat. No. 01CH37219). IEEE, vol 1, pp 2C1–1

  44. Messeri C, Masotti G, Zanchettin AM, Rocco P (2021) Human-robot collaboration: optimizing stress and productivity based on game theory. IEEE Robot Autom Lett 6(4):8061–8068

  45. Paletta L, Dini A, Murko C, Yahyanejad S, Schwarz M, Lodron G, Ladstätter S, Paar G, Velik R (2017) Towards real-time probabilistic evaluation of situation awareness from human gaze in human-robot interaction. In: Proceedings of the companion of the 2017 ACM/IEEE international conference on human-robot interaction, pp 247–248

  46. Peternel L, Tsagarakis N, Caldwell D, Ajoudani A (2016) Adaptation of robot physical behaviour to human fatigue in human-robot co-manipulation. In: 2016 IEEE-RAS 16th international conference on humanoid robots (humanoids). IEEE, pp 489–494

  47. Rosenholtz R (2016) Capabilities and limitations of peripheral vision. Annu Rev Vis Sci 2:437–457

  48. Roy S, Smith T, Coltin B, Williams T (2023) I need your help... or do I? Maintaining situation awareness through performative autonomy. In: Proceedings of the 2023 ACM/IEEE international conference on human-robot interaction, pp 122–131

  49. Salmon PM, Stanton NA, Walker GH, Jenkins D, Ladva D, Rafferty L, Young M (2009) Measuring situation awareness in complex systems: comparison of measures study. Int J Ind Ergonom 39(3):490–500

  50. Sauer V, Sauer A, Mertens A (2021) Zoomorphic gestures for communicating cobot states. IEEE Robot Autom Lett 6(2):2179–2185

  51. Simon HA (1975) The functional equivalence of problem solving skills. Cogn Psychol 7(2):268–288

  52. Skuric A, Padois V, Daney D (2021) On-line force capability evaluation based on efficient polytope vertex search. In: 2021 IEEE international conference on robotics and automation (ICRA). IEEE, pp 1700–1706

  53. Story M, Webb P, Fletcher SR, Tang G, Jaksic C, Carberry J (2022) Do speed and proximity affect human-robot collaboration with an industrial robot arm? Int J Soc Robot 14(4):1087–1102

  54. Tabrez A, Luebbers MB, Hayes B (2022) Descriptive and prescriptive visual guidance to improve shared situational awareness in human-robot teaming. In: Proceedings of the 21st international conference on autonomous agents and multiagent systems, pp 1256–1264

  55. Taylor RM (2017) Situational awareness rating technique (SART): the development of a tool for aircrew systems design. In: Situational awareness. Routledge, pp 111–128

  56. Terzioğlu Y, Mutlu B, Şahin E (2020) Designing social cues for collaborative robots: the role of gaze and breathing in human-robot collaboration. In: Proceedings of the 2020 ACM/IEEE international conference on human-robot interaction, pp 343–357

  57. van de Merwe K, Mallam S, Nazir S (2022) Agent transparency, situation awareness, mental workload, and operator performance: a systematic literature review. Hum Factors 00187208221077804

  58. Vasic M, Billard A (2013) Safety issues in human-robot interactions. In: 2013 IEEE international conference on robotics and automation. IEEE, pp 197–204

  59. Weiss A, Wortmeier A-K, Kubicek B (2021) Cobots in industry 4.0: a roadmap for future practice studies on human-robot collaboration. IEEE Trans Hum Mach Syst 51(4):335–345

  60. Zanchettin AM, Casalino A, Piroddi L, Rocco P (2018) Prediction of human activity patterns for human-robot collaborative assembly tasks. IEEE Trans Ind Inform 15(7):3934–3942

  61. Zhang T, Yang J, Liang N, Pitts BJ, Prakah-Asante KO, Curry R, Duerstock BS, Wachs JP, Yu D (2020) Physiological measurements of situation awareness: a systematic review. Hum Factors 0018720820969071

Acknowledgements

The authors would like to thank Axel Plantey–Veux for his help in carrying out the experimental runs.

Funding

The authors acknowledge the support of the French Agence Nationale de la Recherche (ANR) under reference ANR-20-CE10-0005 (PACBOT Project).

Author information

Corresponding author

Correspondence to Benjamin Camblor.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical Approval

This study has been approved by the Comité Opérationnel d’Évaluation des Risques Légaux et Éthiques (COERLE) Ethics Committee (Application ID: 32).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Camblor, B., Daney, D., Joseph, L. et al. Attention Sharing Handling Through Projection Capability Within Human–Robot Collaboration. Int J of Soc Robotics (2024). https://doi.org/10.1007/s12369-024-01101-9
