Language-free graphical signage improves human performance and reduces anxiety when working collaboratively with robots

Abstract

As robots become more ubiquitous and their capabilities expand, novice users will require intuitive instructional information related to their use. This is particularly important in the manufacturing sector, which is set to be transformed under Industry 4.0 by the deployment of collaborative robots in support of traditionally low-skilled, manual roles. In the first study of its kind, this paper reports how static graphical signage can improve performance and reduce anxiety in participants physically collaborating with a semi-autonomous robot. Three groups of 30 participants collaborated with a robot to perform a manufacturing-type process using graphical information that was relevant to the task, irrelevant, or absent. The results reveal that the group exposed to relevant signage was significantly more accurate in undertaking the task. Furthermore, their anxiety towards robots significantly decreased as a function of increasing accuracy. Finally, participants exposed to graphical signage showed positive emotional valence in response to successful trials. At a time when workers are concerned about the threat posed by robots to jobs, and with advances in technology requiring upskilling of the workforce, it is important to provide intuitive and supportive information to users. Whilst increasingly sophisticated technical solutions are being sought to improve communication and confidence in human-robot co-working, our findings demonstrate how simple signage can still be used as an effective tool to reduce user anxiety and increase task performance.

Notes

  1. A member of the research team enabled the robot’s safety switches whilst seated out of view of the participant, behind a screen. The researcher monitored the participant’s interaction with the robot via CCTV and could stop the robot at any time.

References

  1. Aykin NM, Aykin T (1991) Individual differences in human-computer interaction. Comput Indus Eng 20(3):373–379
  2. Bahar G, Masliah M, Wolff R, Park P (2007) Desktop reference for crash reduction factors. Tech. rep., U.S. Department of Transportation, Federal Highway Administration, Office of Safety
  3. Banks MR, Willoughby LM, Banks WA (2008) Animal-assisted therapy and loneliness in nursing homes: use of robotic versus living dogs. J Am Med Dir Assoc 9(3):173–177
  4. Bartneck C, Suzuki T, Kanda T, Nomura T (2007) The influence of people’s culture and prior experiences with AIBO on their attitude towards robots. AI Soc 21(1-2):217–230
  5. Ben-Bassat T, Shinar D (2006) Ergonomic guidelines for traffic sign design increase sign comprehension. Hum Factors 48(1):182–195
  6. Broadbent E, Stafford R, MacDonald B (2009) Acceptance of healthcare robots for the older population: review and future directions. Int J Soc Robot 1(4):319–330
  7. Cameron D, Aitken JM, Collins EC, Boorman L, Chua A, Fernando S, McAree O, Martinez-Hernandez U, Law J (2015) Framing factors: the importance of context and the individual in understanding trust in human-robot interaction. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), workshop on designing and evaluating social robots for public settings
  8. Chan AH, Ng AW (2010) Investigation of guessability of industrial safety signs: effects of prospective-user factors and cognitive sign features. Int J Ind Ergon 40(6):689–697
  9. Cole S (2006) Information and empowerment: the keys to achieving sustainable tourism. J Sustain Tourism 14(6):629–644
  10. Eimontaite I, Gwilt I, Cameron D, Aitken JM, Rolph J, Mokaram S, Law J (2016) Assessing graphical robot aids for interactive co-working. In: Advances in ergonomics of manufacturing: managing the enterprise of the future. Springer, pp 229–239
  11. European Factories of the Future Research Association et al (2013) Factories of the future: multi-annual roadmap for the contractual PPP under Horizon 2020. Publications Office of the European Union, Brussels
  12. Frixione M, Lombardi A (2015) Street signs and IKEA instruction sheets: pragmatics and pictorial communication. Rev Philos Psychol 6(1):133–149
  13. Galea ER, Xie H, Lawrence PJ et al (2014) Experimental and survey studies on the effectiveness of dynamic signage systems. Fire Safety Sci 11:1129–1143
  14. Gwilt I, Rolph J, Eimontaite I, Cameron D, Aitken J, Mokaram S, Law J (2018) Cobotics: developing a visual language for human-robotic collaborations. In: Proceedings of the Cumulus conference
  15. Hayes AF (2012) PROCESS: a versatile computational tool for observed variable mediation, moderation, and conditional process modeling. Tech. rep., University of Kansas, KS
  16. ISO 3864-1:2011 (2011) Graphical symbols - safety colours and safety signs - Part 1: design principles for safety signs and safety markings. Standard, International Organization for Standardization, Geneva, CH
  17. Kanda T, Hirano T, Eaton D, Ishiguro H (2004) Interactive robots as social partners and peer tutors for children: a field trial. Human-Comput Interact 19(1):61–84
  18. Kenworthy JB, Jones J (2009) The roles of group importance and anxiety in predicting depersonalized ingroup trust. Group Processes Intergroup Relat 12(2):227–239
  19. Khansari-Zadeh SM, Khatib O (2015) Learning potential functions from human demonstrations with encapsulated dynamic and compliant behaviors. Auton Robot, 1–25
  20. Lamont D, Kenyon S, Lyons G (2013) Dyslexia and mobility-related social exclusion: the role of travel information provision. J Transp Geogr 26:147–157
  21. Laughery KR (2006) Safety communications: warnings. Appl Ergonom 37(4):467–478
  22. Lautizi M, Laschinger HK, Ravazzolo S (2009) Workplace empowerment, job satisfaction and job stress among Italian mental health nurses: an exploratory study. J Nurs Manag 17(4):446–452
  23. Lewinski P, den Uyl TM, Butler C (2014) Automated facial coding: validation of basic emotions and FACS AUs in FaceReader. J Neurosci Psychol Econ 7(4):227
  24. MacDorman KF, Vasudevan SK, Ho CC (2009) Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI Soc 23(4):485–510
  25. Madhavan P, Phillips RR (2010) Effects of computer self-efficacy and system reliability on user interaction with decision support systems. Comput Hum Behav 26(2):199–204
  26. McAree O, Aitken JM, Veres SM (2016) A model based design framework for safety verification of a semi-autonomous inspection drone. In: 2016 UKACC 11th international conference on control (CONTROL), pp 1–6
  27. Metta G, Fitzpatrick P, Natale L (2006) YARP: yet another robot platform. Int J Adv Robot Syst 3(1):8
  28. Mills ME, Sullivan K (1999) The importance of information giving for patients newly diagnosed with cancer: a review of the literature. J Clin Nurs 8(6):631–642
  29. Mokaram S, Aitken JM, Martinez-Hernandez U, Eimontaite I, Cameron D, Rolph J, Gwilt I, McAree O, Law J (2017) A ROS-integrated API for the KUKA LBR iiwa collaborative robot. IFAC-PapersOnLine 50(1):15859–15864
  30. Moreno-Jiménez B, Rodríguez-Carvajal R, Garrosa Hernández E, Morante Benadero M et al (2008) Terminal versus non-terminal care in physician burnout: the role of decision-making processes and attitudes to death. Salud Mental 31(2):93–101
  31. Muir BM (1987) Trust between humans and machines, and the design of decision aids. Int J Man-Mach Stud 27(5–6):527–539
  32. Nicholson N, Soane E, Fenton-O’Creevy M, Willman P (2005) Personality and domain-specific risk taking. J Risk Res 8(2):157–176
  33. Nomura T, Suzuki T, Kanda T, Kato K (2006) Measurement of anxiety toward robots. In: The 15th IEEE international symposium on robot and human interactive communication (RO-MAN 2006). IEEE, pp 372–377
  34. Nomura T, Suzuki T, Kanda T, Kato K (2006) Measurement of negative attitudes toward robots. Interact Stud 7(3):437–454
  35. Nomura T, Shintani T, Fujii K, Hokabe K (2007) Experimental investigation of relationships between anxiety, negative attitudes, and allowable distance of robots. In: Proceedings of the 2nd IASTED international conference on human computer interaction. ACTA Press, Chamonix, pp 13–18
  36. Ozer EM, Bandura A (1990) Mechanisms governing empowerment effects: a self-efficacy analysis. J Person Soc Psychol 58(3):472
  37. Pawar VM, Law J, Maple C (2016) Manufacturing robotics: the next robotic industrial revolution. Tech. rep., UK Robotics and Autonomous Systems Network
  38. Pearson LC, Moomaw W (2005) The relationship between teacher autonomy and stress, work satisfaction, empowerment, and professionalism. Educ Res Quart 29(1):37
  39. Quigley M, Conley K, Gerkey B, Faust J, Foote T, Leibs J, Wheeler R, Ng AY (2009) ROS: an open-source robot operating system. In: ICRA workshop on open source software, p 5
  40. SPARC, the Partnership for Robotics in Europe (2015) Robotics 2020 multi-annual roadmap for robotics in Europe. Horizon 2020 call ICT-2016 (ICT-25 & ICT-26), white paper, release B 03/12/2015, rev A. Tech. rep., EU Commission
  41. Stafford R, Broadbent E, Jayawardena C, Unger U, Kuo IH, Igic A, Wong R, Kerse N, Watson C, MacDonald BA (2010) Improved robot attitudes and emotions at a retirement home after meeting a robot. In: 2010 IEEE RO-MAN. IEEE, pp 82–87
  42. Tang CH, Wu WT, Lin CY (2009) Using virtual reality to determine how emergency signs facilitate way-finding. Appl Ergonom 40(4):722–730
  43. Thorvald P, Lindblom J (2014) Initial development of a cognitive load assessment tool. In: The 5th AHFE international conference on applied human factors and ergonomics. AHFE, pp 223–232
  44. Torkzadeh G, Koufteros X, Pflughoeft K (2003) Confirmatory analysis of computer self-efficacy. Struct Equ Model 10(2):263–275
  45. Tufte ER (1983) The visual display of quantitative information. Graphics Press, Connecticut
  46. Urry HL, Gross JJ (2010) Emotion regulation in older age. Curr Dir Psychol Sci 19(6):352–357
  47. Ussher J, Kirsten L, Butow P, Sandoval M (2006) What do cancer support groups provide which other supportive relationships do not? The experience of peer support groups for people with cancer. Soc Sci Med 62(10):2565–2576
  48. Vilar E, Rebelo F, Noriega P (2014) Indoor human wayfinding performance using vertical and horizontal signage in virtual reality. Human Factors Ergonom Manuf Serv Indus 24(6):601–615
  49. Virga S, Zettinig O, Esposito M, Pfister K, Frisch B, Neff T, Navab N, Hennersperger C (2016) Automatic force-compliant robotic ultrasound screening of abdominal aortic aneurysms. In: IEEE international conference on intelligent robots and systems (IROS)
  50. Wada K, Shibata T, Saito T, Tanie K (2002) Analysis of factors that bring mental effects to elderly people in robot assisted activity. In: 2002 IEEE/RSJ international conference on intelligent robots and systems, vol 2. IEEE, pp 1152–1157
  51. Wilcox R, Nikolaidis S, Shah J (2012) Optimization of temporal dynamics for adaptive human-robot interaction in assembly manufacturing. Robot Sci Syst VIII:441–448
  52. Wilkes L, White K, O’Riordan L (2000) Empowerment through information: supporting rural families of oncology patients in palliative care. Aust J Rural Heal 8(1):41–46


Funding

The authors acknowledge support from the EPSRC Centre for Innovative Manufacturing in Intelligent Automation in undertaking this research work, under grant reference number EP/I033467/1.

Author information


Corresponding author

Correspondence to James Law.

Ethics declarations

The study was approved by the University of Sheffield Psychology Department ethics committee.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author contribution

All authors contributed equally to this work.

Electronic supplementary material

Below is the link to the electronic supplementary material.

(AVI 13.3 MB)

Appendices

Appendix A: robot control

The practical human-robot collaborative process used in this experiment was carried out using a KUKA LBR iiwa 7 R800 operating in both hand-guided and autonomous modes. For safety, the robot was operated in ‘T1’ mode (see Note 1), which limits the maximum Cartesian velocity at the end effector to 250 mm/s.

The position of the robot in Cartesian space can be defined by the tuple Position = {X, Y, Z, A, B, C}, where {X, Y, Z} represents the displacement along the X, Y, and Z axes respectively, and {A, B, C} represents the rotation about the X, Y, and Z axes. A series of X-Y tube locations, {Tubes}, was set up on the table at a height of Z_lower. An operating height, Z_raised, was defined that would allow the telescopic picking tool mounted on the robot to move freely above the tubes in the X-Y plane with a clearance of around 1 cm. A home position, Position_home, was set between the operator and the tube locations, at a height of Z_raised.
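These definitions can be captured in a small data structure. The following is a minimal Python sketch under the assumptions of this appendix; the names (Pose, TUBES, Z_LOWER, Z_RAISED, POSITION_HOME) and all coordinate values are illustrative placeholders, not the experiment’s calibration or the KUKA API.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Cartesian pose: displacements along X, Y, Z and rotations A, B, C about them."""
    x: float        # mm, displacement along X
    y: float        # mm, displacement along Y
    z: float        # mm, displacement along Z
    a: float = 0.0  # rad, rotation about X
    b: float = 0.0  # rad, rotation about Y
    c: float = 0.0  # rad, rotation about Z

# Heights (mm); values are placeholders, not the experiment's calibration
Z_LOWER = 0.0            # picking height at the tube openings
Z_RAISED = Z_LOWER + 10  # operating height, ~1 cm clearance above the tubes

# X-Y tube locations on the table (placeholder coordinates)
TUBES = [Pose(100, 200, Z_LOWER), Pose(160, 200, Z_LOWER), Pose(220, 200, Z_LOWER)]

# Home position between the operator and the tubes, at the raised height
POSITION_HOME = Pose(0, 100, Z_RAISED)
```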

In hand-guided mode, triggered by applying a 0.2-N force to the wrist of the robot, the operator was able to move the end effector in the X-Y plane, but was prevented from moving rotationally about each axis and vertically along Z by restricting the robot’s compliance settings. The compliance settings for the hand-guided (Compliance_manual) and autonomous (Compliance_auto) modes are shown in Table 7.

Table 7 Robot parameters in hand-guided and autonomous modes
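The pattern behind the two profiles can be sketched as follows. Since Table 7’s values are not reproduced here, the stiffness numbers below are placeholders chosen only to show the idea (near-zero stiffness in X-Y for hand-guiding, high stiffness everywhere else); on the real robot these would be set through the KUKA impedance-control interface.

```python
# Sketch of the two compliance profiles described above. Stiffness values
# are illustrative placeholders, not the Table 7 parameters.

COMPLIANCE_MANUAL = {                    # hand-guided: free in X-Y, locked otherwise
    "x": 0.0, "y": 0.0,                  # N/m, near-zero: operator moves freely in X-Y
    "z": 5000.0,                         # N/m, high: no vertical movement
    "a": 300.0, "b": 300.0, "c": 300.0,  # Nm/rad, high: no rotation about any axis
}

COMPLIANCE_AUTO = {                      # autonomous: stiff in all six axes
    "x": 5000.0, "y": 5000.0, "z": 5000.0,
    "a": 300.0, "b": 300.0, "c": 300.0,
}

WRIST_FORCE_THRESHOLD_N = 0.2            # wrist force that triggers hand-guided mode
```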

Control algorithm

The robot control algorithm comprises repeated loops of hand-guided operation followed by autonomous-mode operation. These begin with the robot in the home position, waiting for the operator to enable hand-guiding mode by applying the necessary >0.2-N force.

Once in hand-guiding mode, the operator is free to move the end effector within the X-Y plane above the tube locations. When the operator releases the robot (i.e. no X-Y forces are applied), a 3-s timer is started. If an X-Y force is reapplied, the timer is reset, so the robot remains in hand-guided mode until the forces are removed again and the timer expires. Once the timer expires with no X-Y forces applied, the robot switches to autonomous mode.

Once in autonomous mode, the robot makes a refining move (in the X-Y plane) to the nearest known tube location. Once in position, the robot then makes two vertical movements: firstly to Z = Z_lower, which places the magnetic probe in contact with potential objects for picking, then back to Z = Z_raised, which retrieves picked objects from the tubes. Finally, the robot moves back to the home position for the operator to retrieve any picked objects, and waits for initiation of the next hand-guided sequence. All autonomous end effector motions are linear in the X-Y or Z directions, with joint accelerations governed by the KUKA control software.

Pseudo-code for the process described above is given in Algorithm 1.

Algorithm 1 Pseudo-code for the hand-guided/autonomous control loop (figure in original publication)
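As the original Algorithm 1 appears as a figure, the following Python sketch reconstructs the loop from the description above, reusing the Pose and compliance definitions from the earlier sketches. The robot interface methods (move_to, wait_for_wrist_force, xy_force_applied, set_compliance, move_to_height, pose) are hypothetical stand-ins, not the authors’ implementation or the KUKA API.

```python
import math
import time

def nearest_tube(pose, tubes):
    """Tube location closest to the current end-effector position in the X-Y plane."""
    return min(tubes, key=lambda t: math.hypot(t.x - pose.x, t.y - pose.y))

def control_loop(robot, tubes, home, z_lower, z_raised, timeout_s=3.0):
    """Repeated hand-guided/autonomous cycles, as described in Appendix A."""
    while True:
        robot.set_compliance(COMPLIANCE_AUTO)
        robot.move_to(home)                          # wait at home for the operator
        robot.wait_for_wrist_force(threshold_n=0.2)  # >0.2 N enables hand-guiding
        robot.set_compliance(COMPLIANCE_MANUAL)      # compliant in X-Y only

        # Remain hand-guided until no X-Y force has been applied for timeout_s
        released_at = None
        while True:
            if robot.xy_force_applied():
                released_at = None                   # force reapplied: reset the timer
            elif released_at is None:
                released_at = time.monotonic()       # release detected: start the timer
            elif time.monotonic() - released_at >= timeout_s:
                break                                # timer expired: go autonomous
            time.sleep(0.01)

        robot.set_compliance(COMPLIANCE_AUTO)
        target = nearest_tube(robot.pose(), tubes)
        robot.move_to(target)           # refining X-Y move to the nearest tube
        robot.move_to_height(z_lower)   # lower the magnetic probe onto any object
        robot.move_to_height(z_raised)  # retrieve picked objects from the tube
```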

Appendix B: questionnaires

Questionnaires used in the study:

Fig. 7 Signage recollection question: “Please indicate which signs you have seen during the experiment” (graphics also used in the signage effectiveness questions)

Table 8 Negative attitudes towards robot scale [33]
Table 9 Robot anxiety scale [34] (subscale S2 - anxiety towards behavioural characteristics of robot)
Table 10 Risk taking index [32]
Table 11 Prior robot experience [24]
Table 12 Graphical signage effectiveness questionnaire 1 (adapted from [13])
Table 13 Graphical signage effectiveness questionnaire 2 (adapted from [13])
Table 14 Technology usage and programming experience questions
Table 15 Programming experience questionnaire

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Eimontaite, I., Gwilt, I., Cameron, D. et al. Language-free graphical signage improves human performance and reduces anxiety when working collaboratively with robots. Int J Adv Manuf Technol 100, 55–73 (2019). https://doi.org/10.1007/s00170-018-2625-2


Keywords

  • Human-robot interaction
  • Graphical signage
  • Anxiety towards robots
  • Flexible manufacturing
  • Collaborative robotics
  • Industry 4.0
  • Technology acceptance