
Human-Inspired Robotic Eye-Hand Coordination Enables New Communication Channels Between Humans and Robots


Abstract

This paper presents human-inspired robotic eye-hand coordination algorithms implemented with custom-built robotic eyes interfaced to a Baxter robot. Eye movement was programmed anthropomorphically, based on previously reported research on human eye-hand coordination during grasped-object transportation. The robotic eyes were first tested at the component level, where accurate positional and temporal control were achieved. Next, 11 human subjects were recruited to observe the robotic system, to quantify how well the eye-hand coordination algorithms conveyed two kinds of information during object transportation tasks: first, the transported object’s delivery location, and second, the level of care the robot exerted in transporting the object. Most subjects associated a decreased frequency of gaze fixations on the object’s target location with increased care in transporting the object, although these results were somewhat mixed among the 11 subjects. Additionally, the subjects were able to reliably infer the delivery location of the transported object purely from the robotic eye-hand coordination algorithm, with an overall success rate of 91.4%. These results suggest that anthropomorphic eye-hand coordination in robots could be useful in pedagogical or industrial settings.
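
To make the communication channel concrete, the gaze policy summarized above could be sketched roughly as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the pan/tilt eye model, the user-supplied set_eye actuator callback, and the mapping from care level to fixation count are all assumptions.

```python
# A minimal sketch (not the authors' code) of the gaze behavior described
# above: the eyes make anticipatory fixations on the delivery location
# while the hand transports the object, and the number of those fixations
# decreases as the commanded level of care increases.

import math
import time

def gaze_angles(eye_pos, target_pos):
    """Pan/tilt angles (radians) that aim an eye at a 3-D point."""
    dx, dy, dz = (t - e for t, e in zip(target_pos, eye_pos))
    pan = math.atan2(dy, dx)                    # horizontal rotation
    tilt = math.atan2(dz, math.hypot(dx, dy))   # vertical rotation
    return pan, tilt

def transport_with_gaze(hand_waypoints, delivery_pos, eye_pos,
                        care_level, set_eye, duration_s=5.0):
    """Step the hand along its waypoints while the eyes alternate between
    following the hand and predictively fixating the delivery location.

    care_level in [0, 1]: higher care -> fewer fixations on the target,
    mirroring the trend most subjects reported. set_eye(pan, tilt) is a
    user-supplied actuator callback (e.g. wrapping the eye servos).
    """
    # Assumed mapping: ~4 target fixations at minimum care, 1 at maximum care.
    n_fix = max(1, round(4 * (1.0 - care_level)))
    # Spread the target fixations evenly across the transport motion.
    fixation_steps = {round((k + 1) * len(hand_waypoints) / (n_fix + 1)) - 1
                      for k in range(n_fix)}
    dt = duration_s / len(hand_waypoints)

    for i, hand_pos in enumerate(hand_waypoints):
        # Predictive saccade to the delivery site, otherwise track the hand.
        look_at = delivery_pos if i in fixation_steps else hand_pos
        set_eye(*gaze_angles(eye_pos, look_at))
        time.sleep(dt)
```

Under these assumptions, a low care level produces several anticipatory glances at the delivery site, from which an observer can read the object's destination, while a high care level produces a single glance, consistent with the reduced fixation frequency that most subjects associated with careful transport.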

Acknowledgement

Research reported in this publication was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health under Award Number R01EB025819. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. This research was also supported by the National Institute on Aging under 3R01EB025819-04S1, National Science Foundation award #1317952, and Department of Energy contracts TOA#0000332969 and TOA#0000403076.

Author information

Corresponding author

Correspondence to Erik D. Engeberg.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4 7243 kb)

About this article

Cite this article

Olson, S., Abd, M. & Engeberg, E.D. Human-Inspired Robotic Eye-Hand Coordination Enables New Communication Channels Between Humans and Robots. Int J of Soc Robotics 13, 1033–1046 (2021). https://doi.org/10.1007/s12369-020-00693-2
