Toward an Expressive Bipedal Robot: Variable Gait Synthesis and Validation in a Planar Model

Abstract

Humans are efficient yet expressive in their motion: human walking behaviors allow traversal of a great variety of surfaces without falling while communicating internal state to others through variable gait styles. This provides inspiration for creating similarly expressive bipedal robots. To this end, a framework is presented for stylistic gait generation in a compass-like, under-actuated planar biped model. Gait design is performed through model-based trajectory optimization with variable constraints. Over a finite range of optimization parameters, a set of 360 gaits is generated for this model; in particular, step length and cost function are varied to produce distinct cyclic walking gaits. From these resulting gaits, six are identified and, using embodied movement analysis, labeled with stylistic verbs that correlate with human activity, e.g., “lope” and “saunter”. These labels are validated through user studies on Amazon Mechanical Turk, demonstrating that the framework generates visually distinguishable, meaningful gaits. This lays groundwork for creating a bipedal humanoid with variable, socially competent movement profiles.


Acknowledgements

This work was conducted under IRB #17697 and funded by National Science Foundation (NSF) Grant #1701295. The authors would like to thank Prof. Hae Won Park for useful discussions about the controller design and trajectory optimization and Prof. Joshua Schultz for useful discussions about how this control scheme might be implemented through a physical mechanism.

Author information

Corresponding author

Correspondence to Umer Huzaifa.

Ethics declarations

Conflict of Interest

A LaViers owns stock in AE Machines, an automation software company.


Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 4396 KB)

Appendix

The modeling matrices of Eq. 2 are as follows:

$$\begin{aligned} D_s = \begin{bmatrix} \left( \frac{5}{4}m\right) r^2&\quad - \frac{m}{2}r^2c_{12} \\ - \frac{m}{2}r^2c_{12}&\quad \frac{m}{4}r^2 \\ \end{bmatrix}, \end{aligned}$$
(18)

where \(c_{12}=\cos (q_{st}-q_{sw})\) and

$$\begin{aligned} C_s= \begin{bmatrix} 0&\quad -\frac{m}{2}r^2\dot{q}_{sw}s_{12} \\ \frac{m}{2}r^2\dot{q}_{st} s_{12}&\quad 0 \\ \end{bmatrix}, \end{aligned}$$
(19)

where \(s_{12} = \sin (q_{st}-q_{sw})\) and

$$\begin{aligned} G_s= \begin{bmatrix} -\left( \frac{3m}{2}\right) gr\sin (q_{st}) \\ \frac{m}{2}gr \sin (q_{sw})\\ \end{bmatrix}. \end{aligned}$$
(20)
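
As a concrete illustration, the following minimal Python sketch (not the authors' implementation; the mass \(m\), leg length \(r\), and numeric values below are assumed placeholders) evaluates the stance-phase matrices of Eqs. 18–20:

import numpy as np

m = 1.0   # leg mass (assumed placeholder value)
r = 1.0   # leg length (assumed placeholder value)
g = 9.81  # gravitational acceleration

def stance_matrices(q_st, q_sw, dq_st, dq_sw):
    """Evaluate D_s, C_s, G_s of Eqs. 18-20 at the given stance/swing angles and rates."""
    c12 = np.cos(q_st - q_sw)
    s12 = np.sin(q_st - q_sw)
    # Inertia matrix, Eq. 18
    D_s = np.array([[1.25 * m * r**2,         -0.5 * m * r**2 * c12],
                    [-0.5 * m * r**2 * c12,    0.25 * m * r**2]])
    # Coriolis/centrifugal matrix, Eq. 19
    C_s = np.array([[0.0,                           -0.5 * m * r**2 * dq_sw * s12],
                    [0.5 * m * r**2 * dq_st * s12,   0.0]])
    # Gravity vector, Eq. 20
    G_s = np.array([-1.5 * m * g * r * np.sin(q_st),
                     0.5 * m * g * r * np.sin(q_sw)])
    return D_s, C_s, G_s
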

The matrices from Eq. 5 are as follows:

$$\begin{aligned} D_e= \begin{bmatrix} D_s&\quad D_{12} \\ D^T_{12}&\quad D_{22} \\ \end{bmatrix} \end{aligned}$$
(21)
$$\begin{aligned} D_{12}= \begin{bmatrix} \frac{3m}{2}r\cos q_{st}&\quad -\frac{3m}{2}r\sin q_{st}\\ -\frac{m}{2}r\cos q_{sw}&\quad \frac{mr}{2}\sin q_{sw} \\ M_t&\quad 0 \end{bmatrix} \end{aligned}$$
(22)

and

$$\begin{aligned} D_{22}= \begin{bmatrix} 2m&\quad 0\\ 0&\quad 2m \end{bmatrix}. \end{aligned}$$
(23)

Similarly,

$$\begin{aligned} C_e= \begin{bmatrix} C_s&\quad 0_3\\ C_1&\quad 0_2 \end{bmatrix}, \end{aligned}$$
(24)

where

$$\begin{aligned} C_1= \begin{bmatrix} -\frac{3m}{2} r \dot{q}_{st} \sin q_{st}&\quad \frac{mr}{2} \dot{q}_{sw} \sin q_{sw} \\ -\frac{3m}{2} r \dot{q}_{st} \cos q_{st}&\quad \frac{mr}{2} \dot{q}_{sw} \cos q_{sw} \\ \end{bmatrix}, \end{aligned}$$
(25)

and

$$\begin{aligned} G_e= \begin{bmatrix} G_s\\ G_1\\ \end{bmatrix}, \end{aligned}$$
(26)

where

$$\begin{aligned} G_1= \begin{bmatrix} 0\\ 2mg\\ \end{bmatrix}. \end{aligned}$$
(27)
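
To show how these terms enter a simulation, the sketch below (again illustrative only, reusing stance_matrices, m, r, and g from the sketch above, and assuming Eq. 2 takes the standard manipulator form \(D_s\ddot{q} + C_s\dot{q} + G_s = \tau \)) integrates the swing-phase dynamics with zero joint torque from assumed initial conditions:

import numpy as np
from scipy.integrate import solve_ivp

def swing_dynamics(t, x):
    """State derivative for x = [q_st, q_sw, dq_st, dq_sw] with zero joint torque."""
    q_st, q_sw, dq_st, dq_sw = x
    D_s, C_s, G_s = stance_matrices(q_st, q_sw, dq_st, dq_sw)
    dq = np.array([dq_st, dq_sw])
    ddq = np.linalg.solve(D_s, -C_s @ dq - G_s)   # solves D_s*ddq + C_s*dq + G_s = 0
    return np.concatenate([dq, ddq])

x0 = [0.2, -0.3, -0.6, 0.1]                # assumed initial angles (rad) and rates (rad/s)
sol = solve_ivp(swing_dynamics, (0.0, 0.7), x0, max_step=1e-2)
q_st_traj, q_sw_traj = sol.y[0], sol.y[1]  # joint trajectories over one assumed swing interval

In the paper's framework, the optimized joint torques produced by the trajectory optimization would replace the zero-torque assumption used here, and the extended-model matrices of Eqs. 21–27 would typically enter the computation of the impact map between successive steps.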


About this article


Cite this article

Huzaifa, U., Maguire, C. & LaViers, A. Toward an Expressive Bipedal Robot: Variable Gait Synthesis and Validation in a Planar Model. Int J of Soc Robotics 12, 129–141 (2020). https://doi.org/10.1007/s12369-019-00547-6


Keywords

  • Biped locomotion
  • Human-like natural motions
  • Stylistic motion variation synthesis
  • Expressivity
  • Optimization
  • Embodied movement analysis