
Can Using Pointing Gestures Encourage Children to Ask Questions?

Published in the International Journal of Social Robotics.

Abstract

Although asking questions is fundamental to self-motivated learning, children often have difficulty verbalizing them. We therefore hypothesized that a robot's capability to perceive pointing gestures would encourage children to ask more questions. We tested this hypothesis in a Wizard-of-Oz experiment in which 92 elementary-school students interacted with our robot as it served as a guide explaining a museum exhibit. The children asked the robot significantly more questions when it could perceive pointing gestures than when it lacked that capability. Based on these Wizard-of-Oz findings, we also discuss the feasibility of implementing fully autonomous robots.
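As a rough illustration of what perceiving a pointing gesture could involve in an autonomous implementation (the study itself used a Wizard-of-Oz operator, so this is a hypothetical sketch, not the authors' system), a robot with a skeleton tracker might cast a ray from the child's shoulder through the hand and pick the exhibit closest to that ray in angle. All joint names, thresholds, and exhibit positions below are assumptions for illustration only.

```python
import math

def pointed_target(shoulder, hand, exhibits, max_angle_deg=20.0):
    """Return the name of the exhibit closest to the pointing ray, or None.

    shoulder, hand: (x, y, z) joint positions from a skeleton tracker.
    exhibits: dict mapping exhibit name -> (x, y, z) position.
    max_angle_deg: reject gestures whose best match exceeds this angle.
    """
    # Pointing direction: the vector from the shoulder through the hand.
    ray = tuple(h - s for h, s in zip(hand, shoulder))
    ray_len = math.sqrt(sum(c * c for c in ray))
    best_name, best_angle = None, max_angle_deg
    for name, pos in exhibits.items():
        vec = tuple(p - s for p, s in zip(pos, shoulder))
        vec_len = math.sqrt(sum(c * c for c in vec))
        if vec_len == 0.0 or ray_len == 0.0:
            continue
        # Angle between the pointing ray and the shoulder-to-exhibit vector.
        cos = sum(a * b for a, b in zip(ray, vec)) / (ray_len * vec_len)
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos))))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name
```

A real system would also need to decide *when* a pointing gesture occurs (e.g., an extended-arm detector) before resolving its target; this sketch only covers target resolution.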




Acknowledgements

We thank the assistants for their support during our experiments. This work was supported in part by JSPS Grants-in-Aid for Scientific Research Nos. 25240042 and 25280095 and JSPS KAKENHI Grant Nos. JP15H05322 and JP16K12505.

Author information


Corresponding author

Correspondence to Masahiro Shiomi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.


Cite this article

Komatsubara, T., Shiomi, M., Kanda, T. et al. Can Using Pointing Gestures Encourage Children to Ask Questions? Int J Soc Robot 10, 387–399 (2018). https://doi.org/10.1007/s12369-017-0444-5
