International Journal of Social Robotics, Volume 6, Issue 4, pp 489–505

Interpretation of Social Touch on an Artificial Arm Covered with an EIT-based Sensitive Skin

  • David Silvera-Tawil
  • David Rye
  • Mari Velonaki
Article

Abstract

During social interaction, humans extract important information from tactile stimuli that improves their understanding of the interaction. Developing a similar capacity in a robot will contribute to the future success of intuitive human–robot interaction. This paper presents experiments on the classification of social touch applied to a full-sized mannequin arm covered with touch-sensitive artificial skin. The flexible and stretchable sensitive skin was implemented using electrical impedance tomography (EIT). A classifier based on the LogitBoost algorithm was used to distinguish six emotions and six social messages transmitted by humans when touching the artificial arm. Experimental results show that social touch can be classified with accuracies comparable to those achieved by humans.
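
To make the learning step concrete, the sketch below shows a minimal two-class LogitBoost in the sense of Friedman, Hastie and Tibshirani (2000), trained on synthetic feature vectors that merely stand in for EIT-derived touch features. The feature dimensionality, the class definition and the use of scikit-learn regression stumps are assumptions made for this illustration, not the authors' implementation; the paper's six-way emotion and social-message tasks would additionally require a multi-class extension (e.g. one-vs-rest over this binary learner).

    # Illustrative sketch only: two-class LogitBoost with regression stumps.
    # Synthetic data stands in for EIT-derived touch features.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    class BinaryLogitBoost:
        """Two-class LogitBoost (Friedman et al. 2000) with stump weak learners."""

        def __init__(self, n_rounds=50, max_depth=1):
            self.n_rounds = n_rounds
            self.max_depth = max_depth
            self.stumps = []

        def fit(self, X, y):
            # y must be coded as {0, 1}
            F = np.zeros(X.shape[0])                    # additive score F(x)
            for _ in range(self.n_rounds):
                p = 1.0 / (1.0 + np.exp(-2.0 * F))      # P(y = 1 | x)
                w = np.clip(p * (1.0 - p), 1e-5, None)  # working weights
                z = np.clip((y - p) / w, -4.0, 4.0)     # working response, clipped
                stump = DecisionTreeRegressor(max_depth=self.max_depth)
                stump.fit(X, z, sample_weight=w)        # weighted least-squares fit
                F += 0.5 * stump.predict(X)             # LogitBoost update
                self.stumps.append(stump)
            return self

        def predict(self, X):
            F = 0.5 * sum(s.predict(X) for s in self.stumps)
            return (F > 0.0).astype(int)

    # Synthetic stand-in data: 200 six-dimensional "touch feature" vectors.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    clf = BinaryLogitBoost(n_rounds=30).fit(X, y)
    print("training accuracy:", (clf.predict(X) == y).mean())

The weighted least-squares fit to the working response is what distinguishes LogitBoost from AdaBoost-style reweighting, and is the reason regression (rather than classification) stumps are used as weak learners here.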

Keywords

Social touch · Human–robot interaction (HRI) · Social robotics · Supervised machine learning · LogitBoost · Artificial sensitive skin · Electrical impedance tomography

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. Creative Robotics Lab, National Institute for Experimental Arts, University of New South Wales, Kensington, Australia
  2. Centre for Social Robotics, Australian Centre for Field Robotics, The University of Sydney, Darlington, Australia