
Interpretation of Social Touch on an Artificial Arm Covered with an EIT-based Sensitive Skin

International Journal of Social Robotics

Abstract

During social interaction, humans extract important information from tactile stimuli that improves their understanding of the interaction. The development of a similar capacity in a robot will contribute to the future success of intuitive human–robot interactions. This paper presents experiments on the classification of social touch on a full-sized mannequin arm covered with touch-sensitive artificial skin. The flexible and stretchable sensitive skin was implemented using electrical impedance tomography (EIT). A classifier based on the LogitBoost algorithm was used to classify six emotions and six social messages transmitted by humans when touching the artificial arm. Experimental results show that social touch can be classified with accuracies comparable to those achieved by humans.
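
For readers unfamiliar with EIT, the touch images that feed such a classifier are typically recovered from boundary voltage measurements by solving a regularized linear inverse problem (see, e.g., [1, 40]). The one-step formulation below is a generic textbook sketch given for orientation only; it is not necessarily the exact reconstruction scheme used on the arm (cf. [60]).

```latex
% Generic one-step Tikhonov-regularized EIT reconstruction (cf. [1, 40]).
% \Delta\mathbf{v}: change in measured boundary voltages
% \mathbf{J}: Jacobian (sensitivity) of boundary voltages w.r.t. conductivity
% \mathbf{R}: regularization matrix, \lambda: regularization weight
% \Delta\hat{\boldsymbol{\sigma}}: estimated conductivity change (the "touch image")
\Delta\hat{\boldsymbol{\sigma}} =
  \left( \mathbf{J}^{\top}\mathbf{J} + \lambda^{2}\,\mathbf{R}^{\top}\mathbf{R} \right)^{-1}
  \mathbf{J}^{\top}\,\Delta\mathbf{v}
```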


Notes

  1. The word modality is often used in the term “sensory modality” to refer to a specific sense (visual, auditory, tactile, etc.). Our usage of “touch modality” here is consistent with that previously used in [26, 30] and [62], for example.

  2. WEKA is a widely used [13, 42], Java-based, open-source data mining environment developed at the University of Waikato, New Zealand; a minimal usage sketch is given after these notes.

  3. Obtained from http://dictionary.cambridge.org/ and http://oxforddictionaries.com/

  4. The semantically non-exclusive labels “Catholic,” “Christian” and “Protestant” occurred through self-reporting of religion.
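
As a concrete illustration of the classification stage, the following Java sketch trains and cross-validates WEKA's LogitBoost implementation (additive logistic regression [17]) on pre-extracted touch features. It is a minimal sketch only: the feature file name, attribute layout and parameter values are assumptions made for illustration, not the configuration used in this study.

```java
// Minimal sketch: train WEKA's LogitBoost on labelled touch-feature vectors and
// report 10-fold cross-validation accuracy. File name, attribute layout and
// parameter values are illustrative assumptions.
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.meta.LogitBoost;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class TouchClassificationSketch {
    public static void main(String[] args) throws Exception {
        // Load labelled feature vectors; the last attribute is assumed to hold
        // the emotion or social-message label.
        Instances data = new DataSource("touch_features.arff").getDataSet();
        data.setClassIndex(data.numAttributes() - 1);

        // LogitBoost boosts simple base learners (decision stumps by default).
        LogitBoost classifier = new LogitBoost();
        classifier.setNumIterations(100);  // illustrative value only

        // Estimate accuracy with stratified 10-fold cross-validation.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(classifier, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```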

References

  1. Adler A, Guardo R (1996) Electrical impedance tomography: regularized imaging and contrast detection. IEEE Trans Med Imaging 15(2):170–179


  2. Argall B, Billard A (2010) A survey of tactile human-robot interactions. Robot Auton Syst 58:1159–1176


  3. Black M, Yacoob Y (1995) Tracking and recognizing rigid and non-rigid facial motions using local parametric models of image motion. In: Proceedings of international conference on computer vision, pp 374–381

  4. Busso C, Deng Z, Yildirim S, Bulut M, Lee CM, Kazemzadeh A, Lee S, Neumann U, Narayanan S (2004) Analysis of emotion recognition using facial expressions, speech and multimodal information. In: Proceedings of international conference on multimodal interfaces, pp 205–211

  5. Chang J, MacLean K, Yohanan S (2010) Gesture recognition in the Haptic Creature. In: Proceedings of EuroHaptics, pp 385–391

  6. Cheney M, Isaacson D, Newell J (1999) Electrical impedance tomography. SIAM Rev 41(1):85–101


  7. Cooney M, Nishio S, Ishiguro H (2012) Recognizing affection for a touch-based interaction with a humanoid robot. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, pp 1420–1427

  8. Cutkosky M, Howe R, Provancher W (2008) Force and tactile sensors, chap 19. Handbook of robotics. Springer, Berlin, pp 455–476


  9. Dahiya R, Metta G, Valle M, Sandini G (2010) Tactile sensing-from humans to humanoids. IEEE Trans Robot 26(1):1–20


  10. Dario P, Laschi C, Micera S, Vecchi F, Zecca M, Menciassi A, Mazzolai B, Carrozza M (2003) Biologically-inspired microfabricated force and position mechano-sensors, chap 8. Sensors and sensing in biology and engineering. Springer, Berlin, pp 109–128

  11. De Rossi D, Scilingo E (2006) Encyclopedia of sensors. In: Grimes C, Dickey E, Pishko M (eds) Skin-like sensor arrays. American Scientific, New York, pp 535–556


  12. Diftler M, Mehling J, Abdallah M, Radford N, Bridgwater L, Sanders A, Askew R, Linn D, Yamokoski J, Permenter F, Hargrave B, Platt R, Savely R, Ambrose R (2011) Robonaut 2—The first humanoid robot in space. In: Proceedings of IEEE international conference on robotics and automation, pp 2178–2183

  13. Dinh A, Shi Y, Teng D, Ralhan A, Chen L, Bello-Haas V, Basran J, Ko S, McCrowsky C (2009) A fall and near-fall assessment and evaluation system. Open Biomed Eng J 3:1–7


  14. Ekman P, Friesen W (1971) Constants across cultures in the face and emotion. J Personal Soc Psychol 17(2):124–129


  15. Frank M, Stennett J (2001) The forced-choice paradigm and the perception of facial expressions of emotion. J Personal Soc Psychol 80(1):75–85


  16. Fridlund A (1997) The new ethology of human facial expressions, chap 5. The psychology of facial expression. Cambridge University Press, Cambridge, pp 103–129


  17. Friedman J, Hastie T, Tibshirani R (2000) Additive logistic regression: a statistical view of boosting. Ann Stat 28(2):337–407


  18. Goodrich M, Schultz A (2007) Human–robot interaction: a survey. Found Trends Hum Comput Interact 1(3):203–275


  19. Gouizi K, Reguig F, Maaoui C (2011) Analysis physiological signals for emotion recognition. In: Proceedings of IEEE international workshop on systems, signal processing and their applications, pp 147–150

  20. Guerrero L, Ebesu A (1993) While at play: an observational analysis of children’s touch during interpersonal interaction. In: Annual Meeting of the International Communication Association.

  21. Hall M, Frank E, Holmes G, Pfahringer B, Reutemann P, Witten I (2009) The WEKA data mining software: an update. ACM SIGKDD Explor Newsl 11(1):10–18


  22. Hertenstein M, Keltner D, App B, Bulleit B, Jaskolka A (2006) Touch communicates distinct emotions. Emotion 6(3):528–533


  23. Hertenstein M, Holmes R, McCullough M, Keltner D (2009) The communication of emotion via touch. Emotion 9(4):566–573


  24. Heslin R (1974) Steps toward a taxonomy of touching. Presented at the annual meeting of the Midwestern Psychological Association, Chicago, IL

  25. Heslin R, Patterson M (1982) Nonverbal behavior and social psychology. Plenum Press, New York


  26. Heslin R, Nguyen T, Nguyen M (1983) Meaning of touch: the case of touch from a stranger or same sex person. J Nonverbal Behav 7(3):147–157


  27. Hunter M, Struve J (1998) The ethical use of touch in psychotherapy. Sage Publications, Thousand Oaks


  28. Iwata H, Sugano S (2003) A system design for tactile recognition of human-robot contact state. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, vol 1, pp 7–12

  29. Iwata H, Sugano S (2005) Human-robot-contact-state identification based on tactile recognition. IEEE Trans Ind Electron 52(6):1468–1477


  30. Johnson KL, Edwards R (1991) The effects of gender and type of romantic touch on perceptions of relational commitment. J Nonverbal Behav 15:1


  31. Jones S (1994) The right touch: understanding and using the language of physical contact. Hampton Press, Cresskill


  32. Jones S, Yarbrough E (1985) A naturalistic study of the meaning of touch. Commun Monogr 52(1):19–56


  33. Jourard SM (1966) An exploratory study of body-accessibility. Br J Soc Clin Psychol 5(3):221–231


  34. Kanda T, Hirano T, Eaton D, Ishiguro H (2004) Interactive robots as social partners and peer tutors for children: a field trial. Hum Comput Interact 19:61–84


  35. Kelley J (1984) An iterative design methodology for user-friendly natural language office information applications. ACM Trans Off Inf Syst 2:26–41


  36. Knight H, Toscano R, Stiehl W, Chang A, Wang Y, Breazeal C (2009) Real-time social touch gesture recognition for sensate robots. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, pp 3715–3720

  37. Koo S, Lim JG, Kwon D (2008) Online touch behavior recognition of hard-cover robot using temporal decision tree classifier. In: Proceedings of IEEE international symposium on robot and human interactive communication, pp 425–429

  38. Lederman S (1991) Skin and touch. In: Dulbecco R (ed) Encyclopedia of human biology, vol 7. Academic Press, San Diego, pp 51–63

  39. Leite I, Martinho C, Paiva A (2013) Social robots for long-term interaction: a survey. Int J Soc Robot 5(2):291–308


  40. Lionheart W, Polydorides N, Borsic A (2005) The reconstruction problem, chap 1. Electrical impedance tomography: methods, history and applications. Institute of Physics Publishing, Bristol, pp 3–64

  41. Liu C, Rani P, Sarkar N (2005) An empirical study of machine learning techniques for affect recognition in human–robot interaction. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, pp 2662–2667

  42. Mallios N, Papageorgiou E, Samarinas M (2011) Comparison of machine learning techniques using the WEKA environment for prostate cancer therapy plan. In: Proceedings of IEEE international workshops on enabling technologies, infrastructure for collaborative enterprises, pp 151–155

  43. McDaniel E, Andersen P (1998) International patterns of interpersonal tactile communication: a field study. J Nonverbal Behav 22(1):59–75


  44. Naya F, Yamato J, Shinozawa K (1999) Recognizing human touching behaviors using a haptic interface for a pet-robot. In: Proceedings of IEEE international conference on systems, man, and cybernetics, vol 2, pp 1030–1034

  45. Noda T, Miyashita T, Ishiguro H, Hagita N (2007) Map acquisition and classification of haptic interaction using cross correlation between distributed tactile sensors on the whole body surface. In: Proceedings of IEEE/RSJ international conference on intelligent robots and systems, pp 1099–1105

  46. Noda T, Miyashita T, Ishiguro H, Hagita N (2008) Super-flexible skin sensors embedded on the whole body, self-organizing based on haptic interactions. In: Proceedings of Robotics: Science and Systems, pp 294–301

  47. Nwe TL, Wei FS, De Silva L (2001) Speech based emotion classification. In: Proceedings of IEEE region 10 international conference on electrical and electronic technology, vol 1, pp 297–301

  48. Persson PO, Strang G (2004) A simple mesh generator in MATLAB. SIAM Rev 46(2):329–345


  49. Robins B, Amirabdollahian F, Ji Z, Dautenhahn K (2010) Tactile interaction with a humanoid robot for children with autism: a case study analysis involving user requirements and results of an initial implementation. In: Proceedings of IEEE international symposium on robot and human interactive communication, pp 704–711

  50. Robins B, Dautenhahn K, Lehmann H (2012a) Tactile interaction and imitation games in human–robot interaction studies with children with autism. In: Proceedings of ACM/IEEE international conference on human–robot interaction

  51. Robins B, Dautenhahn K, Dickerson P (2012b) Embodiment and cognitive learning—Can a humanoid robot help children with autism to learn about tactile social behaviour? In: Proceedings of international conference on social robotics, pp 66–75

  52. Russell J (1980) A circumplex model of affect. J Personal Soc Psychol 39(6):1161–1178


  53. Russell J (1993) Forced-choice response format in the study of facial expression. Motiv Emot 17(1):41–51


  54. Rye D, Velonaki M, Williams S, Scheding S (2005) Fish-bird: human–robot interaction in a contemporary arts setting. In: Proceedings of Australasian conference on robotics and automation

  55. Scherer K (2005) What are emotions? And how can they be measured? Soc Sci Inf 44(4):695–729


  56. Shibata T, Mitsui T, Wada K, Touda A, Kumasaka T, Tagami K, Tanie K (2001) Mental commit robot and its application to therapy of children. In: Proceedings of IEEE international conference on advanced intelligent mechatronics, vol 2, pp 1053–1058

  57. Shibata T, Kawaguchi Y, Wada K (2010) Investigation on people living with Paro at home. In: IEEE international symposium on robot and human interactive communication, pp 470–475

  58. Shibata T, Kawaguchi Y, Wada K (2012) Investigation on people living with seal robot at home. Int J Soc Robot 4(1):53–63


  59. Silvera Tawil D, Rye D, Velonaki M (2009) Improved EIT drive patterns for a robotics sensitive skin. In: Proceedings of Australasian conference on robotics and automation

  60. Silvera Tawil D, Rye D, Velonaki M (2011a) Improved image reconstruction for an EIT-based sensitive skin with multiple internal electrodes. IEEE Trans Robot 27(3):425–435


  61. Silvera Tawil D, Rye D, Velonaki M (2011b) Touch modality interpretation for an EIT-based sensitive skin. In: Proceedings of IEEE international conference on robotics and automation, pp 3770–3776

  62. Silvera Tawil D, Rye D, Velonaki M (2012) Interpretation of the modality of touch on an artificial arm covered with an EIT-based sensitive skin. Int J Robot Res 31(13):1627–1642


  63. Smith J (2003) Communicating emotion through a haptic link. Master’s thesis, The University of British Columbia

  64. Soleimani M, Gómez-Laberge C, Adler A (2006) Imaging of conductivity changes and electrode movement in EIT. Physiol Meas 27(5):S103–S113


  65. Stiehl W, Breazeal C (2005) Affective touch for robotic companions. In: Proceedings of international conference on affective computing and intelligent interaction

  66. Stiehl W, Lieberman J, Breazeal C, Basel L, Lalla L, Wolf M (2005) Design of a therapeutic robotic companion for relational, affective touch. In: Proceedings of IEEE international workshop on robot and human interactive communication, pp 408–415

  67. Tajika T, Miyashita T, Ishiguro H, Hagita N (2006) Automatic categorization of haptic interactions—What are the typical haptic interactions between a human and a robot? In: Proceedings of IEEE-RAS international conference on humanoid robots, pp 490–496

  68. Tan PN, Steinbach M, Kumar V (2006) Introduction to data mining. Addison Wesley, Reading


  69. Vauhkonen M (1997) Electrical impedance tomography and prior information. Ph.D. thesis, Kuopio University

  70. Wada K, Shibata T (2007) Living with seal robots—its sociopsychological and physiological influences on the elderly at a care house. IEEE Trans Robot 23(5):972–980


  71. Wada K, Shibata T (2009) Social effects of robot therapy in a care house—change of social network of the residents for one year. J Adv Comput Intell Intell Inf 13(4):386–387


  72. Yohanan S, MacLean K (2008) The haptic creature project: social human–robot interaction through affective touch. In: The reign of catz & dogz: Proceedings of second AISB symposium on the role of virtual creatures in a computerised society, vol 1, pp 7–11

  73. Yohanan S, MacLean K (2011) Design and assessment of the haptic creature’s affect display. In: Proceedings of ACM/IEEE international conference on human–robot interaction

  74. Yohanan S, MacLean K (2012) The role of affective touch in human–robot interaction: human intent and expectations in touching the haptic creature. Int J Soc Robot 4:163–180


  75. Zur O, Nordmarken N (2010) To touch or not to touch: exploring the myth of prohibition on touch in psychotherapy and counseling. http://www.zurinstitute.com/touchintherapy.html. Accessed 1 March 2012


Author information


Corresponding author

Correspondence to David Silvera-Tawil.

Additional information

This work was supported in part by the ARC Centres of Excellence programme funded by the Australian Research Council (ARC), the New South Wales State Government and the Australian Centre for Field Robotics, The University of Sydney.



Cite this article

Silvera-Tawil, D., Rye, D. & Velonaki, M. Interpretation of Social Touch on an Artificial Arm Covered with an EIT-based Sensitive Skin. Int J of Soc Robotics 6, 489–505 (2014). https://doi.org/10.1007/s12369-013-0223-x

