Tactile Interaction with a Humanoid Robot: Effects on Physiology and Subjective Impressions

This study investigated how touching and being touched by a humanoid robot affects human physiology, impressions of the interaction, and attitudes towards humanoid robots. Twenty-one healthy adult participants completed a 3 (touch style: touching, being touched, pointing) × 2 (body part: hand vs buttock) within-subjects design using a Pepper robot. Skin conductance response (SCR) was measured during each interaction. Perceived impressions of the interaction (i.e., friendliness, comfort, arousal) were measured by questionnaire after each interaction. Participants' demographics and their attitudes towards robots were also considered. We found shorter SCR rise times in the being-touched condition than in the touching condition, possibly reflecting psychological alertness to the unpredictability of robot-initiated contact. The hand condition had shorter rise times than the buttock condition. Most participants evaluated the hand condition as most friendly and comfortable and the robot-initiated interactions as most arousing. Interacting with Pepper improved attitudes towards robots. Our findings require replication in future studies with larger samples and improved procedures. They have implications for robot design in all domains involving tactile interaction, such as care and intimacy.


Social touch in Human-Human Interactions
Social touch, as opposed to self-touch, forms an important channel for human non-verbal communication [25,28]. Compared to verbal and other forms of non-verbal communication (e.g., facial expressions, gesture, posture), touch primarily conveys intimate emotions [1,26,59] and thus engenders interpersonal closeness [54]. In daily life, social touch takes many different forms [44]. Examples include shaking hands and kissing during greetings, cuddling and holding hands in intimate interactions, and a spank on the bottom as a form of correction. These touches can elicit a wide range of experiences and responses at the physiological and emotional levels, each with a direct impact on behaviour. Comprehensive overviews of the consistent effects of social touch at these different levels have been provided by a number of studies [25,26,28,40]. In the present study, we will first review specific findings on social touch in human-human interactions before looking at related work on human-robot interactions. We will then describe the methods and results of an experiment that recorded both physiological and subjective responses while humans either touched or were touched by a humanoid robot. The results will be discussed with respect to theoretical, methodological and practical implications.
Consider first the physiological level. Receiving social touch has been shown to reduce levels of cortisol, the so-called "stress hormone" [36], and to lower blood pressure and heart rate [21,32]. Interpersonal touch is thus a means of comforting someone who is under stress and/or experiencing negative arousal [19]. For example, apart from being frequently employed in nursing care [6,8,37], social touch has been used effectively in a variety of stress-prone situations, ranging from pre-operative settings [85] and preparation for public speaking [18,32] to the threat of physical pain [10] or even watching scary videos [48]. In addition to attenuating physiological stress responses, social touch can also evoke higher skin conductance responses (SCRs), an indicator of physiological arousal that reflects increased electrical conductivity of the skin following sweat production. For example, increased SCRs were observed in individuals briefly touched by a stranger in passing [83] and in socially anxious participants touched by an experimenter [86]. These physiological effects elicited by social touch can be interpreted as pleasant or unpleasant, depending critically on the context [7,11,13].
Consider now the emotional level of touch interactions. The sense of touch can be used to communicate emotions [38,39] as well as to elicit emotions [75]. Emotional consequences of touching also depend on the gender combination of the dyad [74] and the sexual attitudes of its members (e.g., [76]). Moreover, social touch has been shown to enhance interpersonal relationships via establishment of trust [1,26] and can influence people's (pro)social behaviour, referred to as the "Midas Touch" effect [15]. This effect involves a brief casual touch on the arm or shoulder by a stranger which results in an increased willingness to comply with that stranger's subsequent request [33,45,76]. Social touch in human-human interactions can therefore evoke powerful physiological responses, strengthen social bonds and enhance our prosocial behaviours.
It is the brain's interpretation of the body's physiological states that determines our conscious experience of distinct emotions [29]. Together with the existence of specific neurophysiological channels for affective touch (the so-called C-tactile afferents in human skin; cf. [81]), somatosensory representations of emotions in the human body [67], and direct physiological reactions to skin stimulation in early animal research [34,58], the evidence to date indicates a close link between tactile stimulation, physiological responses and emotional experiences.

Social touch in Human-Robot Interactions
Nowadays, robots are increasingly stepping into our social and even private lives. Due to their physical embodiment, toy and care robots can be touched and handled by us. Humanoid robots in particular (i.e., robots whose bodies are built to resemble the human body) can additionally touch humans actively, a unique mode of communication. The United Nations Economic Commission for Europe and the International Federation of Robotics [80] have recognized the potential benefits of natural physical contact between people and personal service robots. Given that human social touch is grounded in specific neurophysiological processing channels and intricately related to emotions, it is clear that touch interaction ultimately impacts our social relationships with, and behaviours towards, humanoid robots. However, there is so far only a very limited understanding of whether touch interactions with humanoid robots in a social context can elicit physiological responses in human participants that resemble those in human-human interactions. Evidence gathered from this new line of research can ultimately serve as an important first step towards improved human-robot interactions. A brief review of the state of the art follows.
Although research on human-robot social touch interactions is still relatively new, preliminary findings suggest that touch interaction with artificial entities can elicit physiological, emotional and behavioural responses in humans similar to those elicited in human-human interactions [23]. Previous studies implemented touch interactions with human users to assess affective and social benefits in healthcare [68,69,71,90], education [78] and entertainment settings [27,84]. For example, elderly individuals touching a pet robot reported improved empathy [69], and people with dementia touching a pet robot reported improved mood [60] and social connection [90]. Moreover, a humanoid robot's touch (i.e., being touched by a robot) increased motivation to carry out a monotonous task [62,72] and decreased perceived machine-likeness of the robot in participants with a generally positive attitude towards robots [12]. It also reduced the perceived unfairness of unfair monetary offers proposed by the robot [27].
Previous research shows that touch styles in human-robot interaction influence the interaction outcomes. Hirano et al. [42] found that, in comparison to passively being touched by a robot, people actively touching a humanoid robot perceived the robot as more friendly and the interaction as more comfortable. Furthermore, the gender of the toucher and the receiver [56], as well as people's attitude towards robots [9,12,77], influence the interpretation of touch interactions.
A relatively small set of studies have investigated human physiological responses to either touching the robot [53,68,71] or being touched by the robot [9,87,88]. In studies where robots initiated a social touch, Willemse et al. [87,88] compared human physiological responses between a "robot-touch-human" condition (the robot actively touched the participant's right shoulder and upper arm while speaking soothing words) and a "robot-not-touch-human" condition (the robot only moved its head and arms, without making physical contact with the participant, while speaking soothing words).
Most of these studies, however, did not distinguish the physiological effects of physical contact with different body parts. For example, it is not well understood whether touching or being touched by a robot on the hands elicits a similar response as contact on other body parts. In human non-verbal communication, we not only limit which of our own body parts may be touched by others ("body accessibility"; [46]) but also touch only limited body parts of others ("touch zones"; [79]), depending on the nature of our social relationship. The human hand is one of the most frequently touched body parts (a "high accessibility region"; [46]) and one of the regions we most publicly touch on others (the "public touch zone"; [79]), whereas regions such as the buttocks and genital areas fall into the "low body accessibility" or "private touch zone" category [46,79]. It is plausible to anticipate different physiological responses when physical contact is made with these different body regions in human-human interactions. It is nevertheless important to investigate how humans respond to touching and being touched in such areas, given the increasingly intimate relationships between humans and humanoids in care and companionship settings.
In human-robot social interactions, there are many contexts where physical contact with different body regions of both humans and robots is unavoidable or even necessary. Examples include using care robots for patient transfer from floor to chair in healthcare [61], early childhood education in the form of playing with children [78] and the emerging and controversial field of intimate relationships with humanoid robots [52,91]. It is becoming increasingly important to collect empirical evidence on this topic in order to help inform the design of social robots in domains where physical contact with different body parts forms an essential part of the human-robot interaction and fundamentally shapes the relationship. This is why, in the present study, we also included a touch condition involving an intimate body part, namely the buttocks. We controlled the robot's perceived gender to reduce further complications that might arise from the gender combination of the interaction dyad.
Li et al. [53] provided some initial evidence on the physiological responses to touching different body parts of the small humanoid robot Nao (approximately 57 cm tall). The authors categorised 13 body parts of the Nao robot into three groups based on the social construct of "body accessibility" [46]: high accessibility (hand, arm, forehead and neck), medium accessibility (back, ear, eye, foot) and low accessibility (inner thigh, heart, breast, buttocks and genital area). In an anatomy learning scenario, 31 participants were asked to touch these body parts of the robot while their SCRs were measured, with pointing as a control condition. Participants generated significantly increased SCRs (indicating arousal; effect size estimate η² = 0.10) while touching low-accessibility body parts compared to high-accessibility body parts, but not when merely pointing to the robot. The authors interpreted their results as showing that people treat touching body parts as an act of closeness in itself that does not require a human recipient. Alternatively, the results might also signal that people attribute private bodily zones to the humanoid robot, and such attributions constitute an important aspect of understanding the nature of human-robot interaction (cf. InStance project; [57]). What remains unknown is whether these physiological effects occur when touching a non-toy-sized humanoid robot that more closely resembles a human in body height. More importantly, it is not known whether similar effects obtain when the robot initiates the touch (i.e., when people are touched by a robot rather than actively touching it).

The Current Study
The current study aims to investigate how human-robot touch interaction (both human-initiated and robot-initiated touch) affects human physiology and perceived impressions of the interaction, and how it changes attitudes towards humanoid robots. We took inspiration from previous work on human-robot interaction using the small humanoid robot Nao [53]. However, several methodological improvements were made to Li and colleagues' pioneering study [53]: (a) using a non-toy-sized humanoid robot, Pepper (120 cm tall, compared to only 57 cm for the Nao robot); (b) measuring participants' individual differences, such as their sexual orientation and attitude towards robots, both before and after the touch interaction; (c) adding a robot-initiated touch style to the interaction; and (d) measuring perceived impressions of the interaction, in addition to physiological responses, as an indicator of arousal. Knowing the subjective impression of the touch interactions helps us interpret the objective physiological response of the skin conductance change. The expected outcome of studying buttock-related touch was higher skin conductance compared to the hand-related touch conditions. This would generalize the results of Li and Reeves [53] to another robot, to the experience of being touched, and to another intimate body part.

Participants
Twenty-one right-handed participants (13 female, mean age = 24.7 years, SD = 4.1) took part in the study. Participants were recruited primarily through a combination of email-based and personal recruitment at two universities. The Ethics Committees of both the University of Applied Sciences, Wildau and the University of Potsdam approved the study. Table 1 presents the characteristics of the sample. All participants reported being fluent in English.

Study Design
The robot used for this study, SoftBank Robotics' (formerly Aldebaran Robotics) Pepper, is a programmable semi-humanoid robot with a height of 120 cm and a weight of 28 kg. Pepper has 20 degrees of freedom of movement and has a head, neck, LED-equipped eyes, arms, sensor-equipped hands with five fingers each, and visible hips. Its torso ends in a mobile base; it has no legs or genitals. A tablet (24.6 × 17.5 cm, with a 10.1″ display) is attached to Pepper's chest to facilitate communication with its users. This tablet was turned off during the experiment in order to force participants to focus on verbal communication, which took place in English throughout the study. Pepper's eyes were lit a bright white colour throughout the experiment to help animate the interaction. Before the experiment, the experimenter pointed out the right back side of the hip area to each participant as the robot's "buttock area", in order to control for possible vocabulary differences between participants. The robot's movements for the touch interaction and its voice settings (as a male speaker) were jointly designed and programmed by the first author and the research team from the University of Applied Sciences Wildau. We purposefully identified the robot as male in its introduction monologue (see Procedure below) and checked this manipulation in the post-experiment questionnaire.
The three touch styles included in the study were:

• actively touching the robot: after the robot's request to touch either its hand or its buttock, the participant initiates physical contact with the named body part of the robot using his/her right hand;

• passively being touched by the robot: after the robot seeks permission to touch the participant's right hand, the robot initiates physical contact using its left hand. For the buttock area, the robot only seeks permission without subsequently touching the participant (due to inter-subject physical variability and related problems with programming the robot's hand trajectory). This touch condition is therefore referred to as "being touched/asked permission to touch (PTT)" in the subsequent text; and

• pointing to the robot: after the robot's request to point to either its hand or its buttock, the participant points to the corresponding body part of the robot, without direct physical contact, using his/her right index finger.
A mutual touch condition, where the person touches the robot and the robot actively returns the touch (such as during a handshake, cf. [2]), was not included in the present exploratory study. To be clear: in this study the human-initiated conditions include participants touching both the robot's hand and the robot's buttock whereas the robot-initiated conditions include the participants being touched by the robot on the hand and being asked permission by the robot to touch their buttock area (but without this latter touch event actually occurring).
Participant gender, age, sexual orientation, experience with robots, and attitude towards robots were collected, as described below. We also ascertained whether participants took medications likely to influence skin conductance levels [73]. The main dependent variable was SCR, measured using Shimmer hardware and iMotions software. We also measured perceived friendliness of the robot, comfort and arousal in each condition by questionnaire.
We used a 3 (touch style: actively touching vs passively being touched/asked PTT vs pointing) × 2 (body part: hand vs buttock) within-subjects design. Pointing served as a control condition without physical contact, as in [53]. Given resource constraints and the exploratory nature of this study, the within-subjects design aimed to increase the chance of detecting differences among conditions while using fewer participants. To avoid possible order effects of within-subjects designs, the touch style conditions were counterbalanced, with half of the participants starting with the hand condition and the other half starting with the buttock condition.
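As a minimal sketch of the counterbalancing logic described above (the condition labels and function name are illustrative assumptions, not the study's actual implementation), the six conditions and a per-participant order could be generated as:

```python
# Illustrative sketch of the 3 x 2 within-subjects condition grid;
# labels are hypothetical, not the authors' internal names.
TOUCH_STYLES = ["touching", "being_touched_ptt", "pointing"]
BODY_PARTS = ["hand", "buttock"]

def condition_order(participant_id: int) -> list[tuple[str, str]]:
    """Counterbalance body-part order: even-numbered participants
    start with the hand conditions, odd-numbered with the buttock
    conditions, yielding the 6 (style, part) conditions per person."""
    parts = BODY_PARTS if participant_id % 2 == 0 else BODY_PARTS[::-1]
    return [(style, part) for part in parts for style in TOUCH_STYLES]
```

With 21 participants this assigns roughly half of them to each starting body part, matching the counterbalancing scheme described above.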

Materials
The Negative Attitudes towards Robots Scale (NARS; [66]) is widely used in human-robot interaction research [63-65,77] to measure attitudes towards robots. The questionnaire was administered twice, once before and once after the experiment, in order to find out whether the interaction had any effect on participants' attitudes (see Appendices 3 and 4). The only differences between the two versions were that (a) only the pre-questionnaire included demographics (age, gender, discipline and sexual orientation) and factors that might affect SCRs (e.g., whether medications for cold or depression had been taken in the last seven days) and (b) only the post-questionnaire included measures of the perceived impression of the robot (i.e., its gender and age) and of the interaction (i.e., friendliness, comfort and arousal). The original English version of the NARS was used, with 14 items across three subscales: negative attitudes toward situations of interaction with robots (e.g., "I would feel paranoid talking with a robot"), negative attitudes toward the social influence of robots (e.g., "Something bad might happen if robots developed into living beings") and negative attitudes towards emotions in interaction with robots (e.g., "If robots had emotions, I would be able to make friends with them"). The scale has an internal consistency of 0.79, which reflects good reliability. "Appendix 1" lists all variables collected in the study.
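The internal consistency figure quoted above (0.79) is a Cronbach's alpha coefficient. As a minimal sketch (the generic formula, not the authors' own computation), alpha can be computed from an item-score matrix as follows; the code assumes item scores have already been reverse-coded where the scale requires it:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of sum scores
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Values around 0.7 or higher are conventionally read as acceptable reliability, consistent with the 0.79 reported for the NARS.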

Procedure
The study took place in the library of the University of Applied Sciences, Wildau, where the Pepper robot is used as a library assistant. The entire experiment, including all robot instructions, was conducted in English. The data collection procedure consisted of three steps: Step 1, experimental setup, consent-taking and pre-experiment questionnaire completion; Step 2, the touch interaction experiment; and Step 3, the post-experiment questionnaire. During Step 1, the humanoid robot Pepper was set up in a quiet room in the library, facing the participant. A small tailor-made table was placed between the robot and the chair where the participant was to sit. A human right-hand outline, palm facing down, was marked on the table to ensure that each participant placed his/her hand at the same location, where it could be reliably touched by the pre-programmed robot. The chair was adjustable to accommodate participants' different arm lengths in the touching-the-robot condition. After the setup, informed written consent was obtained, followed by completion of a pre-experiment questionnaire that collected participants' demographic information (age, gender, discipline, sexual orientation) and their attitudes towards robots. Two dry electrodes with a diameter of 8 mm (Shimmer) were then placed on two fingers (index and middle) of the participant's left hand (i.e., the non-dominant hand). The electrodes were connected via 9-inch leads to the sensor (Shimmer3 GSR+ Unit), which was attached to the participant's left wrist with a wrist strap and provided real-time data collection and wireless data transmission to the computer running the iMotions software. The sampling frequency was 128 Hz. Figure 1 (b, right panel) shows the electrode placement. Consent, questionnaire and electrode placement were completed in a separate room adjacent to the experimental room within approximately eight minutes.
At Step 2, the human-robot touch interaction experiment took place in a quiet room from which the participant could not see outdoor activities. Two researchers monitored the experiment from an adjacent room through a tablet camera placed in the experimental room. The participant was first guided by the experimenter to the experimental room and seated comfortably facing the robot. The participant was then instructed about the robot's body parts, including the hand and buttock areas, and told to place his/her left hand on the lap while his/her right hand engaged in the touch interaction. The participant was then asked to sit quietly with eyes closed for five minutes for baseline skin conductance data collection. Note that all participants completed the touch interaction conditions with their eyes open. At the end of the five-minute baseline period, the robot verbally introduced the experiment (for the monologue text and voice settings used, see "Appendix 2").
Each condition started with the robot saying "Please touch my hand/touch my buttock/place your hand at the marked area on the table/point at my hand/point at my buttock" and ended with the robot saying "Thank you, please put back your hand". The durations of touch events and the touch intensities varied across conditions and depended on individuals' compliance and hand anatomy. The only exception was the condition of being touched by the robot on the buttock, in which the robot merely asked permission to touch the participant's buttock (i.e., "Is it OK if I touch your buttock area?") while the touch itself did not actually take place. This was purposefully designed for two reasons. First, there were significant technological challenges in programming the robot to reliably and safely touch participants of different heights and body shapes. Second, the robot asking permission to touch the participant's buttock area, in other words, the human anticipation of touch stimulation, especially on a socially less accessible body part, is an important psychological process that still needs investigation. For example, recent evidence indicates that the anticipation of touch modifies human brain function [55].
The inter-condition interval (i.e., "cool-off" period) was approximately 13 s, which allowed the skin conductance level to return to baseline before the next experimental condition. Figure 1 (a, left panel) shows the condition of being touched by the robot on the hand. The end of the experiment was marked by the robot saying "Thank you for your participation. You can now leave the room." The duration of each condition was marked manually on the video stream by the first author during the experiment, with one marker for the beginning and one for the end, corresponding to the robot's statements of "Please" and "Thank you", respectively. The experiment lasted approximately nine minutes. At Step 3, participants completed a post-experiment questionnaire (see Appendix 4), which was the same as the one used in Step 1 but with additional questions about the perceived gender and age of the robot and about which condition they rated the most/least friendly/comfortable/arousing. Participants could choose only one condition per attribute. An informal interview about how participants experienced the experiment (e.g., whether they understood the robot's instructions and questions, and whether they answered "yes" when the robot sought permission to touch the buttock area) was also conducted and the answers noted down.
Step 3 was completed within approximately six minutes. The entire data collection required approximately 25 min (Table 2).

Data Preparation and Analysis
Skin conductance, one form of electrodermal activity (a term introduced by Johnson and Lubin [43]), is one of the most commonly used measures of emotional arousal in psychological research [4,49]. The raw skin conductance signal consists of two components: the skin conductance level (the tonic level) and the SCR (the phasic response; [17]). The tonic level describes the overall conductivity of the skin over longer time intervals, ranging from tens of seconds to minutes, and is not informative with regard to responses to stimuli. The phasic response rides on top of these tonic changes and shows considerably faster alterations, being sensitive to the emotionally arousing stimulations of interest in the present context. Raw data were therefore prepared in three steps before the final analyses: (1) assigning and verifying experimental conditions using the live markers placed on the video stream, (2) extracting the phasic response data by removing the tonic level from the raw skin conductance signal and (3) applying a moving average to filter motion-related artefacts from the phasic data.
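Steps (2) and (3) above can be sketched in a few lines. The following is an illustrative decomposition only: a long moving average approximates the tonic level, subtraction leaves the phasic component, and a short moving average damps motion artefacts. The window lengths are assumptions for illustration, not the values used by the authors' software.

```python
import numpy as np

FS = 128  # sampling frequency in Hz, as reported in the Procedure

def extract_phasic(raw: np.ndarray, tonic_win_s: float = 4.0,
                   smooth_win_s: float = 0.5) -> np.ndarray:
    """Estimate the slowly varying tonic level with a long moving
    average, subtract it from the raw signal, then smooth the phasic
    residual with a short moving average to damp motion artefacts."""
    def moving_avg(x: np.ndarray, win_s: float) -> np.ndarray:
        w = max(1, int(win_s * FS))
        return np.convolve(x, np.ones(w) / w, mode="same")
    tonic = moving_avg(raw, tonic_win_s)   # step (2): tonic estimate
    phasic = raw - tonic                   # step (2): remove tonic level
    return moving_avg(phasic, smooth_win_s)  # step (3): artefact filter
```

For a constant input the phasic component is (away from the signal edges) zero, which is the intended behaviour: only fast deviations from the tonic level survive.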
Once the experimental conditions had been identified, three aspects of the phasic responses were studied across conditions: (1) visualization of the patterns of phasic responses for each condition, (2) differences in average phasic response, measured in microsiemens (µS), and (3) peak differences regarding rise time, amplitude and recovery time. Two-way repeated measures analyses of variance (ANOVAs) were performed on aspects (2) and (3) to investigate the main effects of touch style and body part as well as the touch style × body part interaction. Regarding questionnaire data, pre-post responses were compared using a repeated measures t-test (after ensuring that parametric test assumptions were met) to investigate possible effects of the touch interaction on attitudes towards robots. Regarding perceptions of the interaction outcomes (i.e., the identification of the most/least friendly/comfortable/arousing conditions), rating outcomes were counted and collated.
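The pre-post questionnaire comparison is a paired (repeated measures) t-test. A minimal sketch of that statistic, as a self-contained function rather than the authors' SPSS pipeline, is:

```python
import numpy as np

def paired_t(pre: np.ndarray, post: np.ndarray) -> tuple[float, int]:
    """Paired t statistic and degrees of freedom for pre- vs
    post-interaction scores: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the within-participant differences."""
    d = post - pre
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t, n - 1
```

With 21 participants this yields 20 degrees of freedom, matching the t(20) values reported in the Results. The two-way repeated measures ANOVAs follow the same within-participant logic but partition variance by touch style, body part and their interaction.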
Analyses were conducted using IBM SPSS Statistics and RStudio (for pattern visualization only). Results from both physiological and questionnaire data were finally compared in light of the sample and design limitations. Visualization of the patterns of phasic responses was conducted for each of the 21 participants, which served as an important verification of the quality of the data collected.

Visualization of Patterns of Phasic Responses
SCRs for each condition were visualized using RStudio (Fig. 3), resulting in 21 lines representing the 21 participants. Visual inspection of the patterns indicated overall larger SCRs in the Touch robot hand condition and the highest variability in the Robot touches hand condition. This pattern warranted more detailed statistical investigation in the next stage. (Note for Fig. 3: the time window shown for each condition was selected to emphasize that condition's key patterns rather than to reflect its actual duration, which differs from condition to condition.)

Differences in Average Phasic Responses
Visual inspection of the mean differences in phasic SCRs among conditions showed that the Touch robot hand condition had the largest response (0.0336 µS; Fig. 4). A two-way repeated measures ANOVA explored the main effects of touch style and body part, and their interaction, in the average phasic SCRs. No significant effects were found for either the main factors or the interaction (for details see "Appendix 5").

Peak Analysis (Rise Time, Amplitude and Recovery Time)
The most frequently used measure to describe a single electrodermal response (e.g., a single SCR) is its amplitude [4]. For the electrodermal recording method used in our study (exosomatic recording with direct current), a stimulus will, after a response latency, result in an increase in skin conductance. The amplitude of this signal refers to the intensity of a single response (i.e., the amplitude difference between response onset and peak). Two measures frequently used to describe the shape of electrodermal responses are rise time (i.e., the time from response onset to maximum) and recovery time (i.e., the time needed to recover a certain proportion of the amplitude). Rise time is typically shorter than recovery time [4].
The peaks in skin conductance used for our analyses are event-related phasic SCRs (ER-SCRs), i.e., specific responses to the experimental stimuli, which occurred between 1 and 4 s after condition onset. ER-SCRs differ from spontaneous or non-specific responses (NS-SCRs), which are not consequences of any eliciting stimulus but instead occur at a rate of 1-3 per minute [17]. Peak amplitudes typically range between 0.1 and 2.0 µS [16]. In our study, the peak onset threshold (> 0.01 µS) and the peak amplitude threshold (0.05 µS above the onset value) were pre-determined in agreement with the standard settings of the iMotions software.
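The peak measures defined above (rise time, amplitude, recovery time) can be illustrated with a simple detector. This is a sketch under stated assumptions, not the iMotions algorithm: it uses the onset (> 0.01 µS) and amplitude (0.05 µS above onset) thresholds given above, and takes half-amplitude recovery as the recovery criterion, which is one common convention; the software's exact criterion may differ.

```python
import numpy as np

FS = 128  # sampling frequency in Hz

def scr_peaks(phasic: np.ndarray, onset_thr: float = 0.01,
              amp_thr: float = 0.05) -> list[tuple[float, float, float]]:
    """Return (rise_time_ms, amplitude_uS, recovery_time_ms) for each
    detected SCR peak: onset where the rising signal exceeds onset_thr,
    peak at the local maximum, peak kept only if its amplitude (peak
    minus onset value) reaches amp_thr, recovery at half amplitude."""
    peaks, n, i = [], len(phasic), 1
    while i < n - 1:
        if phasic[i] > onset_thr and phasic[i] > phasic[i - 1]:
            onset, onset_val = i, phasic[i]
            while i < n - 1 and phasic[i + 1] >= phasic[i]:
                i += 1  # climb to the local maximum
            amp = phasic[i] - onset_val
            if amp >= amp_thr:
                peak = i
                half = phasic[peak] - amp / 2
                j = peak
                while j < n - 1 and phasic[j] > half:
                    j += 1  # descend to half-amplitude recovery
                peaks.append(((peak - onset) / FS * 1e3, amp,
                              (j - peak) / FS * 1e3))
        i += 1
    return peaks
```

Rise and recovery times fall out directly as sample counts converted to milliseconds at 128 Hz, which is the form in which the Results section reports them.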

Peak Recovery Time
Regarding average peak recovery time, the grand mean was 1863.34 ms (SD = 792.14 ms). No significant effects were found for either the main factors or the interaction (for details see "Appendix 5").

Changes in Attitude Towards Robots After Interaction
Changes in attitudes towards robots after the tactile interaction, compared to before, are shown in Table 3. In general, there was no significant difference in the overall attitude towards robots after the approximately nine-minute touch interaction with the Pepper robot (t(20) = -0.987, p > 0.05). However, significantly higher scores (meaning less agreement with statements expressing negative attitudes) were obtained for Item 2 after the touch interaction ("Something bad might happen if robots developed into living beings"; t(20) = -2.905, p < 0.01). As Item 2 is part of the subscale measuring the perceived social influence of robots, this finding indicates that participants might become less negative toward the social influence of robots after a tactile interaction with them. Furthermore, Item 8 ("I would feel nervous operating a robot in front of other people"), which belongs to the subscale on situations of interaction with robots, received higher scores (approaching significance) after than before the tactile interaction (t(20) = -2.007, p = 0.058). This result also suggests that negative attitudes toward situations of interaction with robots might be reduced after the interaction.

Perception of Robot Characteristics and the Interaction Outcome
Regarding the robot's age, 57.1% of participants (n = 12) perceived the Pepper robot as a child, 33.3% (n = 7) as an adolescent, and only 9.5% (n = 2) as an adult. Regarding the robot's gender, despite Pepper being introduced as a male robot, only 61.9% of participants (n = 13) believed it was male; 33.3% (n = 7) were unsure, and 4.8% (n = 1) thought it was female. 52.4% (n = 11) based their gender judgement on the robot's voice, 19.0% (n = 4) on its appearance, and the remaining 28.6% on a combination of appearance, voice and the robot's introduction.
Regarding the perceived impression of interaction outcomes, being touched by the robot on the hand was rated by most participants (71.4%, n = 15) as the friendliest condition, whereas pointing to the robot's buttock was rated as the least friendly (38.1%, n = 8). The largest number of participants (38.1%, n = 8) rated touching the robot on the hand as the most comfortable condition; the least comfortable condition was being asked about the buttock (38.1%, n = 8). The most arousing conditions were the robot-initiated actions, i.e., being touched by the robot on the hand and being asked by the robot whether it was OK to touch the buttock (both 23.8%, n = 5); the least arousing condition was pointing to the robot's hand (28.6%, n = 6) (Table 4). Informal individual interviews conducted after the experiment revealed mixed reactions towards the overall experience of touch interactions with a humanoid robot. Some participants expressed positive reactions (e.g., "I was positively surprised"; "I was curious what the robot was going to do next"; "I felt I was with somebody in the room"), whereas others felt the robot's presentation lacked spontaneity (e.g., "The robot was too rigid"; "The robot acted kind of weird"; "It would be nice if the robot can move the mouth or the eyelids while speaking"). Unfortunately, a few participants (n = 4) reported some difficulty in understanding what the robot was saying, either because of the mechanical voice or because they could not clearly make out the word "buttock". Concerning the robot seeking permission to touch their buttock, 81% (n = 17) said "yes" and 19% (n = 4) were unsure how to answer. For those who were unsure, the common explanation was: "It was strange. I was surprised that the robot could ask such a question."
Two participants asked the robot to repeat the question ("Can you please repeat?"), in which case the robot's pre-programmed answer (i.e., "Thank you for letting me know") was clearly out of place. A majority of participants (76%, n = 16) reported feeling a greater sense of arousal when the robot initiated the actions than in the conditions where they touched the robot.

Summary of Results
This study reported preliminary findings of an exploratory experiment detailing how bidirectional social touch (both touching and being touched by a humanoid robot) affected human physiology and subjective impressions of the interaction. The main findings were as follows:
1. Regarding average SCRs, pointing to and touching a humanoid robot, as well as being touched by it (or asked permission to touch), led to similar physiological activation patterns, with no interaction effect of touch style and body part.
2. Peak analyses of the skin conductance signal revealed that (a) the peak rise time was shorter in the being touched/asked permission to touch condition than in the touching condition, and shorter in the hand condition than in the buttock condition; (b) no reliable differences between conditions were found in the peak recovery times; and (c) the peak amplitude was larger in the touching condition than in the being touched/asked permission to touch condition.
3. Although participants' overall attitude toward robots did not change after the touch interaction, participants became less negative toward certain aspects of robots (Item 2, contributing to the subscale of social influence of robots, and Item 8, contributing to the subscale of situations of interaction with robots) after interacting with the robot for approximately 10 min.
4. Regarding subjective evaluation of the robot's characteristics, a majority of participants (90%, n = 19) perceived the robot as a child or an adolescent, with only 10% (n = 2) considering it an adult. Despite our explicitly gendered introduction, only 62% (n = 13) thought the robot was male, 33% (n = 7) were unsure, and 5% (n = 1) thought Pepper was female.
5. Regarding the perceived impression of the interaction outcome, the hand condition was rated by the largest number of participants as most friendly (n = 15) and most comfortable (n = 8), and the robot-initiated conditions (where the robot either touched the human hand or asked permission to touch the human buttock) were considered by the largest number of participants (n = 5 for each condition) to be most arousing.
6. 81% of participants (n = 17) said "yes" to the robot seeking permission to touch their buttocks, and 19% (n = 4) were unsure how to answer the question.
In comparison to the previous report by Li et al. [53], which inspired the present experiment, the major methodological differences in our study were the addition of a robot-touching-human condition, the inclusion of subjective measures of interaction outcomes, and the use of a non-toy-sized humanoid robot (i.e., Pepper rather than the much smaller Nao). Considering these methodological differences, it is perhaps not surprising that the results observed in this study differ noticeably from those reported by Li et al. [53]. We now consider these results in turn.
First, Li et al. [53] reported that participants showed a higher SCR when touching low accessible body parts of the Nao robot (e.g., buttock) compared to high accessible body parts (e.g., hand), a difference that was not observed when they pointed to the robot. This increased SCR indicated higher physiological arousal, which the authors interpreted as psychological alertness resulting from touching low accessible body parts of a robot, even though the robot was only a machine. The authors did not consider the alternative possibility that humans attribute intentionality and emotions to a humanoid robot, especially when interacting with it on a personal (tactile) level. In the current study, the effect of body part on SCR was not significant (in fact, no reliable effect was found for the other main factor of touch style or for the body part × touch style interaction). It is worth noting that the effect size in Li et al.'s study [53] was larger (η² = 0.10) than in the current study (partial η² between 0.04 and 0.07), and the observed powers (between 0.190 and 0.307) were likely insufficient to detect such small effects. It is therefore possible that the current sample size was too small to detect small-sized effects. According to the statistical software G*Power [24], 42 participants would be needed to detect a moderate main effect and interaction effect (η² = 0.06) using a repeated measures ANOVA with a significance level of α = 0.05 (two-tailed) and a power of 0.80. As this estimate exceeded our available sample size (n = 21), the results must be interpreted in light of these power limitations and the exploratory nature of the study. Apart from refining the experimental procedures, future studies should use a larger sample size to further investigate the average SCRs among different conditions.
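The power reasoning above hinges on converting η² to Cohen's f, the effect-size metric G*Power takes as input. A small sketch of that conversion follows; note that the reported n = 42 additionally depends on G*Power's repeated-measures settings (number of measurements and their assumed correlation), which are not modelled here:

```python
def eta_sq_to_cohens_f(eta_sq):
    """Convert (partial) eta-squared to Cohen's f:
    f = sqrt(eta^2 / (1 - eta^2))."""
    return (eta_sq / (1.0 - eta_sq)) ** 0.5

# Effect sizes discussed in the text
f_li = eta_sq_to_cohens_f(0.10)   # Li et al. [53]: f ~ 0.33 (medium-large)
f_mid = eta_sq_to_cohens_f(0.06)  # "moderate" effect assumed for the
                                  # a-priori power analysis: f ~ 0.25
```

Entering f ≈ 0.25 with α = 0.05 and power = 0.80 into G*Power's repeated-measures ANOVA routine yields the sample-size estimate cited above.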
Furthermore, the fact that some participants had difficulty comprehending the word "buttock" in a robotic voice, as indicated in the post-experiment informal interviews, may also have worked against inducing reliable effects. Improved language settings or a conversational familiarisation phase are recommended for future studies. Another problem is that the humanoid robot Pepper does not have a distinctly visible buttock per se, which could have affected the outcome.
Surprisingly, further peak analyses revealed a shorter peak rise time in the hand condition than in the buttock condition, suggesting that participants reacted faster in the hand condition. It should be noted that the hand condition included touching and being touched by the robot on the hand, while the buttock condition included touching the robot's buttock and merely being asked by the robot whether it would be OK to touch the human buttock. This merely potential (not actual) touch event in the buttock condition may have reduced participants' arousal level. Moreover, the hand condition was considered by a majority of participants to be the most comfortable and friendly condition, perhaps inducing more rapid anticipation of the tactile event. Furthermore, no reliable difference in peak amplitudes was found between the hand and buttock conditions. Although the relationship between peak rise time and peak amplitude is complex, our initial findings on peak rise time with high vs low accessible body parts provide useful information.
Second, adding the robot-touching-human conditions was intended as an improvement on the work of Li et al. [53], so that the effects of active vs passive touch styles could be compared. Although the effect of touch style on the average SCRs was not significant, passively being touched by the robot (or asked permission to touch, PTT) had a shorter peak rise time than the active touching robot condition, whereas touching the robot produced a larger peak amplitude than being touched/asked PTT by the robot. In the being touched/asked PTT condition, the robot asked permission to carry out an action on the participant, whereas in the touching robot condition the robot asked the participant to perform an action on the robot. Thus, one possible explanation of the results might be increased psychological alertness due to the unpredictability of how the robot would carry out an action on the human body in the being touched/asked PTT condition. It is evident in the stress literature that uncontrollability and unpredictability contribute to increased anxiety as measured in skin conductance and self-reported anxiety scores [35]. In our study, the touch force, trajectory, and speed were all relatively unknown in the robot-initiated conditions (i.e., being touched/asked PTT by the robot), compared to the touching robot conditions, where people had control over all aspects of the touch action. Furthermore, the qualitative data from the post-experiment informal interviews support this line of argument, in that 76% of participants experienced a sense of arousal in the robot-initiated conditions (i.e., being touched by the robot on the hand and being asked by the robot for permission to touch the buttock area).
So what is the main contribution of this work? As far as we are aware, no studies have so far compared human physiological responses (e.g., skin conductance) to different touch styles between "human-touch-robot" and "robot-touch-human" conditions. Li et al. [53] reported a higher SCR in their "human-touch-robot" vs their "human-point-to-robot" conditions. Willemse et al. [88] found no differences in SCRs between "robot-touch-human" and "robot-not-touch-human" conditions. Both studies compared mean SCRs between conditions without further peak analyses. These differences in methodology and statistical procedures make it difficult to compare the current findings on the physiological effects of touch styles with the available evidence in the literature. The findings from the current study can nevertheless be considered a valuable contribution to the field. For example, we are the first to study the effect of a robot seeking permission to touch a human, on either a public or an intimate body part (as might become relevant in caring and intimate settings). Future studies are needed to further investigate how different touch styles in human-robot interaction affect human physiology.

Theoretical Impact
In human-robot social interactions, social robots typically rely not only on their physical interactions with the user [51] but also on their physical embodiment [47] and other human-like characteristics [5], such as verbal and non-verbal communication cues, to enhance their social presence. For embodied artificial agents to be truly perceived as socially present by their human partners, the agents have to be experienced by the human user as "actual social actors in either sensory or non-sensory ways" ([50], p. 45). Human-robot tactile interactions should therefore be considered part of the repertoire of social exchanges between humans and the robot, where the meaning of the social touch depends strongly on the accompanying verbal and non-verbal social signals [23]. In our study, the interplay among the various social signals displayed by the Pepper robot was somewhat misaligned. On the one hand, its anthropomorphic appearance, its ability to speak, and its eye animation (the eyes were lit in white) during the conversation all likely raised people's expectations of the robot's social presence. On the other hand, the actual perception of the robot included its mechanical voice and the metallic feel of its touch, coupled with a lack of appropriate intonation or facial expressions and mistimed answers to participants' questions. These signals fell short of participants' high expectations. Due to the lack of coherent interplay between these social cues, the touch events we studied may have lost their social meaning, and hence failed to produce the anticipated responses. This concern (and the other considerations raised above) leads us to methodological suggestions for future work, discussed next.

Methodological Considerations and Future Work
Research on human-robot social touch is an emerging area of study [23,53]. In particular, empirical studies on physiological responses to human-robot social touch involving socially low accessible body parts are currently sparse. On the other hand, given the growing number of applications of social robots in our professional, social and private lives, there is an increasing need to identify the multidimensional factors that could play a role in this interaction [88]. These include factors relating to the touch event (e.g., its duration, frequency, and body location), the robot (its appearance, voice, perceived gender and other social cues), the human user (e.g., individual differences and attitudes towards robots), and the context (e.g., the role of the robot and the nature of the relationship). At the moment, no standard research protocols are available. The present study protocol can therefore be considered a valuable contribution to our understanding of human-robot social touch in general. Moreover, the study setup allows for further investigations, either with the same robot and variations in robot voice, touch behaviour and other social cues such as gaze behaviour, or with different types of robots (cf. [14,89]). These contributions can be considered further merits of our work. However, manually demarcating experimental conditions on the video streams potentially increased the noise in our data. This aspect can be improved in future studies by connecting the robot's touch behaviours with the recording software. Although the robot's permission-seeking question "Is it OK if I touch your buttock?" was intended as an important probe into the human's psychological anticipation of a socially unusual request, no actual robot-touch-human-buttock event took place in the interaction, even when participants responded with a "yes". This setup could have confounded our evaluation of the body part factor. A useful next stage of such investigations is to program robot-touch-human-buttock behaviour that is sufficiently natural and safe to be included in the interaction.
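The suggested coupling between robot behaviour and recording software need not be elaborate; a shared, timestamped event log written by the robot control script at each condition boundary would already remove the manual video-demarcation step. A minimal stdlib sketch (class and column names are our own, hypothetical choices):

```python
import csv
import time

class ConditionMarkerLog:
    """Write timestamped condition onset/offset markers so that
    experimental conditions can be demarcated automatically rather
    than by hand on the video stream. The robot control script would
    call mark() at each condition boundary."""

    def __init__(self, path):
        self._f = open(path, "w", newline="")
        self._w = csv.writer(self._f)
        self._w.writerow(["timestamp_s", "condition", "event"])

    def mark(self, condition, event, t=None):
        # e.g. mark("touch_hand", "onset"); if the physiological
        # recorder also timestamps with time.time(), both streams
        # share a common clock for alignment
        self._w.writerow([t if t is not None else time.time(),
                          condition, event])
        self._f.flush()

    def close(self):
        self._f.close()
```

The resulting CSV can then be joined with the skin conductance recording offline, replacing manual frame-by-frame annotation.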
The study results should be interpreted in light of the following limitations. First, a convenience sampling strategy was used, which limits the generalizability of the results to other populations. Individual difference measures were collected in this study, including attitude towards robots, a factor previously found to influence people's perception of the interaction outcome [9,12]. However, the sample size was not large enough to enable meaningful statistical comparisons between subgroups such as gender, sexual orientation, or attitudes toward robots; this must be left for future studies. Having said this, given the exploratory nature of the study, the current sampling strategy and sample size are considered useful for methodological improvements in the next investigation with a more diverse population and a larger sample size.
Second, the humanoid robot Pepper (rather than the small humanoid robot Nao) was purposefully selected because it better approximates human height; the intention was to use its physical presence to enhance its social presence while interacting with human participants. This intended effect is unlikely to emerge when people interact with non-humanoid robots, such as a lifting arm in a care context. However, there seemed to be a discrepancy between the expected and the actually perceived social presence of the robot, largely due to misaligned social cues, which may have complicated the interpretation of the current results. In future studies, incorporating human characteristics into embodied agents to enhance their social presence should be attempted with special care when designing human-robot social touch experiments. In particular, we wonder how other communicative cues, such as gaze behaviour, influence impressions of social touch interactions, as indicated in previous work [42]. Many studies have demonstrated that a higher social presence positively affects people's perception of a robot [41,51] as well as their relationship with the robot [31,70]. In the context of social touch, further investigation of the relationship between social presence and touch responses would be beneficial, for example, whether the perceived social presence of the robot is a prerequisite for eliciting human (physiological) responses to touch, and to what extent social presence must be experienced by the human user in order to elicit such responses. Moreover, the surface material of the robot can influence the touch experience and its evaluation [82] and should be considered in future work.
Third, the touch interaction in our study took place in a lab-based environment, where the role of the robot was not specified. The intended function of the touch was thus open to participants' interpretation. This and other contextual factors will influence people's responses to touch. Chen et al. [9] found that the perception of robot touch (both enjoyability and favourability) depended on the robot's verbal warning and the function of the touch. Dougherty et al. [20] explored how touch may be used to induce trust in an android-based business scenario. Touch is an often overlooked channel of communication, and sometimes even more powerful and persuasive than language. Future studies can benefit from investigating further how the perceived function of touch influences physiological responses. Moreover, the perceived gender and age of the robot could also affect physiological responses. Generally speaking, the design of future studies on human-robot social touch interactions should take contextual and individual difference factors into consideration [22,23]. One example of such work is facial character analysis [30]. Clearly, humans' attitudes towards humanoid robots, and the emotions that result, will influence their interpretation of tactile interactions.
Fourth, some specific touch parameters, for example touch force [3] and touch amount [78], which were previously found to be important factors, were not investigated in the present study. Furthermore, the physical qualities of human and robot touch differ [28], which was also not addressed in this exploratory study. Future studies should explore the relative roles of these parameters in human-robot social touch and how they affect human physiological responses. Finally, a genuine control condition with human-human touch interaction should be incorporated into future designs to provide a reference for interpreting effects.

Practical Implications
Perhaps the most important message from these initial findings is that human physiological responses to bidirectional social touch interactions with a humanoid robot can be reliably measured in skin conductance. Despite the challenges in making coherent interpretations of the current findings in light of the available evidence in the literature, these preliminary findings can be seen as a cautious message to those who work in robot design and in applications of social robots to take account of possible physiological responses in humans. This is particularly relevant to domains such as healthcare, education and entertainment where human-robot physical interactions will invariably involve different body parts and where quality of care may be improved by optimizing touch experiences.

Conclusions
This study investigated physiological as well as subjective responses to touch interactions with a humanoid robot. We found that humans systematically responded to such tactile stimulation with changes in skin conductance, a measure that signals emotional arousal and engagement. Our exploratory study is the first to report both physiological and psychological consequences of a robot seeking permission to touch a human on an intimate body part. Further work on social touch between humans and humanoids can have implications for robot design in all domains involving tactile interactions, such as caring and intimacy.

Appendix: Monologue Text and Voice Settings
Monologue text: "Hello, my name is Alex. I am a male humanoid robot. Welcome to this touch experiment. At the experiment, I will ask you to touch my body. Sometimes I will ask you to point to my body, and sometimes I will touch your body. When I ask you to touch me, please touch me with your dominant hand; and when I ask you to point at me, please point at me with your dominant finger; please keep your other hand with the sensor. OK, let's get started."
Voice setting: The voice settings in the software Choregraphe version 2.5.5 were controlled by two parameters (voice shaping and voice speed). Voice shaping regulates the tone and pitch of the voice and was set at 50% (range 50-150%; the lower the value, the deeper the voice). Voice speed was set at 75% (range 50-200%).

Compliance with Ethical Standards
Conflict of interest The authors declare that they have no conflict of interest.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.