Abstract
The outbreak of COVID-19 precipitated the use of service robots in customer-facing services as a replacement for employees in order to avoid human-to-human contact. However, this development has not resolved the debate as to whether robots should be characterized with gender attributes or simply be genderless. This study explores whether endowing a robot with gender attributes makes it more acceptable as a service provider among stated men and women. To this end, an experiment was conducted at a public fair in which a gendered robot simulated the provision of a service to customers, offering them advice, hints, and messages of encouragement to help them complete a eudaemonic puzzle. A parsimonious version of the Almere model was used to estimate acceptance of the technology. The findings reveal that for both stated men and women, the main drivers of acceptance of the female-coded robot are perceived usefulness and social influence, although women attach greater importance to social influence. For the male-coded robot, perceived usefulness and social influence are the main arguments for women, while for men they are enjoyment, perceived usefulness and, negatively, ease of use. In addition, different indirect effects between stated sexes are also identified. In summary, men and women consider different factors when accepting robots of each gender.
1 Introduction
The outbreak of COVID-19 boosted the design, development, and implementation of social robots in service organizations and companies (hospitals, social care centres, nursing homes, hospitality, and education, among others) across 35 countries, with China, the USA and Thailand standing out [1, 2]. This development was driven by the fact that the delivery of services, unlike the production of goods, requires high human–human interaction [3], and it was essential for this to be avoided to curb the spread of the virus [2]. Social robots not only replaced human employees in high-contact tasks, but also helped to relieve them from having to perform lower-priority tasks in an emergency environment. For example, in hospital receptions, robots provided relatives with information on the status of patients without the need for human contact, in care services they offered an online connection between patients and relatives, and in companionship services they kept patients entertained with games and other activities [1, 2]. However, commercial social robots still have many shortcomings, and much research will be required before they can be equipped with sufficient skills to replace human employees [4, 5]. For example, Pinillos et al. [4] tested a bellboy robot in a hotel reception and detected several shortcomings, including navigation errors that prevented it from reaching its charging point, collisions with hotel staff causing items to be dropped, and communication errors because the voice recognition did not work properly in noisy environments, meaning customers had to use the touch screen to ask questions.
Social robots consist of hardware (physical structure including sensors and start-up mechanism), a control architecture (navigation, human–robot communication, recognition software, etc.) and an application system (specific software to perform specific tasks) [4, 6]. Most commercially available robots, apart from the physical structure, come equipped with a control architecture that allows them to perform a limited set of tasks. If a service needs to be delivered, such as helping an elderly customer to transfer money at an ATM, an application must be programmed and integrated into the control architecture (e.g., ARI, TIAGo robots) [7, 8].
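For illustration only, the following minimal sketch (our own, written in Python; the interface names such as say_text and subscribe are hypothetical placeholders rather than the actual API of TIAGo, ARI or any other commercial platform) shows how a task-specific application typically sits on top of the control architecture and consumes only the interfaces the latter exposes:

```python
# Hypothetical sketch of an application layered on a robot's control architecture.
# The interface names (say_text, subscribe) are illustrative, not a real robot API.

class AtmAssistanceApp:
    """Task-specific application: guiding a customer through an ATM transfer."""

    def __init__(self, control_architecture):
        # The control architecture wraps navigation, speech and perception.
        self.robot = control_architecture

    def start(self):
        self.robot.say_text("Hello, I can help you transfer money.")
        # React to events reported by the perception/recognition layer.
        self.robot.subscribe("screen_touched", self.on_step_completed)

    def on_step_completed(self, step):
        # Application-level logic decides what to say or do next.
        self.robot.say_text(f"Step {step} completed. Let's move on to the next one.")
```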
Furthermore, if the robot design is anthropomorphic in shape and incorporates a multimodal communication system, combining verbal (text-to-speech software) and non-verbal (changes in facial expressions) signals, the mere shape of the robot together with its emission of systematised signals can lead users to infer that it is somewhat human-like and, moreover, that it is gendered [9]. Whether intentional or not, any signal, such as morphology, name, tone of voice or behaviour, is sufficient to attribute gender to the robot [10, 11]. Evidence has also shown that this inference that the robot is gendered can help to make its presence more accepted in certain services [3, 12], for example by using robots characterised as male for security tasks, and as female for caring for patients and the elderly [12, 13]. However, Robertson [9] has criticised this assignment of gender to robots, claiming that it is contributing to the perpetuation of sexist stereotypes.
Therefore, given the ease with which customers attribute gender to anthropomorphic robots, should service robots be deliberately designed to be both human-shaped and gendered? Answering this twofold question requires an understanding of the dual nature of service delivery, for it involves generating functional value (derived from solving the customer’s problem) and relational value (derived from the socio-emotional and affective bond that the interaction has generated) [3]. At the same time, in a competitive market, it is relational value that generates competitive advantage, and, in service robots, this occurs when the robot resembles the human form and can engage customers in conversation to such an extent that socio-emotional bonds are generated [3]. Indeed, service companies often assign mechanical-looking robots to perform functional tasks (e.g., Roomba, medicine dispensing robots, etc.), and human-looking robots with conversational skills to perform relational tasks, as the latter, which can provide social cues, tend to generate more positive feedback from customers [14, 15]. Although the use of humanoid robots for conversing may initially be seen as a waste of resources, because cheaper devices such as mobile phones and chatbots can perform the same task more economically, evidence suggests that people prefer to converse with a physical, humanoid being rather than a virtual one [16]. For example, in a study comparing chatbots with robots, the former’s lack of physical presence has been viewed as a handicap for their future development [17].
For a robot to appear human-like, it must not only have a human shape (embodiment), but also the ability to act and communicate like a human [18]. While the replication of human form, as exemplified by robots such as ERICA or NADINE [19], is fairly easy, replicating human movement, communication and autonomy is a tremendously difficult challenge for both robot designers and service providers [8]. This discrepancy in developmental levels and, consequently, between form and behaviour underlies the “uncanny valley” hypothesis [20], which has received extensive empirical attention [21, 22]. This discordance is also studied in the field of consumer behaviour through the cognitive dissonance produced by the mismatch between the expectations derived from a robot’s human form and the observed clumsiness [18]. To mitigate these negative responses, designers have turned to less realistic designs, either by highlighting the mechanical nature of robots (reminding users that they are simple machines) or by using simplified human forms to encourage consumers to expect simpler interactions [21, 22].
While the arguments gathered from the literature suggest that the design of humanoid social robots is appropriate for delivering Human-Robot Interaction (HRI) services, the second part of the question, whether they should incorporate gender attributes, is less clear and remains open to debate. Given the heterogeneity of consumers that make up the market for services, different degrees of acceptance of gendered robots can be expected among diverse market segments (divided by sex, gender, age, technological literacy, etc.) [23].
In marketing, it is common to use the sex or gender identity of consumers as a segmentation criterion [24]. Sex is considered to be determined by the biological characteristics (hormonal and brain structures) that distinguish human males from females, while gender identity depends on the degree to which individuals view themselves as masculine or feminine [25]. However, some of the literature disagrees with this view of sex as biological, predetermined and binary, one example being Fausto-Sterling, who claims “that labelling someone a man or a woman is a social decision” [26, p.3]. Furthermore, Montañez [27] pointed out that the reality is much more complex, since the determination of biological sex involves not only anatomy, but also an intricate choreography of genetic and chemical factors that develop over time, and which mean that the gender with which an individual identifies is not always aligned with their biological sex at birth.
Despite criticism, biological sex, captured in the form of stated sex, remains one of the most widely used criteria for market segmentation, as people who state that they are men or women manifest different preferences, tastes and purchasing behaviours [24], are easily identifiable groups, and are large enough to be profitable [28]. This paper, following Nomura’s classification [29], proposes an exploratory hybrid study to analyse changes in attitudes and preferences expressed by stated male and female customers with respect to being served by gendered social robots.
The following research question is proposed:
RQ: Does the fact that a social robot delivering a customer service exhibits gender attributes impact its technological acceptance depending on a person’s stated sex?
This question is appropriate for assessing the adequacy of designing gendered robots to provide services [3]. This research builds on previous studies that (a) assessed the experience of receiving different degrees of assistance from a TIAGo robot to solve a cognitive puzzle [30] and (b) analysed the moderating role that stated sex and rational thinking could play on the intention to use a robot [23].
The contributions of this study are:
- A parsimonious model (in the sense of being compact and focused) for assessing the technological acceptance of services offered by social robots, estimated through the intention to use, is proposed and validated.
- Cross-sectional information is provided, from a hybrid analysis (robot gender by customer’s stated sex) of a sample of 219 participants, on the main drivers contributing to the technological acceptance of a service delivered by a fully autonomous robot.
2 Theoretical Background
This section presents the most relevant literature on the use of social robotics for service delivery (see Sect. 2.1). This is followed by an analysis of the role of gender in social robotics (see Sect. 2.2). Human preferences regarding robots according to sex are then considered (see Sect. 2.3) and the paper concludes with a brief overview of the most used models for evaluating the acceptance of a technology (see Sect. 2.4).
2.1 Social Service Robots
For social robots to deliver quality customer-facing services, they must be able to offer instrumental support (help to solve customers’ problems) [31], emotional support (manifest feelings of compassion in adverse situations and happiness in favourable situations) [32], and display autonomy, so that consumers accept them as peers and interaction partners [33]. Robotic designs that can deliver these benefits can be effectively integrated into service organizations, performing tasks that were previously performed by human employees, thus helping to relieve them of some of their workload so they can spend more time on higher-value tasks [16].
However, for a commercial robot to provide emotional support and perform useful tasks, designers need to determine what these tasks will be, and program and install them as an Artificial Intelligence (AI) application. Also, for a robot to interact with customers in a more natural way, social intelligence protocols need to be installed that are triggered both when the data collection subsystem is initiated (e.g., when the robot asks a customer something or vice versa) and when the response is triggered (e.g., when it explains the solution) [34].
The goal of providing social robots with sufficient skills to offer customer-facing services is closely related to the goals of social robotics, which also involve finding solutions for the vast complexity of human-robot relationships, e.g., by developing simpler and more intuitive interfaces [35]. Although it was initially believed that designs with human forms would be perceived as more sociable, as they inspire greater trust and thus foster HRI [36, 37], designs with simplified anthropomorphic features are now more popular as, despite the advances in AI, it is still much easier to design human-like forms in animatronics than it is to reproduce the way humans naturally act, think and communicate.
Therefore, the expected trend in the design of the next generation of social robots is for humanoid forms to be gradually matched to their AI and social intelligence capabilities. However, it is still unclear whether they should incorporate gender attributes [18, 22, 38], especially as the use of gendered social robots to perform gender-stereotyped tasks in service enterprises may raise ethical concerns.
2.2 Humanoid and Social Robots with Gender Attributes
Both the designers and marketers of new technologies have made an enormous effort to endow them with human attributes, for example by assigning names, gender, and human voices to autonomous vehicles [39] or names, human language, and verbiage to chatbots (Amazon Alexa) [40]. It seems that humanizing these devices increases trust, the perception of competence and acceptance of their failures [39]. Similarly, the HRI literature suggests that humanising social robots (embodiment, behaviour, and speech) endows them with a social presence that means they can interact more naturally with consumers [41]. However, while some authors find that consumers prefer to interact with social robots that exhibit humanoid features rather than mechanical-looking or zoomorphic ones [42, 43], others suggest that making robots resemble humans is undesirable [44, 45].
The literature on service robots shows that certain morphologies are more suited to some tasks than others [3]. Goetz et al. [46] found that people preferred robots with a human-like appearance for sociable tasks, and a machine-like appearance for less sociable tasks. For example, in hotels, humanoid robots are preferred for concierge tasks [47], while mechanical-looking designs are used for security tasks [37]. This line of thought also affects the incorporation of gender attributes into the design of social robots. Although social robots were initially gender-neutral or male-coded, designs gradually shifted towards female-coded robots, often called “fembots” [9]. The coding of robots with gendered attributes only requires small changes in morphology, tone of voice, behaviour or name [10, 11]. Again, evidence has been gathered that manipulation of a robot’s gender makes it more suitable for some tasks than others. For example, a robot with a male voice was considered more useful and acceptable for safety tasks than one with a female voice [13], and even the addition of a cliched accessory, such as an apron, made stereotypically female tasks such as cleaning or wiping tables more acceptable [48]. These practices have been criticized in HRI for reproducing gender stereotypes [29], which are “beliefs about the attributes and behaviours considered appropriate for males and females in a given culture” [43, 49]. These not only have a descriptive but also a normative function, i.e., regarding how different genders should behave [11]. Furthermore, there has also been criticism that the full variety of gender expressions and sexed bodies has not been considered [9, 26, 27, 50]. Supporting this line of argument, some authors question the need to design robots with gendered attributes [10], as the main argument for appraising a social robot should be what it can do and not its gender [51].
However, there are still significant groups of consumers who express preferences for robots coded with gendered attributes, believing that this can help to make them feel more comfortable, and boosts the effectiveness and quality of their output [11]. For example, in the domestic environment, a male-coded robot may be perceived as a threat, so female-coded robots are preferred [52]. Also, among older age groups, the use of robot gender cues may contribute to improved acceptance and greater appreciation of the experience [53]. For example, in a qualitative study with a small sample of elderly people, the majority preferred a nursing task to be performed by a female-looking robot [12].
Although Schiebinger et al. [54] warned that as soon as a robot is gendered, stereotypical beliefs are triggered, this does not preclude the possibility of conducting studies on the effect at a specific point in time. As the social environment is gradually changing, the moment the robot is designed will be conditioned by the dominant gender norms, identities and relations of the time, the important thing being that they provide insights and kindle debate. As suggested by Suppe [55], a central aim of science is to provide knowledge about what the world really is, and it is also essential for there to be correspondence between theoretical propositions and observable reality.
2.3 Men’s and Women’s Consumer Preferences
In order to design services to be provided by social robots, it is not only necessary to adequately design the robotic equipment (morphology, gender, AI system and social intelligence protocols) but also to profile the target audience for which the service is intended (i.e., their level of trust in technologies, age, gender identity and sex, among others) [23, 56]. These profiles are configured after segmenting the market [24].
One of the most widely used variables for segmenting the market is stated sex, although gender identity (a self-attributed and more nuanced variable) has recently begun to be considered too. Both play a major role in the design of products, services, and marketing campaigns [24]. For example, the Coca-Cola Company designed, promoted, and launched two products with almost identical ingredients, Coca-Cola Zero and Diet Coke, the former targeting the male market segment and the latter the female one [24]. However, social changes, which are leading to a reduction in stigma and discrimination against people whose gender identities do not conform to the male/female binary [57], together with the greater media presence of these groups, are leading companies to develop different segmentation and positioning policies. Thus, some companies that traditionally offered unisex products have opted to design new sex-targeted products (e.g., Lego Friends) and, conversely, companies that traditionally segmented by sex now focus on gender-neutral products (e.g., Target) [24, 58].
The main reason why marketers use stated sex to segment the market is grounded on the selectivity hypothesis [59], according to which people who declare themselves to be men or women manifest different preferences and tastes, and find different types of images, shapes and commercial stimuli attractive [24]. This is due to a propensity of men and women to process information differently [24, 60]. While women tend to process the data captured by their senses, particularly sight, in an integral manner, using the interrelationships between the different individual elements to compose a complete image, men tend to process data selectively, focusing their attention on specific elements and basing their overall assessment on them [61]. This means that women’s tendency to process information holistically makes them more likely to notice, for example, harmony between colours, to prefer low contrast in colour combinations and, ultimately, to prefer more harmonious, fluid, and rounded product and packaging designs [59, 62]. Conversely, men’s propensity to process information more selectively leads them to prefer sharper colour contrasts, as well as designs with angular shapes and straight lines [59].
These differences relate not only to products but also to the evaluation of services provided by self-service machines and social robots. For example, in a study that assessed the quality of service provided by ATMs, the authors found significant differences in men’s and women’s perceptions of ease of use [63]. Similar results were obtained in studies with social robots. For example, Schermerhorn, Scheutz and Crowell [64] compared men’s and women’s perceptions of the humanness of a robot, and the results reflected a greater tendency among men than women to perceive it as human. As for the degree of acceptance of gendered robots by the different stated human sexes, there is no consensus. Wang and Young [65] showed that men are more likely to express a positive attitude towards female-coded robots than male-coded robots, and Siegel, Breazeal, and Norton [66], in a museum experiment, collected evidence that visitors tended to rate the robot of the opposite gender as more credible, trustworthy, and attractive than the robot of the same gender. In contrast, Eyssel et al. [10] found that users expressed a greater preference for robots of the same gender than of the opposite one.
Recent findings propose the existence of at least two levels of consumer involvement in information processing and derived responses, which are highly correlated with stated sex and gender identity [24, 54, 67]. Nickel et al. [24] suggest the use of stated sex as a criterion for segmenting markets that involve automatic, impulsive, or convenience-based decision processes (usually with time constraints) and, conversely, gender identity for markets that require more reflexive information processing (the purchase of speciality or preferred products), and that contribute to identity shaping or status signalling.
2.4 Technological Acceptance of Social Robots
For more than a decade, the models used to study and predict the acceptance of new technologies (personal computers, smartphones, etc.) have been adapted to the study of social robots [68]. Heerink et al. [69], the pioneers in this field, proposed an eclectic model, which they called Almere, that combines factors from different sources: the Unified Theory of Acceptance and Use of Technology (UTAUT) [70], UTAUT2 and the social robotics literature [69]. This model was designed to predict the intention to use social robots among a sample of nursing home residents and, following several analyses, took the form of six direct and four indirect precedents. It was validated by conducting a series of experiments with various robotic devices: a voice-controlled iCat, a RoboCare in video format, an iCat with a touch screen and Steffie, developed as part of an informational website. They mainly used a Wizard of Oz (WoZ) scenario, in which the robot was operated by a research assistant. Consequently, the results of the combined data gathered from four robotic applications were very general and the use of a WoZ setup conveyed the feeling that these robots had reached a degree of sophistication that was far removed from reality [18]. In later research, generally analysing a single robotic device, more simplified versions were proposed. Examples include Fridin and Belokopytov [71], who considered a technological acceptance model made up of three direct and three indirect precedents; Liu et al. [72], who replicated UTAUT with its four direct precedents; and Graaf et al. [73], who used a simple model composed of five direct precedents but analysed all their interactions.
In this study, a parsimonious adaptation of the Almere model is proposed, as the use of simple models is advisable for evaluating early HRI experiences, given that consumers have not yet formed a judgement and only appreciate a small number of relevant factors [23]. The proposal consists of four direct (perceived usefulness, perceived ease of use, perceived enjoyment, and social influence) and two indirect (perceived adaptability and perceived sociability) precedents of the intention to use a social robot (Fig. 1 illustrates the model). From the original Almere model, two direct precedents have been discarded: attitude and trust. Attitude was removed because in Davis’s [74] TAM model perceived usefulness and ease of use were considered precedents of attitude [68]; therefore, considering them both in parallel, as in the Almere model, is akin to measuring them twice. Trust has also been ruled out for three reasons: first, robots with human-like characteristics are precisely the ones that generate the most trust [21]; second, the experiment was conducted in a public setting where the presence of other people reduces the perception of danger to personal integrity [22]; and third, this factor had no significant effect on the Almere model [69]. Two indirect precedents, anxiety and social presence, were also ruled out, given the characteristics of the robot and the public setting where the HRI was experienced [21, 22].
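For readers who prefer an explicit specification, the direct part of the proposed model (Fig. 1) can be written as a set of linear equations; the two indirect equations below are one plausible reading, consistent with the regressions reported in Sect. 4.2 (standardized constructs, intercepts omitted):

```latex
\begin{aligned}
\mathrm{ITU}  &= \beta_1\,\mathrm{PU} + \beta_2\,\mathrm{PEOU} + \beta_3\,\mathrm{PENJ} + \beta_4\,\mathrm{SI} + \varepsilon_1\\
\mathrm{PU}   &= \gamma_1\,\mathrm{PAD} + \gamma_2\,\mathrm{PEOU} + \varepsilon_2\\
\mathrm{PENJ} &= \gamma_3\,\mathrm{PAD} + \gamma_4\,\mathrm{PS} + \varepsilon_3
\end{aligned}
```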
As noted above, several studies have shown that the attribution of human characteristics to social robots (human-like body shape, gender, conversational skills, etc.) elicits similar social reactions in HRI to those generally observed in human–human interactions [64, 66, 73, 75]. Furthermore, it is relatively easy to gender a social robot, as any manipulation of the tone of voice, name or a physical characteristic is enough to achieve this goal [9, 76]. Given how easily gender is attributed to a robot, in this study we decided to compare male-coded and female-coded robots, discarding the design of a gender-neutral robot as a control element due to the difficulties involved in characterizing and controlling its neutrality. However, although numerous studies have addressed the effect of robot gender on HRI [29], little is known about how gendered robots affect the precedents of technological acceptance of social robots used to deliver customer-facing services to different sexes.
3 Empirical Study
To explore how gendered social robots affect the drivers of technological acceptance of the delivery of customer-facing services, a 2 × 2 between-subjects experiment (robot gender × participant stated sex) was conducted. The fieldwork involved setting up a stand at a trade fair for technological and sustainable products that was visited by thousands of people. To simulate a customer-facing service, a difficult eudaemonic puzzle was used, and the robot acted as an assistant who helped the participants to solve it. The proposed puzzle meets some characteristic requirements of customer-facing services: first, it requires participants to follow a sequence of commands with the risk of getting stuck, as usually happens in complex operations with ATMs [77]; second, the duration of the interaction (about five minutes) is very similar to that of checking into a hotel [78]; and, third, the conversation that the robot establishes with the participant, with advice as to where to find the right token [79] and messages of empathy when the participant gets things right or wrong, is common in customer-facing services [80]. In addition, the intrinsic difficulty of the puzzle means the robot’s help is convenient and justified. Specifically, participants were requested to solve a cognitive puzzle by forming the five-letter name of a Nobel laureate with the assistance of a robot. The social robot applies several degrees of assistance, combining verbal and non-verbal cues, depending on the number of errors made by the user. During the experiment, the robot was able to adopt either a feminine or a masculine role (Fig. 2 shows a participant playing).
3.1 Apparatus
The robot employed in this experiment was a TIAGo, whose original head was replaced with an LCD screen to reproduce facial expressions and thus give it more human-like characteristics.
The robot was programmed with the solution to the puzzle so that it could help users. Specifically, four increasing levels of assistance were defined: “Encouragement”, “Suggesting line”, “Suggesting subset” and “Suggesting solution”, following previous studies [30, 35]. As soon as the user makes a mistake by picking an incorrect token, the robot warns the user of the error. If the user still gets it wrong, the level of assistance is increased, until it finally suggests the correct token on the fourth attempt. All the assistance levels were provided by means of the robot’s speech function.
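The escalation logic can be summarised as a simple mapping from the number of consecutive errors to the level of help provided. The snippet below is an illustrative sketch of that policy (our own reconstruction, not the code actually running on the robot):

```python
# Illustrative sketch of the four-level escalation policy: each consecutive
# error on the same move raises the level of assistance, up to the solution.

ASSISTANCE_LEVELS = ["Encouragement", "Suggesting line",
                     "Suggesting subset", "Suggesting solution"]

def assistance_for(error_count: int) -> str:
    """Return the assistance level after `error_count` consecutive wrong tokens."""
    index = min(error_count, len(ASSISTANCE_LEVELS)) - 1
    return ASSISTANCE_LEVELS[index]

# Example: the fourth consecutive error triggers "Suggesting solution".
assert assistance_for(4) == "Suggesting solution"
```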
As well as providing direct assistance, the robot was also equipped with a back-channel social intelligence protocol [8] called SOCIABLE, which primarily served to naturalise social interaction and convey the impression of interacting with a machine that has a certain degree of intelligence. Thus, when a participant picks up a token, the robot assists him or her with verbal signals, e.g., words, onomatopoeias or short sentences such as “yeah”, “correct”, “hum”, “are you sure?”, and non-verbal signals, e.g., nodding its head or facial gestures, to tell the participant whether the token is correct or incorrect. Finally, the robot congratulates the user on placing the correct token and tries to reassure them if they place an incorrect one.
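A minimal sketch of this back-channel behaviour, under our own assumptions (the gesture identifiers are placeholders, not the robot’s actual animation names), would pair one verbal and one non-verbal cue according to whether the picked token is correct:

```python
# Sketch of SOCIABLE-style back-channel cue selection (gesture names are placeholders).
import random

VERBAL_CUES = {True: ["yeah", "correct"], False: ["hum", "are you sure?"]}
NONVERBAL_CUES = {True: "nod_head", False: "doubtful_face"}

def backchannel(token_is_correct: bool):
    """Return a (verbal, non-verbal) cue pair for the token the participant picked."""
    return random.choice(VERBAL_CUES[token_is_correct]), NONVERBAL_CUES[token_is_correct]
```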
The game board was fully sensorised with RFID technology, which provides highly reliable token detection regardless of lighting conditions and occlusions.
3.2 Modelling Robot Gender
To represent the TIAGo robot’s gender, both verbal (tone of voice) and non-verbal (facial expressions) signals were used. Verbal signals were generated using the Loquendo text-to-speech software, while non-verbal signals were produced by presenting caricatured images on the LCD screen. As well as eye expressions, the male-coded robot also had a moustache (see Fig. 3), while the female-coded robot had long eyelashes (see Fig. 4). These stereotypical features have been shown to be easily identifiable by participants [38], and similar solutions have been used in the previous literature [10, 75].
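In practice, the two experimental conditions can be captured by a small configuration mapping each condition to its verbal and non-verbal cues; the sketch below is hypothetical (the voice presets and image file names are ours, not the actual Loquendo voices or assets used):

```python
# Hypothetical configuration of the two gender conditions (asset names are placeholders).

GENDER_CONDITIONS = {
    "female_coded": {"tts_voice": "loquendo_female_preset",
                     "face_overlay": "long_eyelashes.png"},
    "male_coded":   {"tts_voice": "loquendo_male_preset",
                     "face_overlay": "moustache.png"},
}

def configure_robot(condition: str) -> dict:
    """Return the verbal and non-verbal gender cues for the selected condition."""
    return GENDER_CONDITIONS[condition]
```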
3.3 Sample, Procedure, and Metrics
A total of 223 visitors, aged 18–67 years (Mage = 35 years; 106 women, 113 men, and 4 who declared ‘other’ and were therefore not considered in the study), participated in the experiment, in which the researchers controlled participation by stated sex and age [22].
Although the goal was to study the technological acceptance of social robots that deliver assistance services to customers, a prototype service was evaluated in this preliminary phase [81]. Prototypes are models that simplify the different operations involved in providing a service and are used to detect failures in both tangible and intangible elements, as well as to explore the reactions of different stakeholders, in order to improve the design [81]. This allowed for a sufficiently large sample of real HRI experiences to be obtained in a short time, with the capacity to make estimates of the effects of incorporating male and female codes into the robot.
Visitors, in batches of ten, were invited one by one to do the puzzle with the help of a male-coded or female-coded robot. Upon arrival, participants were informed of the study and asked to sign a consent form. The experimenter would then explain the cognitive puzzle and the assistance that the robot might be able to provide. They then went on to do the puzzle with the robot’s help (approximately 5 min), and on completion they were invited to fill in a questionnaire.
This questionnaire consisted of 26 statements that covered the seven proposed constructs and had to be evaluated on a five-point scale (1 = “totally disagree” and 5 = “totally agree”). The scales were adapted from previous literature. Thus, intention to use (ITU) consisted of three elements: “If the robot was available, I would try to use it”, “If the robot was available, I would try to use it whenever I could in my spare time”, and “If the robot were available, I would sometimes be thinking of using it again” [68, 82]. Perceived Usefulness (PU) consists of three items: “I think the robot is useful for entertainment”, “It would be nice to have the robot for entertainment”, and “I think the robot could be used to entertain me and to do other things”. Perceived Ease of Use (PEOU) is made up of five items: “I immediately learned how to use the robot”, “The robot seemed easy to use”, “I think I can use the robot without any help”, “I think I can use the robot with someone’s help” and “I think I can use the robot if I have good instructions” [68, 69]. Perceived Enjoyment (PENJ) consists of five elements: “It’s fun to talk to the robot”, “It’s fun to play with the robot”, “The robot looks fun”, “The robot seems charming” and “The robot seems boring” [74]. Social Influence (SI) consists of three items: “I think my friends would like me to use the robot”, “I think it would create a good impression if I played with the robot” and “I think people whose opinion I value would look favourably upon me for playing with the robot”. Perceived adaptiveness (PAD) is made up of three elements: “I think the robot could adapt to my needs”, “I think the robot would adapt to what I need at any moment in the game” and “I think the robot will help me when I consider it necessary”. Perceived Sociability (PS) consists of four items: “Talking to the robot is amusing”, “I find the robot pleasant to interact with”, “I feel the robot understands me” and “I think the robot is attentive” [69]. Finally, the participants filled in the classification data in the form of an open-ended question (sex, age, etc.).
To analyse the psychometric characteristics of the scales, the Structural Equation Modelling (SEM) technique, based on variance–covariance matrices and maximum likelihood estimation with the EQS 6.4 software [83], was applied to the full dataset. The four scenarios were then estimated using Ordinary Least Squares (OLS) [84], given the smaller subsample sizes resulting from the segmentation.
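The per-scenario estimation step can be reproduced with standard statistical software. The following sketch (assuming a hypothetical data file with one row per participant and averaged construct scores; the file and column names are ours) mirrors the OLS stage only, not the SEM validation:

```python
# Sketch of the per-scenario OLS estimation (file and column names are hypothetical).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("responses.csv")  # one row per participant, construct means as columns

def estimate_scenario(data: pd.DataFrame, robot_gender: str, stated_sex: str):
    """Fit the direct-effects equation (ITU on PU, PEOU, PENJ, SI) for one subsample."""
    subsample = data[(data.robot_gender == robot_gender) & (data.stated_sex == stated_sex)]
    return smf.ols("ITU ~ PU + PEOU + PENJ + SI", data=subsample).fit()

# Example: Scenario 2 (female-coded robot, stated women); .params gives the betas,
# .rsquared the explained variance.
print(estimate_scenario(df, "female", "woman").summary())
```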
4 Results
4.1 Scale Validation
First, the psychometric characteristics of the scales (dimensionality, reliability and validity) were analysed. As a result, five items were removed, leaving twenty-one items for the seven constructs (three items per construct). The measurements, summarized in Table 1, show that all constructs achieved adequate reliability, and convergent and discriminant validity. Cronbach’s α had values greater than 0.80 for all factors, composite reliability (CR) also obtained values greater than 0.80 (ranging from 0.83 to 0.93) and all items presented adequate convergent validity, since each factor loading exceeded 0.6 and the t-values of each item were significantly high, as recommended by the literature [85]. Discriminant validity was also verified (Table 2), since the square root of the average variance extracted (AVE) of each factor was greater than its correlations with the remaining factors. This means that each construct shares more variance with its own items than with the other constructs in the model [86].
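For reference, the indices reported above follow their standard definitions (this formulation is standard, not reproduced from the original text): for a construct with k items, standardized loadings λ_i, error variances θ_i = 1 − λ_i², item variances σ²_Yi and total-score variance σ²_X,

```latex
\alpha = \frac{k}{k-1}\left(1-\frac{\sum_{i=1}^{k}\sigma^2_{Y_i}}{\sigma^2_{X}}\right),\qquad
\mathrm{CR} = \frac{\bigl(\sum_{i}\lambda_i\bigr)^2}{\bigl(\sum_{i}\lambda_i\bigr)^2+\sum_{i}\theta_i},\qquad
\mathrm{AVE} = \frac{\sum_{i}\lambda_i^2}{\sum_{i}\lambda_i^2+\sum_{i}\theta_i}
```

The discriminant validity check in Table 2 thus corresponds to the Fornell–Larcker criterion: the square root of each construct’s AVE should exceed its correlations with every other construct.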
4.2 Four Scenarios
Once the scales had been validated, the sample was divided into four subsamples according to the gender of the robot and the stated sex of the participants, and from each subsample the parsimonious adaptation of the Almere model was estimated using OLS. Thus, four models were estimated: Scenario 1 (female-coded robot and stated men), Scenario 2 (female-coded robot and stated women), Scenario 3 (male-coded robot and stated men) and Scenario 4 (male-coded robot and stated women). Table 3 shows the weights of the factors that explain the dependent variable, and the variability explained by each model is given by its R2 value (Fig. 5 presents the significant relationships in a path diagram). The R2 values are acceptably high given the size of the subsamples: R2 = 0.46 for Intention to Use in Scenario 1, R2 = 0.59 in Scenario 2, R2 = 0.36 in Scenario 3 and R2 = 0.51 in Scenario 4.
First, we analyse the results of men’s evaluations of the experience of receiving help from a robot and their intention to continue being served by social robots (Scenario 1, female-coded robot; Scenario 3, male-coded robot). In the male evaluation of the female-coded robot, the intention to continue using the robot is basically explained by two drivers: perceived usefulness (β = 0.39, p < 0.05) and social influence (β = 0.39, p < 0.05), while the other direct effects did not reach significant values. On the other hand, when the robot is male-coded (Scenario 3), the perceived enjoyment of interacting with it (β = 0.32, p < 0.05) is slightly higher than the perceived usefulness (β = 0.30, p < 0.05), while the perception that the robot is easy to use is markedly negative (β = -0.36, p < 0.05). Although the analysis is exploratory, the results reveal different motivations for continuing to be served by social robots. Men therefore consider that the experience of being served by a female-coded robot will earn social support from their circle of friends, while a male-coded robot is perceived as entertaining but difficult to use. Regarding indirect effects, the differences between male-coded and female-coded robots are smaller among men. The perceived usefulness of the experience with a female-coded robot (R2 = 0.31) is explained by the perception that it adapts to their needs while trying to solve the puzzle (β = 0.48, p < 0.05). The same occurs with a male-coded robot (R2 = 0.41) and is also explained by its perceived adaptability (β = 0.57, p < 0.05). Another indirect relationship is the feeling that interaction with a female-coded robot seems entertaining (R2 = 0.33), as long as it is perceived to be adaptable (β = 0.33, p < 0.05) and sociable (β = 0.33, p < 0.05). Small differences are obtained when the robot is male-coded (R2 = 0.33), and thus the perception of being entertaining is explained by its adaptability (β = 0.39, p < 0.05) and, to a lesser extent, by perceived sociability (β = 0.29, p < 0.05).
When the service provided by the robot is valued by women (Scenario 2, female-coded robot; Scenario 4, male-coded robot), the intention to use a female-coded robot is explained, to a greater extent, by its social influence (β = 0.59, p < 0.05) and, to a lesser extent, by its perceived usefulness (β = 0.22, p < 0.05). The other direct effects did not reach significant values. With the male-coded robot, however, the weight of the variables is reversed, with a greater weight being assigned to perceived usefulness (β = 0.43, p < 0.05) and less weight to social influence (β = 0.36, p < 0.05). This is an interesting result, since male-coded robots are perceived to be more utilitarian than female-coded ones, but the latter are attributed a greater social function. In terms of indirect effects, for women the differences between male-coded and female-coded robots are significant. The perceived usefulness of the female-coded robot (R2 = 0.49) is explained simply by the feeling that the robot adapts to their needs (β = 0.66, p < 0.05), but in the case of the male-coded robot (R2 = 0.26), in addition to the perception that it adapts to their needs (β = 0.37, p < 0.05), it is also explained by the perceived ease of use (β = 0.26, p < 0.05). Regarding perceived enjoyment, in the case of the female-coded robot (R2 = 0.52), this is explained by its sociability (β = 0.59, p < 0.05) and, albeit to a lesser extent, by its adaptability (β = 0.21, p < 0.05). In the case of the male-coded robot (R2 = 0.32), however, the weight of the factors is reversed, with greater importance attached to adaptability (β = 0.35, p < 0.05) than to perceived sociability (β = 0.29, p < 0.05).
These results suggest that the gender of a social robot that provides a customer-facing service is a relevant factor, as it affects the intention to continue receiving the service. Furthermore, neither men nor women revealed the same preceding motivations for using a gendered robot in this context. For men, a service delivered by a female-coded robot, in addition to being useful, also entails social acceptance gained from telling other people about their experience. But, for women, social acceptance is the main factor driving their intention to be served by a female-coded robot. In contrast, the motivations associated with a male-coded robot, in addition to being useful, are its ability to entertain for men, while women also attach importance to its social influence.
5 Discussion and Implications
Although the trend in robot design is towards more human-like forms (e.g., PAL Robotics’ most recent models, TALOS, REEM-C and ARI, have more human-like shapes than TIAGo) [7], until significant advances in AI produce the leap to android designs, humanoid designs will remain the standard for providing customer-oriented services [16, 87, 88]. This research has used a semi-humanoid robot, TIAGo, to explore how gendered social robots influence the customer-oriented service experience of stated men and women, and how it affects their technological acceptance. Moreover, while the debate continues in academia as to whether robots should be programmed to manifest gender [29], the findings of this study offer new theoretical and managerial implications, as well as future avenues for research.
5.1 Theoretical Insights
In this study, an experiment was conducted that suggests that attributing gender to a robot delivering a service affects consumers’ motivations to continue using that robot according to their stated sex. Although there is a large body of evidence analysing how users’ stated sex affects their acceptance of gendered robots, there is no consensus on its effects. While Siegel et al. [66] showed that men express more positive attitudes towards female-coded robots than male-coded robots, Eyssel et al. [10] found the opposite. In this experiment, stated sex was considered as a segmentation criterion, as it is currently one of the most widely used criteria, given that it generates sufficiently large groups for comparison, and it is appropriate for experiences of short duration and little emotional involvement [3, 24]. Gender identity, however, has begun to be used as a criterion to segment the market, albeit with the focus on longer service experiences that require greater consumer involvement [89, 90].
The design of gendered robots is a source of controversy, as it raises ethical issues that must be considered. First, the assignment of gender to a robot immediately triggers gender stereotypes [54] and this may contribute to the perpetuation of outdated gender roles that do not conform to current social norms [9]. Second, it may limit the robot’s ability to interact with people who do not identify with the assigned gender [51]. That is, non-binary gender identities are not considered, nor are other gender expressions and different sexed bodies [9, 26, 27, 50]. Third, endowing robots with gendered attributes may emphasise certain qualities and neglect others, constructing simplistic personalities with stereotypical forms and behaviours [91]. Fourth, it may be viewed as a violation of people’s privacy and autonomy, as a certain way of interacting with the robot is imposed on them [92]. Finally, assigning gender to the robot reinforces the idea that gender is an assignment, a set of attributes that are easily interchangeable, which could lead to a more fluid view of gender [91, 93]. Nevertheless, the appropriateness of using robot gender cues in targeted services for older cohorts should be considered, as it may help improve their acceptance of robots [53].
This exploratory hybrid study, which considered the combination of robot gender and consumers’ stated sex, provides evidence of a more complex process, as customer-facing service delivery differs according to the gender of the robot and the sex of the human user.
In the case of using female-coded robots, men consider both utilitarian motivation and social influence to be equally important for continuing to use the services provided by the robot, while women consider social influence to be the most important factor. In other words, for women, the opinion of their peers, family and friends is more important than the potential benefit to be gained from the HRI. However, when the robot was assigned a male gender, the story changed. For men, this change of gender affected their motivations, whereby social influence became less significant, and the robot came to be viewed more as an object of entertainment. Furthermore, although the utility of the robot remained important, the difficulty of its use appeared as a criticism, something that was not the case when using a female-coded robot. Although in previous studies women have reported lower self-confidence in handling technical equipment [25, 94], in this experience of receiving help from a social robot no significant differences in perceived ease of use were found. This is completely different from what was detected in the assessment of ATM services, where women assigned greater importance to ease of use than men [60]. Only in the case of the male-coded robot did men find it significantly more difficult to use. As for women, their motivation towards the services of a robot also changed completely when it was assigned a male gender, since they perceived it more as a useful tool than as a social influencer.
In short, while women view a female-coded robot as an element of social influence, the male-coded robot is nothing more than a useful instrument. Meanwhile, men view a male-coded robot as a source of entertainment, albeit one that is difficult to handle. The latter results are in line with those obtained by Bryant et al. [67], who showed that the preferred robots for comedy and entertainment activities are basically male, although they did not report differences between market segments. And while the study by Winkle et al. [94] found that its participants tend to perceive robots of their own gender more positively than those of the opposite gender, our results only corroborate this in the case of women using female-coded robots, because men also appear to have a certain preference for female-coded robots. The literature has also proposed that consumers’ expectations of a social robot’s cognitive abilities and utility depend on its assigned gender [10, 95], but our study shows that the consumer’s sex also plays an important role in this assessment [68].
Different indirect effects have also been observed between gendered robots and consumers’ stated sex. While in the case of the female-coded robot perceived usefulness was explained by perceived adaptability for both market segments, in the case of the male-coded robot adaptability remained the relevant factor, but perceived ease of use also mattered for women. This is an interesting result, since while men perceived the male-coded robot as difficult to handle, among women perceived ease of use was precisely the factor that explained its usefulness. Finally, for both segments, the perception of entertainment was explained by the robot’s perceived adaptability in the case of the male-coded robot, whereas with the female-coded robot the main argument was perceived sociability.
In short, it can be concluded that the technological acceptance of a social robot for the provision of customer-facing service experiences [96] is driven by utility, social influence and entertainment criteria based on the gender of the robot and the customer’s stated sex. However, practitioners and researchers should not only understand how certain design features can contribute to the generation of positive or negative experiences in HRI but should also consider the ethical implications of their decisions, especially when it comes to gendering robotic devices.
5.2 Managerial Implications
This study began by presenting the types of operations that social robots must provide when offering customer-facing services to generate memorable experiences for consumers. They must be able to solve customers’ problems and generate a satisfactory interaction of a social-affective nature that will foster customer loyalty [3]. However, service companies and organizations must also consider whether the implementation of social robots will serve a tactical or strategic purpose. The former refers to cases when the aim is merely to replace employees with social robots, while a strategic purpose is aimed at improving service quality, whereby the robot complements the role of employees, improving their well-being as well as that of customers. In other words, a strategic approach involves improving the customer–company relationship and not merely reducing costs, which could help to generate a positive buzz and boost the company’s image [97].
However, the use of gendered robots is generating controversy, as is the case with other commercial products, and this has caused confusion among designers and managers. While some believe that a gendered approach is an important part of the final product design, others believe the opposite, and feel that genderless designs should be pursued [98]. Nickel et al. [24] propose an integrative approach, suggesting that both sex and gender criteria need not be incompatible, but can be used as criteria for product design and market segmentation, depending on the degree of consumer involvement. They suggest that stated sex can be a good segmentation criterion in the design and marketing of products and services in which consumers invest little cognitive and emotional effort in the purchasing process, such as convenience products, impulsive purchases, or short-term services. On the contrary, when it comes to products or services that require greater cognitive and emotional effort, such as speciality and/or novelty goods, or medium or long duration services, gender identity might play a more relevant role in market segmentation [24, 28]. Given the general nature of their proposal, these criteria can be transferred to the design and marketing of products or services provided by social robots.
However, managers will also need to monitor how social values evolve, and the ethical consequences of their decisions. Hence, within the same social environment, specific cohorts, such as older people, may find stereotypically gendered robots more natural in care services, given the entrenched nature of such beliefs in those cohorts, while this gender attribution is rejected by younger cohorts, who are more critical of the reproduction of gender stereotypes [11].
5.3 Limitations
The limitations of this study open avenues for future research. First, it presents the findings of an exploratory study, and further studies with larger sample sizes are needed to corroborate its results. Second, the robot’s assistance was scripted and predefined across four levels of assistance depending on the number of errors made by the participant, and it would be interesting to evaluate a robot capable of tailoring its support to the participants and the effect that this might have on their intention to use it [30, 99]. However, unlike most of the previous studies using WoZ scenarios, in this one the robot provided its assistance in a fully autonomous manner. Third, the sample gathers evaluations from a highly industrialised Western country, Spain, where engagement with gender issues might be higher. The consideration of cultural differences and degrees of technologization and industrialisation could lead to different responses. For example, in a study of self-service technology, two different markets were considered (a collectivist market and an individualistic one) and the findings revealed significant differences between the motivators shaping the UTAUT model in terms of the intention to use this technology [100]. Fourth, in this experiment, two types of signals were used to represent the robot’s gender, namely tone of voice and a cartoonish caricature (a moustache for the male robot and long eyelashes for the female one), which may have conditioned a result that confirms stereotypical values. Experiments with more subtle attributions should therefore be proposed to observe whether the same results are obtained.
Future studies could explore other characteristics of social robot design, such as the ability to display different personality characteristics or signs of cultural awareness [101, 102]. Other contexts and levels of customer-facing services could also be considered.
Finally, all participants stated that it was their first HRI experience, and the findings may have been different in terms of both the main factors and their weights if they had been more accustomed to dealing with social robots. Moreover, although studies on long-term HRI are scarce [103], it would be interesting to know how the results differ in the case of longer experiences.
Data Availability
The datasets generated and/or analysed during the current study are available from the corresponding author on reasonable request.
References
Aymerich-Franch L, Ferrer I (2020) The implementation of social robots during the COVID-19 pandemic. arXiv preprint arXiv:2007.03941. https://arxiv.org/abs/2007.03941
Yang GZ, Nelson BJ, Murphy RR, Choset H, Christensen H, Collins SH, Dario P, Goldberg K, Ikuta K, Jacobstein N, Kragic D (2020) Combating COVID-19—The role of robotics in managing public health and infectious diseases. Sci Robot 5(40):eabb5589. https://doi.org/10.1126/scirobotics.abb5589
Wirtz J, Patterson PG, Kunz WH, Gruber T, Lu VN, Paluch S, Martins A (2018) Brave new world: service robots in the frontline. J Serv Manage 29(5):907–931. https://doi.org/10.1108/JOSM-04-2018-0119
Pinillos R, Marcos S, Feliz R, Zalama E, Gómez-García-Bermejo J (2016) Long-term assessment of a service robot in a hotel environment. Robot Auton Syst 79:40–57. https://doi.org/10.1016/j.robot.2016.01.014
Henkel AP, Caic M, Blaurock M, Okan M (2020) Robotic transformative service research: deploying social robots for consumer well-being during COVID-19 and beyond. J Serv Manage 31(6):1131–1148. https://doi.org/10.1108/JOSM-05-2020-0145
Rautaray SS, Agrawal A (2015) Vision based hand gesture recognition for human computer interaction: a survey. Artif Intell Rev 43(1):1–54. https://doi.org/10.1007/s10462-012-9356-9
PAL Robotics (2021) TIAGo robot features. https://pal-robotics.com/robots/tiago/. Accessed 7 Sep 2021
Andriella A, Huertas-García R, Forgas-Coll S, Torras C, Alenyà G (2020) Discovering SOCIABLE: Using a Conceptual Model to Evaluate the Legibility and Effectiveness of Backchannel Cues in an Entertainment Scenario. In: RO-MAN 2020- 29th IEEE International Conference on Robot and Human Interactive Communication, Naples, Italy, IEEE, pp. 752–759. https://doi.org/10.1109/RO-MAN47096.2020.9223450
Robertson J (2010) Gendering humanoid robots: Robo-sexism in Japan. Body Soc 16(2):1–36. https://doi.org/10.1177/1357034X10364767
Eyssel F, Hegel F (2012) (S)he’s got the look: gender stereotyping of robots. J Appl Soc Psychol 42(9):2213–2230. https://doi.org/10.1111/j.1559-1816.2012.00937.x
Weßel M, Ellerich-Groppe N, Schweda M (2021) Gender stereotyping of Robotic Systems in Eldercare: an exploratory analysis of ethical problems and possible solutions. Int J Soc Robot 1–14. https://doi.org/10.1007/s12369-021-00854-x
Rızvanoglu K, Öztürk Ö, Adıyaman Ö (2014) The impact of human likeness on the older adult’s perceptions and preferences of humanoid robot appearance. In: Design, user experience, and usability. User experience design practice. Third International Conference, DUXU 2014. Heraklion, Crete, Greece, June 22–27, 2014, Proceedings, Part IV. Springer, Cham, pp 164–172
Tay BTC, Park T, Jung Y, Tan YK, Wong AHY (2013) When stereotypes meet robots: The effect of gender stereotypes on people’s acceptance of a security robot. In International Conference on Engineering Psychology and Cognitive Ergonomics, Springer, Berlin, Heidelberg, pp 261–270
Aggarwal P, McGill AJ (2007) Is that Car smiling at me? Schema Congruity as a basis for evaluating Anthropomorphized Products. J Consum Res 34(4):468–479. https://doi.org/10.1086/518544
Jia JW, Chung N, Hwang J (2021) Assessing the hotel service robot interaction on tourists’ behaviour: the role of anthropomorphism. Ind Manage Data Syst 121(6):1457–1478. https://doi.org/10.1108/IMDS-11-2020-0664
Blaurock M, Čaić M, Okan M, Henkel AP (2022) A transdisciplinary review and framework of consumer interactions with embodied social robots: design, delegate, and deploy. Int J Consum Stud 00:1–23. https://doi.org/10.1111/ijcs.12808
Leyzberg D, Spaulding S, Toneva M, Scassellati B (2012) The physical presence of a robot tutor increases cognitive learning gains. In: Proceedings of the 34th Annual Conference of the Cognitive Science Society, Sapporo, Japan, 34. https://escholarship.org/uc/item/7ck0p200
Suchman L (2006) Human-Machine Reconfigurations: Plans and Situated Actions (2nd ed., Learning in Doing: Social, Cognitive and Computational Perspectives). Cambridge University Press, Cambridge. https://doi.org/10.1017/CBO9780511808418
ABOT (2022) The Anthropomorphic Robot Database, accessed 14 September 2022, https://www.abotdatabase.info/collection
Mori M, MacDorman KF, Kageki N (2012) The uncanny valley. IEEE Robot Autom Mag 19(2):98–100. https://doi.org/10.1109/MRA.2012.2192811
Mathur MB, Reichling DB (2016) Navigating a social world with robot partners: a quantitative cartography of the Uncanny Valley. Cognition 146:22–32. https://doi.org/10.1016/j.cognition.2015.09.008
Mende M, Scott ML, van Doorn J, Grewal D, Shanks I (2019) Service robots rising: how humanoid robots influence service experiences and elicit compensatory consumer responses. J Mark Res 56(4):535–556. https://doi.org/10.1177/0022243718822827
Forgas-Coll S, Huertas-Garcia R, Andriella A, Alenyà G (2021) How do consumers’ gender and rational thinking affect the Acceptance of Entertainment Social Robots? Int J Soc Robot. https://doi.org/10.1007/s12369-021-00845-y
Nickel K, Orth UR, Kumar M (2020) Designing for the genders: the role of visual harmony. Int J Res Mark 37(4):697–713. https://doi.org/10.1016/j.ijresmar.2020.02.006
Meyers-Levy J, Loken B (2015) Revisiting gender differences: what we know and what lies ahead. J Consum Psychol 25(1):129–149. https://doi.org/10.1016/j.jcps.2014.06.003
Fausto-Sterling A (2001) Myths of gender: Biological theories about women and men. Basic Books, NY
Montañez A (2017) Beyond XX and XY. Sci Am 317(3):50–51. https://doi.org/10.1038/scientificamerican0917-50
Wolin LD (2003) Gender issues in advertising – an oversight synthesis of research: 1970–2002. J Advert Res 43(1):111–129. https://doi.org/10.2501/JAR-43-1-111-130
Nomura T (2017) Robots and gender. Gend Genome 1:18–26. https://doi.org/10.1089/gg.2016.29002.nom
Andriella A, Torras C, Alenyà G (2020) Short-term Human–Robot Interaction adaptability in real-world environments. Int J Soc Robot 12:639–657. https://doi.org/10.1007/s12369-019-00606-y
Williams T, Johnson T, Culpepper W, Larson K (2020) Toward forgetting-sensitive referring expression generation for integrated robot architectures. arXiv preprint arXiv:2007.08672.
Gelbrich K, Hagel J, Orsingher C (2021) Emotional support from a digital assistant in technology-mediated services: Effects on customer satisfaction and behavioral persistence. Int J Res Mark 38(1):176–193. https://doi.org/10.1016/j.ijresmar.2020.06.004
Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166. https://doi.org/10.1016/S0921-8890(02)00372-X
Złotowski J, Proudfoot D, Yogeeswaran K, Bartneck C (2015) Anthropomorphism: opportunities and challenges in human–robot interaction. Int J Soc Robot 7(3):347–360. https://doi.org/10.1007/s12369-014-0267-6
Weber J (2005) Helpless machines and true loving care givers: a feminist critique of recent trends in human-robot interaction. J Inf Commun Ethics Soc 3(4):209–218. https://doi.org/10.1108/14779960580000274
Broadbent E, Kumar V, Li X, Sollers J, Stafford R, MacDonald B, Wegner D (2013) Robots with display screens: a robot with a more humanlike face display is perceived to have more mind and a better personality. PLoS ONE 8(8):e72589. https://doi.org/10.1371/journal.pone.0072589
Li D, Rau PP, Li Y (2010) A cross-cultural study: effect of robot appearance and task. Int J Soc Robot 2(2):175–186. https://doi.org/10.1007/s12369-010-0056-9
Kalegina A, Schroeder G, Allchin A, Berlin K, Cakmak M (2018) Characterizing the Design Space of Rendered Robot Faces. In: 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp 96–104
Waytz A, Heafner J, Epley N (2014) The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. J Exp Soc Psychol 52:113–117. https://doi.org/10.1016/j.jesp.2014.01.005
Hoy MB (2018) Alexa, Siri, Cortana, and more: an introduction to voice assistants. Med Ref Serv Q 37(1):81–88. https://doi.org/10.1080/02763869.2018.1404391
Duffy BR (2003) Anthropomorphism and the social robot. Robot Auton Syst 42(3–4):177–190. https://doi.org/10.1016/S0921-8890(02)00374-3
Tu YC, Chien SE, Yeh SL (2020) Age-related differences in the uncanny valley effect. Gerontology 66(4):382–392. https://doi.org/10.1159/000507812
Walters ML, Syrdal DS, Dautenhahn K, Te Boekhorst R, Koay KL (2008) Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Auton Robot 24(2):159–178. https://doi.org/10.1007/s10514-007-9058-3
Broadbent E, Lee YI, Stafford RQ, Kuo IH, MacDonald BA (2011) Mental schemas of robots as more human-like are associated with higher blood pressure and negative emotions in a human–robot interaction. Int J Soc Robot 3(3):1–7. https://doi.org/10.1007/s12369-011-0096-9
Wu YH, Fassert C, Rigaud AS (2011) Designing robots for the elderly: appearance issues and beyond. Arch Gerontol Geriatr 54(1):121–126. https://doi.org/10.1016/j.archger.2011.02.003
Goetz J, Kiesler S, Powers A (2003) Matching robot appearance and behavior to tasks to improve human-robot cooperation. In: RO-MAN The 12th IEEE international workshop on robot and human interactive communication, Milbrae, CA, pp 55–60
Shin HH, Jeong M (2020) Guests’ perceptions of robot concierge and their adoption intentions. Int J Contemp Hosp Manag 32(8):2613–2633. https://doi.org/10.1108/IJCHM-09-2019-0798
Wang Z, Huang J, Fiammetta C (2021) Analysis of Gender Stereotypes for the Design of Service Robots: Case Study on the Chinese Catering Market. In Designing Interactive Systems Conference 2021, pp 1336–1344
Gerrig RJ (2014) Psychology and life. Pearson Education Limited, Essex, UK
Kelly S (2014) Tofu feminism: can feminist theory absorb evolutionary psychology? Dialect Anthropol 38(3):287–304. https://doi.org/10.1007/s10624-014-9353-2
Dufour F, Ehrwein Nihan C (2016) Do robots need to be stereotyped? Technical characteristics as a moderator of gender stereotyping. Soc Sci 5(3):27. https://doi.org/10.3390/socsci5030027
Carpenter J, Davis JM, Erwin-Stewart N, Lee TR, Bransford JD, Vye N (2009) Gender representation and humanoid robots designed for domestic use. Int J Soc Robot 1(3):261–265. https://doi.org/10.1007/s12369-009-0016-4
Jung EH, Waddell TF, Sundar SS (2016) Feminizing robots: User responses to gender cues on robot body and screen. In: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp 3107–3113. https://doi.org/10.1145/2851581.2892428
Schiebinger L, Klinge I, Paik HY, Sánchez de Madariaga I, Schraudner M, Stefanick M (eds) (2019) Gendering Social Robots: Analyzing Gender. In: Gendered innovations in science, health & medicine, engineering, and environment. http://genderedinnovations.stanford.edu/case-studies/genderingsocialrobots.html#tabs-2. Accessed 5 Nov 2021
Suppe F (1977) Afterword—1977. In Suppe F (ed) The Structure of Scientific Theories, 2d ed. University of Illinois Press, Urbana, IL, pp 614–730
Tussyadiah IP, Zach FJ, Wang J (2020) Do travelers trust intelligent service robots? Ann Tour Res 81:102886. https://doi.org/10.1016/j.annals.2020.102886
Suen LW, Lunn MR, Katuzny K, Finn S, Duncan L, Sevelius J, … Obedin-Maliver J (2020) What sexual and gender minority people want researchers to know about sexual orientation and gender identity questions: a qualitative study. Arch Sex Behav 49(7):2301–2318. https://doi.org/10.1007/s10508-020-01810-y
Rawsthorn A (2015) Fluid gender identity drives a revolution in design. The New York Times, May 5, 2015. Available at: https://www.nytimes.com/2015/05/06/arts/international/fluid-gender-identity-drives-a-revolution-in-design.html. Accessed 5 Dec 2021
Moss G (2009) Gender, design, and marketing. Gower, Surrey. https://doi.org/10.4324/9781315254593
Aspara J, Van Den Bergh B (2014) Naturally designed for masculinity vs. femininity? Prenatal testosterone predicts male consumers’ choices of gender-imaged products. Int J Res Mark 31(1):117–121. https://doi.org/10.1016/j.ijresmar.2013.09.001
Darley WK, Smith RE (1995) Gender differences in information processing strategies: an empirical test of the selectivity model in advertising response. J Advert 24(1):41–56. https://doi.org/10.1080/00913367.1995.10673467
Xue L, Yen CC (2007) Towards female preferences in design – a pilot study. Int J Des 1(3):11–27
Lee HJ, Fairhurst A, Cho HJ (2013) Gender differences in consumer evaluations of service quality: self-service kiosks in retail. Serv Ind J 33(2):248–265. https://doi.org/10.1080/02642069.2011.614346
Schermerhorn P, Scheutz M, Crowell C (2008) Robot social presence and gender: Do females view robots differently than males? In: The Proceedings of the 3rd ACM/IEEE Conference on Human-Robot Interaction, pp 263–270
Wang Y, Young JE (2014) Beyond pink and blue: Gendered attitudes towards robots in society. In: Proceedings of Gender and IT Appropriation Science and Practice on Dialogue-Forum for Interdisciplinary Exchange May 2014, European Society for Socially Embedded Technologies, pp. 49–59
Siegel M, Breazeal C, Norton MI (2009) Persuasive robotics: The influence of robot gender on human behaviour. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, October 2009, IEEE, pp. 2563–2568. https://doi.org/10.1109/IROS.2009.5354116
Bryant DA, Borenstein J, Howard A (2020) Why Should We Gender? The Effect of Robot Gendering and Occupational Stereotypes on Human Trust and Perceived Competency. In: Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp 13–21. https://doi.org/10.1145/3319502.3374778
Zhong L, Zhang X, Rong J, Chan HK, Xiao J, Kong H (2020) Construction and empirical research on acceptance model of service robots applied in hotel industry. Ind Manage Data Syst 121(6):1325–1352. https://doi.org/10.1108/IMDS-11-2019-0603
Heerink M, Kröse B, Evers V, Wielinga B (2010) Assessing Acceptance of Assistive Social Agent Technology by older adults: the Almere Model. Int J Soc Robot 2:361–375. https://doi.org/10.1007/s12369-010-0068-5
Venkatesh V, Morris MG, Davis GB, Davis FD (2003) User acceptance of information technology: toward a unified view. MIS Q 27:425–478. https://doi.org/10.2307/30036540
Fridin M, Belokopytov M (2014) Acceptance of socially assistive humanoid robot by preschool and elementary school teachers. Comput Hum Behav 33:23–31. https://doi.org/10.1016/j.chb.2013.12.016
Liu L, Miguel-Cruz A, Rios-Rincon A, Buttar V, Ranson Q, Goertzen D (2015) What factors determine therapists’ acceptance of new technologies for rehabilitation – a study using the Unified Theory of Acceptance and Use of Technology (UTAUT). Disabil Rehabil 37(5):447–455
De Graaf MM, Allouch SB, van Dijk JAGM (2019) Why would I use this in my home? A model of domestic Social Robot Acceptance. Hum-Comput Interact 34(2):115–173. https://doi.org/10.1080/07370024.2017.1312406
Davis FD (1989) Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 13:319–340. https://doi.org/10.2307/249008
Eyssel F, Kuchenbrandt D, Bobinger S, de Ruiter L, Hegel F (2012) ‘If you sound like me, you must be more human’: on the interplay of robot and user features on human-robot acceptance and anthropomorphism. In: Proceedings of the 7th ACM/IEEE Conference on Human-Robot Interaction, pp 125–126
Perugia G, Rossi A, Rossi S (2021) Gender Revealed: Evaluating the Genderedness of Furhat’s Predefined Faces. In: International Conference on Social Robotics, November 2021. Springer, Cham, pp 36–47. https://doi.org/10.1007/978-3-030-90525-5_4
Meuter ML, Bitner MJ, Ostrom AL, Brown SW (2005) Choosing among alternative service delivery modes: an investigation of customer trial of self-service technologies. J Mark 69(2):61–83. https://doi.org/10.1509/jmkg.69.2.61.60759
Solichin A, Jayaun J, Purabaya R (2019) Mobile-based In-Room Check-in System for Optimizing Check-in Process at The Sultan Hotel & Residence Jakarta. In 2019 International Conference on Informatics, Multimedia, Cyber and Information System (ICIMCIS), IEEE, pp 255–258
Kim S, Chen RP, Zhang K (2016) Anthropomorphized helpers undermine autonomy and enjoyment in computer games. J Consum Res 43(2):282–302. https://doi.org/10.1093/JCR/UCW016
Fox J, Ahn SJ, Janssen JH, Yeykelis L, Segovia KY, Bailenson JN (2015) Avatars versus agents: a meta-analysis quantifying the effect of agency on social influence. Hum Comput Interact 30(5):401–432. https://doi.org/10.1080/07370024.2014.921494
Tuomi A, Tussyadiah IP, Hanna P (2021) Spicing up hospitality service encounters: the case of Pepper™. Int J Contemp Hosp Manag 33(11):3906–3925. https://doi.org/10.1108/IJCHM-07-2020-0739
Palau-Saumell R, Forgas-Coll S, Sánchez-García J, Robres E (2019) User Acceptance of Mobile apps for restaurants: an expanded and extended UTAUT-2. Sustainability 11(4):1210. https://doi.org/10.3390/su11041210
Bentler P (2006) EQS Structural Equations Program Manual. Multivariate Software Inc, Encino, California
Hayes AF (2018) Introduction to mediation, moderation, and conditional process analysis: A regression-based approach (Second Edition). The Guilford Press, New York
Hair JF, Black WC, Babin BJ, Anderson RE (2010) Multivariate Data Analysis. Pearson Prentice Hall, New Jersey
Fornell C, Larcker DF (1981) Structural equation models with unobservable variables and measurement error: Algebra and statistics. J Mark Res 18(3):382–388. https://doi.org/10.1177/002224378101800313
Murphy J, Hofacker C, Gretzel U (2017) Dawning of the age of robots in hospitality and tourism: Challenges for teaching and research. Eur J Tour Res 15:104–111
Van Doorn J, Mende M, Noble SM, Hulland J, Ostrom AL, Grewal D, Petersen JA (2017) Domo arigato Mr. Roboto: emergence of automated social presence in organizational frontlines and customers’ service experiences. J Serv Res 20(1):43–58. https://doi.org/10.1177/1094670516679272
Bem SL (1974) The measurement of psychological androgyny. J Consult Clin Psychol 42(2):155
Choi N, Fuqua DR, Newman JL (2009) Exploratory and confirmatory studies of the structure of the Bem Sex Role Inventory short form with two divergent samples. Educ Psychol Meas 69(4):696–705. https://doi.org/10.1177/0013164409332218
Alesich S, Rigby M (2017) Gendered robots: implications for our humanoid future. IEEE Technol Soc Mag 36(2):50–59
Wallach W (2015) A dangerous master: how to keep technology from slipping beyond our control. Basic Books, Perseus Books Group, NY
Halberstam J (1991) Automating gender: Postmodern Feminism in the age of the Intelligent machine. Fem Stud 17(3):439–460
Winkle K, Melsión GI, McMillan D, Leite I (2021) Boosting Robot Credibility and Challenging Gender Norms in Responding to Abusive Behaviour. In: Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction. https://doi.org/10.1145/3434074.3446910
Rea DJ, Wang Y, Young JE (2015) Check your stereotypes at the door: an analysis of gender typecasts in social human-robot interaction. In: International Conference on Social Robotics, Springer, Cham, pp 554–563
Lemon KN, Verhoef PC (2016) Understanding customer experience throughout the customer journey. J Mark 80(6):69–96. https://doi.org/10.1509/jm.15.0420
De Kervenoael R, Hasan R, Schwob A, Goh E (2020) Leveraging human-robot interaction in hospitality services: incorporating the role of perceived value, empathy, and information sharing into visitors’ intentions to use social robots. Tourism Manage 78:104042. https://doi.org/10.1016/j.tourman.2019.104042
Xue L, Yen CC (2007) Towards female preferences in design – a pilot study. Int J Des 1(3):11–27
Andriella A, Torras C, Alenyà G (2020) Cognitive system framework for brain-training exercise based on human–robot interaction. Cogn Comput 12:793–810. https://doi.org/10.1007/s12559-019-09696-2
Chiu YTH, Hofer KM (2015) Service innovation and usage intention: a cross-market analysis. J Serv Manage 26(3):516–538. https://doi.org/10.1108/JOSM-10-2014-0274
Andriella A, Siqueira H, Fu D et al (2021) Do I have a personality? Endowing care robots with context-dependent personality traits. Int J Soc Robot 13:2081–2102. https://doi.org/10.1007/s12369-020-00690-5
Recchiuto CT, Sgorbissa A (2020) A feasibility study of Culture-Aware Cloud Services for Conversational Robots. IEEE Rob Autom Lett 5(4):6559–6566. https://doi.org/10.1109/LRA.2020.3015461
Dziergwa M, Kaczmarek M, Kaczmarek P, Kędzierski J, Wadas-Szydłowska K (2018) Long-term cohabitation with a social robot: a case study of the influence of human attachment patterns. Int J Soc Robot 10(1):163–176. https://doi.org/10.1007/s12369-017-0439-2
Funding
This work was partially funded by the European Union’s Horizon 2020 programme under ERC Advanced Grant CLOTHILDE (no. 741930) and under the Marie Skłodowska-Curie grant agreement (no. 712949) (TECNIOspring PLUS); by MCIN/AEI/https://doi.org/10.13039/501100011033 and by the “European Union NextGenerationEU/PRTR” under the project ROB-IN (PLEC2021-007859); by the Research Council of Norway under the project SECUROPS (INT-NO/0875); and by the “European Union NextGenerationEU/PRTR” through CSIC’s Thematic Platforms (PTI + Neuro-Aging).
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Ethical Standard
All participants were healthy adults who were informed that if they so wished, they could withdraw from the experiment at any time. The study was approved by the Ethical Committee of the Spanish National Research Council (reference code 056/2019).
Informed Consent
Informed consent was obtained from all individual participants included in the study.
Keywords
- Gendered robots
- Technological acceptance
- Social robots
- Human sex
- Gender
- Services