This contribution to the journal Gruppe. Interaktion. Organisation. (GIO) presents a study on the social perception of Embodied Digital Technologies (EDTs) and provides initial insights into social perception processes concerning the technicality and anthropomorphism of robots and of prosthesis users. EDTs such as bionic technologies and robots are becoming increasingly common in workplaces and private lives, raising questions about their perception and acceptance. According to the Stereotype Content Model (SCM), social perception and stereotyping are based on two fundamental dimensions: Warmth (recently differentiated into Morality and Sociability) and Competence. We investigate how human actors, namely able-bodied individuals, users of low-tech prostheses, and users of bionic prostheses, as well as artificial actors, such as industrial robots, social robots, and android robots, are perceived in terms of Competence, Sociability, and Morality. Results show that users of low-tech prostheses were perceived as being as competent as users of bionic prostheses, but only users of low-tech prostheses were perceived as less competent than able-bodied individuals. Sociability did not differ between users of low-tech prostheses, users of bionic prostheses, and able-bodied individuals. Perceived Morality was higher for users of low-tech prostheses than for users of bionic prostheses or able-bodied individuals. For robots, attributions of Competence showed that industrial robots were perceived as more competent than more anthropomorphized robots. Sociability was attributed to robots to a lesser extent. Morality was not attributed to robots, regardless of their level of anthropomorphism.
This contribution to the journal Gruppe. Interaktion. Organisation. (GIO) presents a study on the social perception of Embodied Digital Technologies (EDTs) and offers first insights into social perception processes regarding the technicity and anthropomorphism of robots and prosthesis users. EDTs such as bionic prostheses and robots are increasingly entering work and everyday life, so questions about their perception and acceptance need to be examined. Following the Stereotype Content Model (SCM), social perception and stereotyping are based on two fundamental dimensions: Warmth (recently specified as Morality and Sociability) and Competence. We examined how human actors, i.e., able-bodied individuals, users of low-tech prostheses, and users of bionic prostheses, as well as artificial actors, i.e., industrial robots, social robots, and androids, are perceived in terms of Competence, Sociability, and Morality. The results showed that people with low-tech prostheses were perceived as being as competent as users of bionic prostheses, but only users of low-tech prostheses were perceived as less competent than able-bodied individuals. Sociability did not differ between users of low-tech prostheses, users of bionic prostheses, and able-bodied individuals. Perceived Morality was higher for users of low-tech prostheses than for users of bionic prostheses or able-bodied individuals. For robots, attributions of Competence showed that industrial robots were perceived as more competent than more anthropomorphized robots. Sociability was attributed to robots to a lesser extent. Morality was not attributed to robots, regardless of their degree of anthropomorphization.
1 Introduction
Modern societies are characterized by technological change and the increasing use of Embodied Digital Technologies (EDTs), such as virtual reality devices, (social) robots, and bionic devices that reestablish or augment the capabilities of their users. A change towards Hybrid Societies, composed of human protagonists as well as users of EDTs, is ongoing. EDTs include bionic prostheses and robots, whose common ground is a high Grade of Technicity and their physical presence. Social perception influences social interactions (Cuddy et al. 2008), which in turn shape how societies are constructed and maintained. Given the increasing digitalization of society, aspects and consequences of social perception will no longer be limited to humans, but could soon include artificial agents.
Mirroring this trend, the number of robots in work settings has increased dramatically over recent years (Statistisches Bundesamt 2021). While some welcome this development, low-skill workers have shown rather negative reactions to robots in workspaces since the first implementations in the 1980s (Chao and Kozlowski 1986). These perceptions have partly changed over the last decades, although criticism, particularly in service sectors, mostly concerns decreased human contact and the unnecessary deployment of new technologies (Savela et al. 2021). With regard to augmenting bionic devices such as exoskeletons, Gilotta et al. (2019) pointed out that social aspects might negatively influence acceptability: wearing exoskeletons can lead to stigmatization in the workplace due to the users' perceived dependency on the supportive device.
Consequently, one must ask what can be done to mitigate these adverse effects and ensure the greatest possible acceptance while leveraging the inherent potential of these technologies. It is therefore essential to understand user perceptions, corresponding attributions, and evoked feelings, especially when it comes to the acceptance of users of bionic devices and of social robots on a broader societal level. Notwithstanding, we must consider that, at present, the majority of people have no direct contact with such devices, and their attitudes towards them might be formed on the basis of fiction rather than fact (Sarda Gou et al. 2021). Accordingly, we aim at a deeper understanding of how the basic principles of social perception influence the formation of attitudes towards social robots and bionics users, and whether or how social perception is in turn influenced by different Grades of Technicity.
Typically used as a theoretical basis for research on social perception, the Stereotype Content Model (SCM; Fiske et al. 2002) proposes two fundamental dimensions, Warmth and Competence, along which different groups are perceived by others. Due to its two-dimensionality, the SCM has recently been subject to critique (Abele et al. 2021). A third factor, Morality, usually subsumed under the Warmth dimension (Fiske 2018), has been proposed, either by dividing Warmth into the two dimensions of Sociability and Morality (Kervyn et al. 2015) or by introducing Morality as a third factor into the original SCM (Heflick et al. 2011; Leach et al. 2007). Especially with regard to the aforementioned Hybrid Societies, the inclusion of Morality needs to be researched more thoroughly (Mandl et al. 2022a), since its importance for ethical and prosocial behavior is indisputable (Hannah et al. 2011).
Research has shown that people with physical disabilities are perceived differently, that is, as warmer and less competent, than able-bodied individuals (e.g., Cuddy et al. 2007; Fiske et al. 2002; Meyer and Asbrock 2018). Bionic prostheses, which are highly technical and therefore appear more artificial than their purely cosmetic counterparts, diverge from this pattern: people with disabilities who use bionic prostheses reclaim perceived Competence (Mandl et al. 2022a; Meyer and Asbrock 2018) and retain the perceived Warmth usually associated with physical disabilities (Meyer and Asbrock 2018). When Sociability and Morality were assessed as separate social dimensions instead of Warmth (Kervyn et al. 2015), attributions of Sociability did not differ with the Grade of Technicity, but attributions of Morality did (Mandl et al. 2022a). With regard to robots of various Grades of Technicity, prior research has shown that attributions of Competence were higher for robots with a clear scope of application, such as industrial robots, than for robots with more diverse application scenarios. Attributions of Sociability and Morality were rather ambiguous and remain inconsistent (Mandl et al. 2022a).
To contribute to the existing research and shed light onto the question of how these new actors are perceived, we identified two major research questions:
(1) How are users of prostheses and (2) different robots perceived in terms of the three major dimensions of social perception: Competence, Sociability, and Morality?
2 Social perception
Social perception and social evaluation are central aspects of social interactions. Gathering fast and distinctive information about others is of the highest importance for interacting with them. In early work on social perception, Asch (1946) showed that certain attributes (Warmth) were more important for impression formation than others. Various approaches to capture and describe the most important dimensions of social perception followed, mostly describing two (Abele and Wojciszke 2007; Fiske et al. 2002) or three (Koch et al. 2016; Osgood et al. 1957; see Abele et al. 2021 for an extensive review and theoretical integration). For the present study, we employ a well-developed model of the core dimensions of social perception, the Stereotype Content Model (SCM; Fiske et al. 2002), which focuses on the social perception of others in terms of stereotype dimensions. It postulates that all group stereotypes and interpersonal impressions are formed on two fundamental dimensions: Warmth (warm to cold), which depicts an individual's perceived intentions (good to bad), and Competence, depicting the self-explanatory range between competence and incompetence, resulting in four possible clusters. Based on these clusters, the SCM and its extension, the Behaviors from Intergroup Affect and Stereotypes (BIAS) map (Cuddy et al. 2007), predict specific emotional and behavioral reactions, such as compassion and sympathy for individuals perceived as low in Competence and high in Warmth, or envy and jealousy for individuals perceived as highly competent and low in Warmth. As shown by Mieczkowski et al. (2019), recent results indicate that participants' impressions of robots' Warmth and Competence led to emotional reactions similar to those towards humans. Consequently, these results suggest the applicability of the SCM and BIAS map to robots as well as to different contexts and cultures.
2.1 Social perception of people wearing prostheses
Previous research has shown that people with physical disabilities are perceived as low in Competence and high in Warmth, evoking pity and active facilitation as behavioral correlates (Asbrock 2010; Cuddy et al. 2007; Fiske 2018; Fiske et al. 2002; Meyer and Asbrock 2018). Recently, Meyer and Asbrock (2018) investigated the effects of high-tech, or bionic, prostheses on social perception, in contrast to prostheses with a lower Grade of Technicity. Bionic prostheses reestablished perceived Competence to a degree comparable with able-bodied people, whereas users of low-tech prostheses were perceived as less competent. Interestingly, perceived Warmth only dropped if bionic prostheses were used not to restore functionality but to enhance the functionality of able-bodied people. If these people were labeled as cyborgs, beings composed of technical and organic parts, perceived Warmth dropped even lower and they were regarded as rather frightening (Meyer and Asbrock 2018). Further research has shown that the use of high- versus low-tech prostheses causes differences in attributed Competence and Morality, but not in attributed Sociability (Mandl et al. 2022a).
2.2 Social perception of robots
Research on social perception has focused on human actors, an emphasis that needs to be reevaluated given that robots are being widely deployed in workspaces and private spaces (Carpinella et al. 2017; Scheunemann et al. 2020; Turja and Oksanen 2019). Robots in workspaces can be seen either as machines, mere aids without social implications (Demir et al. 2019; Savela et al. 2018), or as a type of co-worker, who in turn acts as a social partner (Demir et al. 2019; Sauppé and Mutlu 2015). Especially with the impending widespread use of robots in service sectors, social perception plays a crucial role in whether robots will be accepted (Savela et al. 2018; Wiese et al. 2021). Recently, the term moral machines has gained traction in Human-Robot Interaction research, that is, the approach of implementing the ability of moral decision making into Artificial Intelligence (AI) and studying its acceptance (Awad et al. 2018; Cervantes et al. 2020; Moor 2006). As a precondition for AI making moral decisions, the notion of ascribing Morality to these agents needs to be assessed. Therefore, not only ascriptions of Warmth and Competence as social dimensions, but also of Morality need to be evaluated. Research has shown that robots can be ascribed traits similar to the traditional dimension of Competence (Carpinella et al. 2017; Mandl et al. 2022a), but ascriptions of more inherently human capabilities such as Warmth (Carpinella et al. 2017) or Sociability and Morality (Mandl et al. 2022a) showed mixed results. Nevertheless, people working with robots tend to ascribe human attributes to them (Sauppé and Mutlu 2015). Apart from social perception, design aspects influence how robots are perceived (von der Pütten and Krämer 2012). While a humanlike, anthropomorphic appearance contributes to perceived trust (de Visser et al. 2016), androids that cannot at first glance be discerned from actual human beings might elicit fear or even disgust (MacDorman 2019; Mori et al. 2012).
This effect, the Uncanny Valley phenomenon (Mori et al. 2012), is discussed controversially (Bartneck et al. 2007; Rosenthal-von der Pütten et al. 2014; Wang et al. 2015).
3 Aims and research questions
In sum, this study examines the social perception of Embodied Digital Technologies. To gain more profound insights into this topic and to extend prior research, we focus on two specific application areas: restoring or augmenting human capabilities by using bionic prostheses, and the use of (social) robotics.
While the perception of people with disabilities has been widely researched, the evidence on the effects of differently technologized prostheses is still inconclusive: prior research has shown mixed results in terms of the perceived Competence of users of low- and high-tech prostheses. In this study, we aim to replicate and extend the findings of differing social perception of users of prostheses of different Grades of Technicity (Mandl et al. 2022a; Meyer and Asbrock 2018). We therefore hypothesize that:
H1: People with physical disabilities who use low-tech prostheses are perceived as less competent than people with physical disabilities who use bionic prostheses.
To account for the increase of robots in society, we extend the focus of social perception from human actors to robots. Hannah et al. (2011) highlighted Morality and its effect on ethical and prosocial behavior, cornerstones of functioning societies. This underlines the importance of focusing not only on the SCM dimension of Warmth, but on Sociability and Morality separately, as proposed by Kervyn et al. (2015). Previous research on social capacities, that is, Competence, Sociability, and Morality, in robots was inconclusive insofar as only a small percentage of people were willing to ascribe both Sociability and Morality to robots, regardless of their morphology. Specialized robots such as industrial robots were attributed more Competence than social robots or android robots (Mandl et al. 2022a). Hence, we pose the following research questions: how are people with physical disabilities and different types of prostheses, able-bodied individuals, and robots of varying levels of anthropomorphism perceived in terms of (RQ1) Competence, (RQ2) Sociability, and (RQ3) Morality?
4 Method
Prior to data collection, the study was preregistered on OSF (https://osf.io/6st2b). The procedure was evaluated by the local Ethics Committee, which deemed it exempt from further ethical approval and uncritical concerning ethical aspects according to its criteria: a sample of healthy adults, voluntary participation, noninvasive measures, no deception, and appropriate physical and mental demands on the participants. We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study (Simmons et al. 2012).
We conducted a-priori power analyses with G*Power (Faul et al. 2007) for a repeated-measures ANOVA with a between-factors design and ten measurements. A medium effect size of 0.25 and a correlation of 0.5 among measurements were assumed, and power was set to 0.95. This resulted in a total sample size of 310 participants. The sample was acquired via Prolific Academic (www.prolific.co), an online survey platform (Palan and Schitter 2018), in two batches to rule out technical issues. Analyses showed no significant differences between the two samples (N1 = 122 and N2 = 216). For all further analyses, both samples were pooled, resulting in a total sample size of N = 338. The sample included 186 men, 147 women, and five participants who identified as non-binary. The average age was 30.99 years (SD = 10.37). Most participants were residing in Germany at the time they took part in the study (76.0%; other countries of residence: Austria: 9.2%, Switzerland: 1.2%, other countries: 13.6%). The sample was highly educated, with 32.8% having obtained a school leaving certificate (German Abitur, Austrian/Swiss Matura) and 50.6% having obtained a university degree; 0.6% stated that they had not obtained any degree, and 16.0% that they had finished some other education. 8.0% stated that they, or someone in their inner social circle, wear some kind of prosthesis. All participants gave informed consent.
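This kind of sample-size computation can be approximated in code. The following is only a sketch of the G*Power-style noncentral-F approach for a between-subjects effect with correlated repeated measurements; the number of groups `k` is a hypothetical input, and the exact design inputs from the preregistration would be needed to reproduce the reported N = 310.

```python
from scipy import stats

def rm_anova_between_n(f=0.25, m=10, rho=0.5, k=2, alpha=0.05, target=0.95):
    """Smallest total N reaching the target power for a between-subjects
    effect in a repeated-measures design (G*Power-style approximation).
    Averaging m measurements with pairwise correlation rho sharpens the
    between-effect by a factor of m / (1 + (m - 1) * rho)."""
    mult = m / (1 + (m - 1) * rho)
    n = k + 2
    while True:
        lam = f ** 2 * n * mult                     # noncentrality parameter
        df1, df2 = k - 1, n - k
        crit = stats.f.ppf(1 - alpha, df1, df2)     # critical F under H0
        power = stats.ncf.sf(crit, df1, df2, lam)   # P(F > crit | lam)
        if power >= target:
            return n
        n += 1
```

As the formula suggests, a higher correlation among measurements reduces the effective gain from repeated measures and therefore increases the required sample size.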
The stimulus material consisted of 11 pictures: human beings with and without low- and high-tech prostheses, and robots. These pictures correspond to different Grades of Technicity (GoT), ranging from none (able-bodied individual) to very high (android robot). The material can be found on OSF (https://osf.io/qz2ca/). To account for different types of disabilities and prostheses, three types of disabilities (one arm, one leg, both legs) were shown; for each disability, one low- and one high-tech prosthesis were presented. Two able-bodied individuals, one female and one male, were shown. The pictures were chosen according to the following criteria: neutral to slightly positive facial expression, neutral clothing, and neutral background. People with prostheses were exclusively male to control for the influence of female stereotypes.
Furthermore, we presented three pictures of different robots to account for different levels of anthropomorphism: on the lowest level, an industrial robot, which does not possess any human-like qualities; on the second level, a social robot ("Pepper", SoftBank Robotics Europe), which resembles a human being in terms of body shape and possesses a face with eyes and a mouth. The highest level of anthropomorphism is represented by an android robot, a still taken from the movie ROBOLOVE (Arlamovsky 2019), which is almost indiscernible from a human being. Except for the industrial robot, which was presented in a typical industrial setting, the robots were presented against neutral backgrounds. All pictures were presented in randomized order with the instruction to rate how the participants perceive the person/robot and how they think the person/robot would act/think/react, even though this first impression might later prove wrong and be revised. For each picture, the same twenty-five adjective pairs, forming the opposing ends of a semantic differential, were presented in random order and rated on a five-point Likert scale.
Pre-tests (n = 30) revealed that participants hesitated to ascribe more humanlike attributes to robots and stated explicitly that these attributions are not applicable to robots, leading to missing values on these dimensions. To account for these issues and to extract at least minimal information from missing values, we offered two additional response options for robots: “this does not apply to robots in general” and “this does not apply to this specific robot” (Chita-Tegmark et al. 2021). We present a detailed analysis of missing values in section 5.1.
To cover the three main dimensions of social perception, Competence, Sociability, and Morality, we composed items to be rated on a five-point Likert scale. We also assessed perceived Anthropomorphism to ensure that the manipulation of three levels of Anthropomorphism was perceived similarly by the participants.
Competence: We chose three items in line with previous studies (Fiske 2018; Fiske et al. 2002; Meyer and Asbrock 2018): competent, independent, and competitive. Reliability analyses showed unsatisfactory results when the items were averaged into a scale (McDonald’s ω = 0.45). We therefore chose to drop two items and use only the item “competent-incompetent” for further analyses (Fig. 1).
Sociability: Sociability was covered by three subscales: warmth (three items, e.g., warm-cold; Fiske 2018; Fiske et al. 2002; Meyer and Asbrock 2018), animacy (three items, e.g., lively-stagnant; Bartneck et al. 2009), and likeability (two items, e.g., pleasant-unpleasant; Bartneck et al. 2009). Technical issues accounted for the loss of one item of the animacy scale. Consequently, we dropped this item and proceeded with two items on the animacy scale, resulting in a total of seven items which we averaged into a scale (McDonald’s ω = 0.87, Fig. 1).
Morality: We adapted eight attributes that people high in Moral Identity possess (e.g., fair-unfair) from the German version of the Moral Identity Questionnaire (Aquino and Reed 2002), based on theoretical considerations, that is, intelligibility and relevance, and chose corresponding antonyms. We averaged these items into a scale (McDonald’s ω = 0.82, Fig. 1).
Anthropomorphism: We used five items of the Godspeed Questionnaire (Bartneck et al. 2009) to assess perceived anthropomorphism of the robotic stimuli (e.g., humanlike-machinelike). We averaged these items into a scale and analyzed the reliability for this sample (McDonald’s ω = 0.89, Fig. 2).
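McDonald's ω, as reported above, requires a fitted factor model. As a simpler, related illustration of how such item sets are screened before averaging, the following sketch computes Cronbach's alpha, a different and typically more conservative internal-consistency coefficient, from a respondents-by-items rating matrix.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an n_respondents x n_items rating matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the sum scores
    return k / (k - 1) * (1 - item_var / total_var)
```

Averaging items into a scale is only defensible when such a coefficient is acceptably high, which is why the one-item solution was chosen for Competence (ω = 0.45) while Sociability and Morality were kept as scales.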
After giving informed consent and filling in a sociodemographic questionnaire, participants were presented with the stimulus material. They rated each visual stimulus on 25 adjective pairs comprising the six (sub-)scales Competence, Warmth, Animacy, Likeability, Morality, and Anthropomorphism. Afterwards, participants completed the ATI scale (M = 4.23, SD = 1.01; Franke et al. 2019) and the NFC-K scale (M = 4.79, SD = 1.11; Beißert et al. 2014). These two personality questionnaires are, however, beyond the scope of this study and will be used in further analyses. Lastly, we asked whether the participants themselves or any of their acquaintances used prostheses. Upon finishing, participants were forwarded to Prolific Academic (www.prolific.co) to receive a compensation of € 3.20. The total processing time was approximately 20 min.
4.1 Statistical analysis
We employed a within-subject design, which is quite common in organizational and HRI research (e.g., Scheunemann et al. 2020): all participants rated all eleven pictures. These eleven repeated measurements of the dependent variables were thus nested within participants and are therefore highly likely to be non-independent. This was the case for Competence, ICC(1) = 0.19, F(377, 1491) = 2.26, p < 0.001, ICC(2) = 0.56, Sociability, ICC(1) = 0.29, F(337, 1010) = 2.64, p < 0.001, ICC(2) = 0.62, and Morality, ICC(1) = 0.51, F(337, 822) = 4.57, p < 0.001, ICC(2) = 0.78. From a statistical point of view, nested data can be thought of as repeated measures within subjects, and these repeated measures per subject lead to correlated errors. Such non-independent errors prohibit the use of standard procedures like ANOVA or linear regression models (Bliese et al. 2018; Seltman 2009). We thus employed Mixed Models, which give correct estimates in the presence of correlated errors, to account for the nested data and its effects.
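ICC(1) and ICC(2) values of this kind follow from a one-way ANOVA decomposition of the ratings nested in participants. A minimal sketch, assuming a complete participants × measurements rating matrix, could look like this:

```python
import numpy as np

def icc_oneway(ratings):
    """ICC(1) and ICC(2) from an n_participants x k_measurements matrix,
    via the one-way ANOVA mean squares (cf. Bliese's multilevel package)."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    means = ratings.mean(axis=1)
    ms_between = k * ((means - grand) ** 2).sum() / (n - 1)          # between participants
    ms_within = ((ratings - means[:, None]) ** 2).sum() / (n * (k - 1))  # within participants
    icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc2 = (ms_between - ms_within) / ms_between   # reliability of participant means
    return icc1, icc2
```

ICC(1) quantifies the proportion of variance attributable to participants, i.e., the degree of non-independence that motivates the Mixed Models used here.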
We used R (Version 4.1.1; R Core Team 2021) and the R packages dplyr (Version 1.0.7), tidyverse (Version 1.3.1), tidyr (Version 1.1.3), and forcats (Version 0.5.1) for data management; psych (Version 2.1.6), sjstats (Version 0.18.1), ggpubr (Version 0.4.0), sjPlot (Version 2.8.9), lm.beta (Version 1.5-1), apaTables (Version 2.0.8), and ggplot2 (Version 3.3.5) for descriptive analyses; and MuMIn (Version 1.43.17), effects (Version 4.2-0), emmeans, multilevel (Version 2.6), stats (Version 4.0.2), lme4 (Version 1.1-27.1), pbkrtest (Version 0.5.1), and lattice (Version 0.20-44) for fitting Mixed Models and subsequent post-hoc testing.
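The random-intercept structure of these models can be illustrated outside R as well. The following Python sketch with statsmodels fits the analogue of lme4's `competence ~ functionality + (1 | pid)` on simulated stand-in data; all variable names and values are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: ratings nested in participants, with a
# per-participant random intercept (names and values are illustrative).
rng = np.random.default_rng(42)
n_part, n_stim = 100, 5
pid = np.repeat(np.arange(n_part), n_stim)
functionality = np.tile(np.arange(n_stim, dtype=float), n_part)
person_effect = rng.normal(0.0, 0.5, n_part)[pid]
competence = (3.0 + 0.2 * functionality + person_effect
              + rng.normal(0.0, 0.3, n_part * n_stim))
df = pd.DataFrame({"pid": pid, "functionality": functionality,
                   "competence": competence})

# Random-intercept model: the Python analogue of
# lmer(competence ~ functionality + (1 | pid)) in R
result = smf.mixedlm("competence ~ functionality", df,
                     groups=df["pid"]).fit()
print(result.params["functionality"])  # estimated slope (true value: 0.2)
```

Because the participant-level intercepts absorb the correlated errors, the fixed-effect slope is estimated correctly despite the nesting.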
Visual inspection of the data revealed a non-linear relationship of Grade of Technicity with Competence, Sociability, and Morality (cf. Fig. 1).
To account for the apparent break between human and robotic stimuli, we decided to split the data for all three attributions into two subgroups, a “human” and a “robotic” one, by stimulus type; since every participant rated all stimuli, both subgroups were equal in size (N = 338). The structure of the data for human stimuli furthermore revealed that restored functionality, that is, regaining abilities comparable to those of able-bodied individuals, seemed to explain differences in attributions better than the technicity of the prostheses.
5.1 Missing values
We analyzed the two additional response categories for attributions of Competence, Sociability, and Morality for the robotic stimuli (“this does not apply to robots in general” and “this does not apply to this specific robot”) to further evaluate the pattern of answers (Fig. 3).
We found significant differences in the ascription of attributions to robots, χ² = 7.02, df = 2, p = 0.030, no differences in the willingness to attribute social dimensions to industrial, social, or android robots, χ² = 2.51, df = 2, p = 0.286, and significant differences in the ascription of attributions to robots in general, χ² = 7.09, df = 2, p = 0.029. Out of 338 participants, only 17 attributed adjectives concerned with Morality to the industrial robot, 48 to the social robot, and 81 to the android robot. Participants were similarly unwilling to attribute Sociability to robots: 21 attributed Sociability to the industrial robot, 79 to the social robot, and 176 to the android robot. For Competence, these differences were not as large: 258 participants attributed Competence to the industrial robot, 291 to the social robot, and 266 to the android robot.
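As an illustration of this kind of count comparison, a simple goodness-of-fit test can be run on the reported Morality counts. Note that this sketch is only illustrative and does not reproduce the specific χ² values above, which were computed on different contingency tables.

```python
from scipy import stats

# Participants (of 338) willing to attribute Morality, per robot type,
# taken from the counts reported in the text.
counts = [17, 48, 81]  # industrial, social, android

# H0: willingness to attribute Morality is equal across robot types
chi2, p = stats.chisquare(counts)
```

The heavily skewed counts yield a large χ² statistic and a very small p-value, mirroring the overall pattern that willingness to attribute Morality rises with anthropomorphism.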
We therefore need to qualify the interpretation of the attributions of Sociability and Morality: at present, many participants did not agree that these social dimensions can be attributed to robots at all.
5.2 Results: human stimuli
To test our hypothesis and research questions, we ran a Mixed Model regressing Competence on restored functionality. A model with a random intercept fit the data best (Table 1), indicating that restored functionality was positively associated with attributions of Competence. Post-hoc Tukey tests revealed that users of low-tech prostheses were not perceived as less competent than users of bionic prostheses, ∆M = 0.07, SE = 0.03, p = 0.085, d = 0.16, thus rejecting hypothesis H1. The analysis of RQ1 revealed that users of low-tech prostheses were perceived as less competent than able-bodied individuals, ∆M = 0.14, SE = 0.03, p < 0.001, d = 0.34. No significant differences in perceived Competence were found between users of bionic prostheses and able-bodied individuals, ∆M = 0.08, SE = 0.03, p = 0.055, d = 0.18. The control variables gender and age were associated with attributions of Competence: female participants attributed significantly more Competence than male participants, ∆M = 0.18, SE = 0.06, p = 0.011, d = 0.43, and age was negatively associated with attributions of Competence, b = −0.01, SE = 0.00, t = −2.45.
For research question RQ2, which we posed to evaluate whether there would be differences in perceived Sociability for users of low- or high-tech prostheses and able-bodied individuals, we ran a Mixed Model regressing Sociability on restored functionality. Results did not reveal a significant association between restored functionality and Sociability (Table 2). Again, only the control variable gender was associated with attributions of Sociability: female participants attributed significantly more Sociability than male participants, ∆M = 0.14, SE = 0.05, p = 0.026, d = 0.42.
For research question RQ3, concerned with whether there would be differences in perceived Morality for users of low- or high-tech prostheses and able-bodied individuals, we ran a Mixed Model regressing Morality on restored functionality. We found that restored functionality was negatively associated with Morality (Table 3). Post-hoc Tukey tests revealed that users of low-tech prostheses were attributed significantly more Morality than users of high-tech prostheses, ∆M = 0.12, SE = 0.02, p < 0.001, d = 0.47, and than able-bodied individuals, ∆M = 0.16, SE = 0.02, p < 0.001, d = 0.65. Attributions of Morality for users of high-tech prostheses and able-bodied individuals did not differ significantly but showed a tendency, ∆M = 0.05, SE = 0.02, p = 0.050, d = 0.18. None of the control variables were significantly associated with perceptions of Morality.
5.3 Results: robotic stimuli
Our research questions proposed that different levels of anthropomorphism could be associated with different attributions of Competence (RQ1), Sociability (RQ2), and Morality (RQ3). We therefore first needed to establish whether the manipulation was successful. A Mixed Model regressing attributed Anthropomorphism on Grade of Technicity showed that Anthropomorphism differed between robot types, b = 0.22, SE = 0.03, t = 7.21, p < 0.001. Surprisingly, the industrial robot was not perceived as less anthropomorphic than the social robot, ∆M = 0.03, SE = 0.06, p = 0.868, d = 0.06. The industrial robot was perceived as less anthropomorphic than the android robot, ∆M = −0.42, SE = 0.06, p < 0.001, d = 0.76, and the social robot was also perceived as less anthropomorphic than the android robot, ∆M = −0.45, SE = 0.06, p < 0.001, d = 0.81. The control variable gender was associated with perceived Anthropomorphism: female participants attributed less Anthropomorphism than male participants, ∆M = −0.18, SE = 0.07, t = −2.61, p = 0.026.
To answer research question RQ1, whether robots with varying levels of Anthropomorphism would be perceived differently in terms of attributed Competence, we ran a Mixed Model regressing Competence on Grade of Technicity. We found that Competence was negatively associated with Grade of Technicity (Table 4). Post-hoc Tukey tests revealed that the industrial robot was perceived as more competent than the social robot, ∆M = 0.69, SE = 0.08, p < 0.001, d = 0.72, and the android robot, ∆M = 1.02, SE = 0.08, p < 0.001, d = 1.07. The social robot was perceived as more competent than the android robot, ∆M = 0.33, SE = 0.08, p < 0.001, d = 0.35. The control variables gender and age were associated with perceived Competence: female participants attributed significantly more Competence than male participants, ∆M = 0.37, SE = 0.08, p < 0.001, and age was negatively associated with attributed Competence, b = −0.01, SE = 0.00, t = −3.30.
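The pairwise contrasts reported here were computed after the Mixed Models (via emmeans in R). As a simplified illustration of Tukey-corrected pairwise comparisons, the following Python sketch runs a classical Tukey HSD on simulated group ratings; it ignores the nesting in participants, and the group means are illustrative values that merely mirror the reported ordering (industrial > social > android).

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Simulated competence ratings for three robot types (values are
# illustrative, not the study's data).
rng = np.random.default_rng(7)
ratings = np.concatenate([rng.normal(3.9, 0.8, 120),   # industrial
                          rng.normal(3.2, 0.8, 120),   # social
                          rng.normal(2.6, 0.8, 120)])  # android
labels = ["industrial"] * 120 + ["social"] * 120 + ["android"] * 120

# All three pairwise comparisons with family-wise error control
res = pairwise_tukeyhsd(ratings, labels, alpha=0.05)
print(res.summary())  # pairwise mean differences with adjusted p-values
```

With group differences this large, all three pairwise comparisons come out significant, analogous to the Competence contrasts reported above.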
Research question RQ2 was concerned with whether there would be differences in perceived Sociability for different robots. We ran a Mixed Model regressing Sociability on Anthropomorphism and found that Anthropomorphism was not associated with perceived Sociability (Table 5). None of the control variables showed significant associations with perceived Sociability.
For research question RQ3, whether there would be differences in perceived Morality for different robots, we ran a Mixed Model regressing Morality on Grade of Technicity and found that perceived Morality was negatively associated with Grade of Technicity (Table 6). A post-hoc Tukey test revealed that more Morality was attributed to the social robot than to the android robot, ∆M = 0.28, SE = 0.07, p < 0.001, d = 0.85. Perceived Morality did not differ significantly between the social robot and the industrial robot, ∆M = −0.20, SE = 0.11, p = 0.178, d = 0.60, or between the industrial robot and the android robot, ∆M = 0.08, SE = 0.11, p = 0.739, d = 0.25.
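Each contrast above is reported with Cohen's d as a standardized effect size. For readers unfamiliar with the measure, a minimal sketch (hypothetical data, not the study's):

```python
# Illustrative sketch: Cohen's d for a pairwise mean difference --
# the mean difference divided by the pooled standard deviation.
import numpy as np

def cohens_d(x, y):
    """Cohen's d using the pooled (n-1 weighted) standard deviation."""
    nx, ny = len(x), len(y)
    pooled_var = (
        (nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1)
    ) / (nx + ny - 2)
    return (np.mean(x) - np.mean(y)) / np.sqrt(pooled_var)

# Two hypothetical rating samples whose means differ by 0.5
a = np.array([4.0, 3.5, 4.5, 4.2, 3.8])
b = np.array([3.5, 3.0, 4.0, 3.7, 3.3])
print(round(cohens_d(a, b), 2))  # → 1.31
```

By convention, d ≈ 0.2 is read as a small, 0.5 as a medium, and 0.8 as a large effect, which is why the near-zero d = 0.06 for the industrial-social Anthropomorphism contrast supports the null result reported above.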
The present study aimed to answer two major research questions, replicating findings by Meyer and Asbrock (2018) and Mandl et al. (2022a): How are (1) users of prostheses and (2) different robots perceived in terms of the three major dimensions of social perception, Competence, Sociability, and Morality?
As pointed out in section 5, data inspection revealed a non-linear relationship between the Grade of Technicity (GOT) and the assessed attributions. We therefore split the dataset by stimulus type into a “human” subset and a “robot” subset. Moreover, although the construct Grade of Technicity can in principle be applied to both samples, content considerations make a more suitable category necessary for the human subsample: Grade of Technicity seems inappropriate for users of bionic prostheses, especially in comparison to the stimuli of non-disabled people. What appears most salient in the evaluation is that a human is a human; even with an accentuation towards a cyborg, the primary humanoid attribution remains. Consequently, we introduced the construct Restored Functionality (RF) to describe the human sample.
Accordingly, the following discussion is organized along these two subsamples and their corresponding constructs.
6.1 Social perception of human beings with prostheses
We hypothesized that people with bionic prostheses would be perceived as more competent than people with low-tech prostheses. The present data did not support this hypothesis: users of bionic prostheses were perceived as being as competent as people with low-tech prostheses, a result further supported by a very small effect size. Instead, we found that people with low-tech prostheses were perceived as less competent than able-bodied people, which was not true for users of bionic prostheses: they were perceived as being as competent as able-bodied individuals. These findings are only partly in line with previous research: users of bionic prostheses were found to be perceived as more competent than people with physical disabilities in general (Meyer and Asbrock 2018) and than users of low-tech prostheses (Mandl et al. 2022a), but as less (Meyer and Asbrock 2018) or equally (Mandl et al. 2022a) competent compared with able-bodied individuals. Users of low-tech prostheses apparently fall between these two categories, as they are perceived as being as competent as users of bionic prostheses, but as less competent than able-bodied individuals. We interpret this to mean that prostheses, regardless of their technicity, account for increased perceived Competence. As for users of bionic prostheses, our findings deviated in that they were perceived as being as competent as individuals without physical disabilities. This indicates that the technicity of a prosthesis might contribute to perceived Competence, but only marginally. Further research will have to show whether these differences in perceived Competence persist. People with disabilities are, in general, perceived as warmer than able-bodied people (Fiske 2018; Fiske et al. 2002; Meyer and Asbrock 2018). By dividing the dimension of Warmth into Sociability and Morality (Kervyn et al. 2015; Leach et al. 2007), we aimed at replicating prior findings.
In line with Mandl et al. (2022a), people with disabilities, regardless of the type of prostheses, were perceived as being as sociable as able-bodied people. We found differences in perceived Morality between people with low-tech prostheses and both users of bionic prostheses and able-bodied individuals, which is mostly in line with prior findings: more Morality was attributed to people with low-tech prostheses than to able-bodied individuals, but no difference between users of bionic prostheses and able-bodied individuals was found (Mandl et al. 2022a). We suspect this is because the common stereotype of people with physical disabilities includes attributions of being tolerant and sincere (Fiske et al. 2002) within the dimension of Warmth, which correspond to Morality rather than Sociability. Our findings indicate that using such devices affects how others perceive their users and might induce specific behaviors towards them. For example, using bionic prostheses positively affects the acceptance of disabled people at work, conveyed by the ascribed Competence (Vornholt et al. 2013), similar to able-bodied individuals. We expect that augmenting devices will also trigger these processes of social perception, with both positive and negative outcomes, as partially shown by Gilotta et al. (2019) and Siedl and Mara (2021). Consequently, in upcoming studies, we will explicitly address devices for augmenting user capabilities and also focus on more work-related applications such as exoskeletons.
6.2 Social perception of robots
We applied an adapted version of the SCM, which includes the social dimensions Competence, Sociability, and Morality, to robots of different levels of technicity. We assumed that higher levels of technicity are linked to higher levels of Anthropomorphism.
Less anthropomorphized robots were perceived as more competent than more anthropomorphized robots, in line with previous findings (Mandl et al. 2022a). We assume this might be caused by the clear application area of industrial robots, whereas the intended use of social robots and androids might not be as obvious.
Surprisingly, and in contrast to Mandl et al. (2022a), where industrial robots were perceived as less sociable than more anthropomorphized robots, attributions of Sociability did not differ between robots of different anthropomorphic levels; the robots were seen as comparably sociable. More Morality was attributed to the social robot than to the android robot, with the industrial robot falling in between. This contradicts the earlier finding that Morality did not differ as a function of anthropomorphism (Mandl et al. 2022a). We need to point out that, on further investigation, fewer participants ascribed inherently human attributes such as Sociability and Morality to robots at all, which might explain the divergence from previous studies. These results should therefore be seen as preliminary and require further research. We assume this can be partly explained by the fact that the majority of people are not in direct contact with robots. There is evidence that attitudes towards robots are currently based on fiction rather than objective reality (Naneva et al. 2020). Moreover, based on intergroup relations research, Sarda Gou et al. (2021) showed that direct contact with robots positively affected participants’ attitudes towards robots. Overall, we found rather reserved attributions to robots. Further studies should address whether differences in perception persist when people work with robots, and whether perceptions formed in work settings can be transferred to social settings. We assume that attitudes towards and emotions evoked by robots will become more realistic and objective in the long run, so longitudinal studies should be conducted to assess and monitor these changes.
We evaluated the social perception of robots and of individuals with and without physical disabilities, using low- and high-tech prostheses, by presenting pictures. These static stimuli do not capture the effects of motion, which is thought to influence how robots are perceived (Kupferberg et al. 2011). The present research therefore has to be seen as a first step towards a better understanding of the social perception of robots. Evaluations of pictures of individuals are also highly influenced by personal taste; since we decided to present pictures of actual human beings, these naturally differed in physical appearance, which might have influenced their perception. Furthermore, we used scales from the Godspeed Inventory (Bartneck et al. 2009) and the SCM (Fiske 2018; Meyer and Asbrock 2018), since we investigated perceptions not only of robots but also of human beings. This decision comes with certain limitations: due to reliability issues, we decided to use a single-item Competence measure. Additionally, technical issues led to the loss of one item of the Animacy scale for the social robot in the first sample. Finally, only a small number of participants were willing to attribute Sociability and Morality to robots. We will address these limitations in upcoming experimental and field studies.
We were able to show that people with low- and high-tech prostheses and able-bodied individuals are perceived differently in terms of Competence and Morality, whereas we found no differences in perceived Sociability. Social perception also differs between robots with more or less anthropomorphic appearances. In general, typically human attributes such as Morality cannot be transferred to robots without issues; in contrast, attributions of Competence and, in part, Sociability can be ascribed to robots more easily. Although the primary aim was to generate more profound insights into the social perception of EDTs on a general level, by replicating prior research on bionic devices for restoring user capabilities and enriching that research with (social) robotics, first practical implications can be derived from our findings. As shown, the use of bionic technologies can affect stereotypes and interpersonal perceptions. The introduction of exoskeletons and similar technologies in future work contexts, for example, might thus have unintended social repercussions that need to be accounted for.
Concerning (social) robots, these exploratory results should be considered when designing robots for primarily social environments, or at least when implementing robots in work settings. To realize this in practice, we suggest assessing the social perception of robots during implementation in order to prescreen their acceptance by the users who directly work with these EDTs, and to become aware of, and mitigate, possible unintended side effects. An economical and easy-to-implement approach might be the Social Perception of Robots Scale (SPRS) developed by the authors (Mandl et al. 2022b).
Abele, A. E., & Wojciszke, B. (2007). Agency and communion from the perspective of self versus others. Journal of Personality and Social Psychology, 93(5), 751–763. https://doi.org/10.1037/0022-3514.93.5.751.
Abele, A. E., Ellemers, N., Fiske, S. T., Koch, A., & Yzerbyt, V. (2021). Navigating the social world: Toward an integrated framework for evaluating self, individuals, and groups. Psychological Review, 128(2), 290–314. https://doi.org/10.1037/rev0000262.
Aquino, K., & Reed, A. (2002). The self-importance of moral identity. Journal of Personality and Social Psychology, 83(6), 1423–1440. https://doi.org/10.1037/0022-3514.83.6.1423.
Arlamovsky, M. (2019). ROBOLOVE. NGF—Nikolaus Geyrhalter Filmproduktion.
Asbrock, F. (2010). Stereotypes of social groups in Germany in terms of warmth and competence. Social Psychology, 41(2), 76–81. https://doi.org/10.1027/1864-9335/a000011.
Asch, S. E. (1946). Forming impressions of personality. Journal of Abnormal and Social Psychology, 41, 258–290.
Awad, E., Dsouza, S., Kim, R., Schulz, J., Henrich, J., Shariff, A., Bonnefon, J.-F., & Rahwan, I. (2018). The moral machine experiment. Nature, 563(7729), 59–64. https://doi.org/10.1038/s41586-018-0637-6.
Bartneck, C., Kanda, T., Ishiguro, H., & Hagita, N. (2007). Is the uncanny valley an uncanny cliff? In RO-MAN 2007—The 16th IEEE International Symposium on Robot and Human Interactive Communication (pp. 368–373). https://doi.org/10.1109/ROMAN.2007.4415111.
Bartneck, C., Kulić, D., Croft, E., & Zoghbi, S. (2009). Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. International Journal of Social Robotics, 1(1), 71–81. https://doi.org/10.1007/s12369-008-0001-3.
Beißert, H., Köhler, M., Rempel, M., & Beierlein, C. (2014). Eine deutschsprachige Kurzskala zur Messung des Konstrukts Need for Cognition: Die Need for Cognition Kurzskala (NFC-K). [A German-language short scale for measuring the construct Need for Cognition: The Need for Cognition Short Scale (NFC-K)] (Vol. 2014/32). GESIS—Leibniz-Institut für Sozialwissenschaften. https://www.gesis.org/fileadmin/_migrated/content_uploads/WorkingPapers_2014-32.pdf. Accessed August 2, 2022
Bliese, P. D., Maltarich, M. A., & Hendricks, J. L. (2018). Back to basics with mixed-effects models: nine take-away points. Journal of Business and Psychology, 33(1), 1–23. https://doi.org/10.1007/s10869-017-9491-z.
Carpinella, C. M., Wyman, A. B., Perez, M. A., & Stroessner, S. J. (2017). The robotic social attributes scale (RoSAS): Development and validation. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (pp. 254–262). https://doi.org/10.1145/2909824.3020208.
Cervantes, J.-A., López, S., Rodríguez, L.-F., Cervantes, S., Cervantes, F., & Ramos, F. (2020). Artificial moral agents: a survey of the current status. Science and Engineering Ethics, 26(2), 501–532. https://doi.org/10.1007/s11948-019-00151-x.
Chao, G. T., & Kozlowski, S. W. J. (1986). Employee perceptions on the implementation of robotic manufacturing technology. Journal of Applied Psychology, 71(1), 70–76. https://doi.org/10.1037/0021-9010.71.1.70.
Chita-Tegmark, M., Law, T., Rabb, N., & Scheutz, M. (2021). Can you trust your trust measure? In Proceedings of the 2021 ACM/IEEE International Conference on Human-Robot Interaction (HRI′21), March 8–11, 2021, Boulder, CO, USA (p. 9). New York: ACM. https://doi.org/10.1145/3434073.3444677.
Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2007). The BIAS map: Behaviors from intergroup affect and stereotypes. Journal of Personality and Social Psychology, 92(4), 631–648. https://doi.org/10.1037/0022-3514.92.4.631.
Cuddy, A. J. C., Fiske, S. T., & Glick, P. (2008). Warmth and competence as universal dimensions of social perception: the stereotype content model and the BIAS map. Advances in Experimental Social Psychology, 40, 61–149. https://doi.org/10.1016/S0065-2601(07)00002-0.
Demir, K. A., Döven, G., & Sezen, B. (2019). Industry 5.0 and human-robot co-working. Procedia Computer Science, 158, 688–695. https://doi.org/10.1016/j.procs.2019.09.104.
Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39(2), 175–191. https://doi.org/10.3758/BF03193146.
Fiske, S. T. (2018). Stereotype content: warmth and competence endure. Current Directions in Psychological Science, 27(2), 67–73. https://doi.org/10.1177/0963721417738825.
Fiske, S. T., Cuddy, A. J. C., Glick, P., & Xu, J. (2002). A model of (often mixed) stereotype content: Competence and warmth respectively follow from perceived status and competition. Journal of Personality and Social Psychology, 82(6), 878–902. https://doi.org/10.1037/0022-3514.82.6.878.
Franke, T., Attig, C., & Wessel, D. (2019). A personal resource for technology interaction: development and validation of the Affinity for Technology Interaction (ATI) scale. International Journal of Human-Computer Interaction, 35(6), 456–467. https://doi.org/10.1080/10447318.2018.1456150.
Gilotta, S., Spada, S., Ghibaudo, L., Isoardi, M., & Mosso, C. O. (2019). Acceptability beyond usability: a manufacturing case study. In S. Bagnara, R. Tartaglia, S. Albolino, T. Alexander & Y. Fujita (Eds.), Proceedings of the 20th Congress of the International Ergonomics Association (IEA 2018). IEA 2018. Advances in intelligent systems and computing, Vol. 824. Cham: Springer. https://doi.org/10.1007/978-3-319-96071-5_95.
Hannah, S. T., Avolio, B. J., & May, D. R. (2011). Moral maturation and moral conation: A capacity approach to explaining moral thought and action. Academy of Management Review, 36(4), 663–685. https://doi.org/10.5465/amr.2010.0128.
Heflick, N. A., Goldenberg, J. L., Cooper, D. P., & Puvia, E. (2011). From women to objects: Appearance focus, target gender, and perceptions of warmth, morality and competence. Journal of Experimental Social Psychology, 47(3), 572–581. https://doi.org/10.1016/j.jesp.2010.12.020.
Kervyn, N., Fiske, S., & Yzerbyt, V. (2015). Forecasting the primary dimension of social perception: symbolic and realistic threats together predict warmth in the stereotype content model. Social Psychology, 46(1), 36–45. https://doi.org/10.1027/1864-9335/a000219.
Koch, A., Imhoff, R., Dotsch, R., Unkelbach, C., & Alves, H. (2016). The ABC of stereotypes about groups: Agency/socioeconomic success, conservative-progressive beliefs, and communion. Journal of Personality and Social Psychology, 110(5), 675–709. https://doi.org/10.1037/pspa0000046.
Kupferberg, A., Glasauer, S., Huber, M., Rickert, M., Knoll, A., & Brandt, T. (2011). Biological movement increases acceptance of humanoid robots as human partners in motor interaction. AI & SOCIETY, 26(4), 339–345. https://doi.org/10.1007/s00146-010-0314-2.
Leach, C. W., Ellemers, N., & Barreto, M. (2007). Group virtue: The importance of morality (vs. competence and sociability) in the positive evaluation of in-groups. Journal of Personality and Social Psychology, 93(2), 234–249. https://doi.org/10.1037/0022-3514.93.2.234.
MacDorman, K. F. (2019). Masahiro Mori und das unheimliche Tal: Eine Retrospektive. Zenodo. https://doi.org/10.5281/ZENODO.3226274.
Mandl, S., Bretschneider, M., Meyer, S., Gesmann-Nuissl, D., Asbrock, F., Meyer, B., & Strobel, A. (2022a). Embodied digital technologies: First insights in the social and legal perception of robots and users of prostheses. Frontiers in Robotics and AI, 9, 787970. https://doi.org/10.3389/frobt.2022.787970.
Mandl, S., Bretschneider, M., Asbrock, F., Meyer, B., & Strobel, A. (2022b). The Social Perception of Robots Scale (SPRS): Developing and Testing a Scale for Successful Interaction Between Humans and Robots. In L. M. Camarinha-Matos, A. Ortiz, X. Boucher, & A. L. Osório (Eds.), Collaborative Networks in Digitalization and Society 5.0. 23rd IFIP WG 5.5 Working Conference on Virtual Enterprises, PRO-VE 2022, Lisbon, Portugal, September 19–21, 2022 Proceedings (pp. 321–334). Springer. https://doi.org/10.1007/978-3-031-14844-6_26.
Meyer, B., & Asbrock, F. (2018). Disabled or cyborg? How bionics affect stereotypes toward people with physical disabilities. Frontiers in Psychology, 9, 2251. https://doi.org/10.3389/fpsyg.2018.02251.
Mieczkowski, H., Liu, S. X., Hancock, J., & Reeves, B. (2019). Helping not hurting: Applying the stereotype content model and BIAS map to social robotics. In 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 222–229). https://doi.org/10.1109/HRI.2019.8673307.
Moor, J. H. (2006). The nature, importance, and difficulty of machine ethics. IEEE Intelligent Systems, 21(4), 18–21. https://doi.org/10.1109/MIS.2006.80.
Mori, M., MacDorman, K., & Kageki, N. (2012). The uncanny valley [from the field]. IEEE Robotics & Automation Magazine, 19(2), 98–100. https://doi.org/10.1109/MRA.2012.2192811.
Naneva, S., Sarda Gou, M., Webb, T. L., & Prescott, T. J. (2020). A systematic review of attitudes, anxiety, acceptance, and trust towards social robots. International Journal of Social Robotics, 12, 1179–1201. https://doi.org/10.1007/s12369-020-00659-4.
Osgood, C., Suci, G., & Tannenbaum, P. (1957). The measurement of meaning. American Journal of Sociology, 63, 550–551. https://doi.org/10.1086/222316.
Palan, S., & Schitter, C. (2018). Prolific.ac—A subject pool for online experiments. Journal of Behavioral and Experimental Finance, 17, 22–27. https://doi.org/10.1016/j.jbef.2017.12.004.
von der Pütten, A. M., & Krämer, N. C. (2012). A survey on robot appearances. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction—HRI ′12 (pp. 267–268). https://doi.org/10.1145/2157689.2157787.
Rosenthal-von der Pütten, A. M., Krämer, N. C., Becker-Asano, C., Ogawa, K., Nishio, S., & Ishiguro, H. (2014). The uncanny in the wild. Analysis of unscripted human-android interaction in the field. International Journal of Social Robotics, 6(1), 67–83. https://doi.org/10.1007/s12369-013-0198-7.
Sarda Gou, M., Webb, T. L., & Prescott, T. J. (2021). The effect of direct and extended contact on attitudes towards social robots. Heliyon, 7(3), e06418. https://doi.org/10.1016/j.heliyon.2021.e06418.
Sauppé, A., & Mutlu, B. (2015). The social impact of a robot co-worker in industrial settings. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems—CHI ′15 (pp. 3613–3622). https://doi.org/10.1145/2702123.2702181.
Savela, N., Turja, T., & Oksanen, A. (2018). Social acceptance of robots in different occupational fields: a systematic literature review. International Journal of Social Robotics, 10(4), 493–502. https://doi.org/10.1007/s12369-017-0452-5.
Savela, N., Oksanen, A., Pellert, M., & Garcia, D. (2021). Emotional reactions to robot colleagues in a role-playing experiment. International Journal of Information Management, 60, 102361. https://doi.org/10.1016/j.ijinfomgt.2021.102361.
Scheunemann, M. M., Cuijpers, R. H., & Salge, C. (2020). Warmth and competence to predict human preference of robot behavior in physical human-robot interaction. In 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (pp. 1340–1347). https://doi.org/10.1109/RO-MAN47096.2020.9223478.
Seltman, J. H. (2009). Experimental design and statistics. http://www.stat.cmu.edu/~hseltman/309/Book/Book.pdf. Accessed August 2, 2022
Siedl, S. M., & Mara, M. (2021). Exoskeleton acceptance and its relationship to self-efficacy enhancement, perceived usefulness, and physical relief: A field study among logistics workers. Wearable Technologies, 2, e10. https://doi.org/10.1017/wtc.2021.10.
Simmons, J., Nelson, L., & Simonsohn, U. (2012). A 21 word solution. SPSP Dialogue. https://doi.org/10.2139/ssrn.2160588
Statistisches Bundesamt (2021). Statista. Retrieved September 15, 2021, from https://de.statista.com/statistik/daten/studie/445223/umfrage/produtkionsmenge-von-mehrzweck-industrierobotern-in-deutschland/
Turja, T., & Oksanen, A. (2019). Robot acceptance at work: a multilevel analysis based on 27 EU countries. International Journal of Social Robotics, 11(4), 679–689. https://doi.org/10.1007/s12369-019-00526-x.
de Visser, E. J., Monfort, S. S., McKendrick, R., Smith, M. A. B., McKnight, P. E., Krueger, F., & Parasuraman, R. (2016). Almost human: anthropomorphism increases trust resilience in cognitive agents. Journal of Experimental Psychology: Applied, 22(3), 331–349. https://doi.org/10.1037/xap0000092.
Vornholt, K., Uitdewilligen, S., & Nijhuis, F. J. N. (2013). Factors affecting the acceptance of people with disabilities at work: a literature review. Journal of Occupational Rehabilitation, 23(4), 463–475. https://doi.org/10.1007/s10926-013-9426-0.
Wang, S., Lilienfeld, S. O., & Rochat, P. (2015). The uncanny valley: Existence and explanations. Review of General Psychology, 19(4), 393–407. https://doi.org/10.1037/gpr0000056.
Wiese, E., Weis, P. P., Bigman, Y., Kapsaskis, K., & Gray, K. (2021). It’s a match: Task assignment in human-robot collaboration depends on mind perception. International Journal of Social Robotics. https://doi.org/10.1007/s12369-021-00771-z.
Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)—Project-ID 416228727—SFB 1410.
Open Access funding enabled and organized by Projekt DEAL.
Conflict of interest
M. Bretschneider, S. Mandl, A. Strobel, F. Asbrock and B. Meyer declare that they have no competing interests.
Data Availability Statement
The datasets analyzed for this study can be found in the OSF Repository (https://osf.io/qz2ca/).
Bretschneider, M., Mandl, S., Strobel, A. et al. Social perception of embodied digital technologies—a closer look at bionics and social robotics. Gr Interakt Org 53, 343–358 (2022). https://doi.org/10.1007/s11612-022-00644-7
- Embodied Digital Technologies
- Social Robotics
- Social Perception