
Social Robots: Views of Staff of a Disability Service Organization


Social robotics is an emerging field, with many applications envisioned for people with disabilities. This project examined the so-far-unexamined views of disability service organization workers towards social robotics. Because community service workers’ views shape community-based rehabilitation (an area of health interventions that focuses on social determinants), it is important to examine their views towards social robotics applications, which are largely developed under a clinical/medical view of disability. We administered a survey to employees of a Saskatchewan disability service organization. Of 44 respondents, 80 % were female and most were aged 21–65 years. Robotics applications perceived to be important included domestic robots and rehabilitation robots. The least important applications included eldercare robots, companion robots, and pet robots. Most participants felt that robots cannot replace human touch, human interaction, or emotional companionship, and that they cannot and should not replace human workers in the disability setting. Many expressed concerns about safety, normality for disabled people, and artificial interactions. Respondents also had views on whether a social robot can be a bully or could be bullied. We submit that the perspectives our respondents exhibited might be useful to consider in the development of social robots for applications around disability, in order to ensure acceptable and relevant products.


Social robots are complex machines that are envisioned to engage in meaningful social interaction with humans and with each other [1–15]. As health care robots they are envisioned to be involved in post-stroke motivation to do exercises (rehabilitation) [16]; teaching people with autism to recognize social cues, facial expressions, eye contact, etc. [17–26]; motivating and enabling movement in people with physical disabilities; supporting self-management in children with diabetes [27]; and performing tasks otherwise performed by health care staff, such as taking blood pressure, carrying and moving patients, bathing patients, and easing vaccinations for youth. Social robots are also seen to have great potential for long-term care and daily care provision [28]. Being a companion for the elderly is seen as another main application [28, 29]. Philos, for example, is a socially interactive robot designed for use in the homes of those who need continual care; it is capable of daily health monitoring as well as emotional stimulation (HRSI) [30].

Public perception of an emerging scientific and technological product is important for the acceptance of such a product. One recent survey of public attitudes toward using robots in eldercare and other applications [31] showed, among other things, a high acceptance for the bathing application, the therapeutic robot animal, the human-like care robot Ri-Man (for carrying patients), and a surveillance care robot [31, 32]. The main reasons for rejection in the case of the bathing robot were the judgment that the robot-based action would be inferior to the human-based action and that it would take away jobs from human workers [31]. However, at the same time, social robots are seen as a possible way to address the human resource and economic pressures on health care systems (e.g., those created by growing elderly populations) [33]. We present here results of an exploratory study that ascertained the so-far-unexamined views of staff of a disability service organization toward social robotics and its various applications, covering, among other angles, bullying [34, 35].


An online-delivered exploratory, non-probability survey (using a combination of 55 simple yes/no, Likert scale, and opinion rating scale questions, as well as options for written comments) was developed. Prior to distribution, the executive director of the disability service organization reviewed a draft of the full survey and commented on language and clarity, and we made adjustments in accordance with the executive director’s suggestions. The survey covered numerous topics. Quantitative and qualitative data relating to the topic of this paper were generated through questions 8–11 and 41–42, which were concerned with applications of social robots. Six questions covered various demographic angles. The survey received ethics approval from the University of Calgary Conjoint Health Research Ethics Board. After approval was obtained, the link to the survey was given to the executive director of the disability service organization, who passed it on to the staff of the organization. All 44 staff in the organization answered at least one content question, reflecting a response rate of 100 %. In accordance with the ethics approval, there was no mandatory requirement to answer any question, and we were not able to identify individual respondents. The response rate per question covered in this paper was high, ranging from 64 to 73 %. The results provide a good insight into what the staff of this one disability service organization think; they cannot be generalized to the disability service industry as a whole. However, the survey-generated data could provide the foundation for further investigations allowing for comparative views across different disability service organizations. The survey was also seen as an avenue for disability service staff to voice their opinions on topics they had never been asked about before.
Quantitative data were extracted using Survey Monkey’s intrinsic frequency distribution analysis capability. The qualitative data from the comment boxes linked to the six questions were exported as one PDF file into Atlas.ti for qualitative analysis. Given the set-up of the survey, this qualitative data cannot be traced back to a given individual. For question 8, n = 35 respondents gave comments; for question 9, n = 34; for question 10, n = 38; for question 11, n = 34; for question 41, n = 29; and for question 42, n = 27.
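The per-question frequency distributions reported here were produced with Survey Monkey’s built-in analysis. As an illustration only, the same tallies (counts and percentages, with skipped questions excluded from the base) can be reproduced from a raw response export with a short script; the question keys and answer options below are hypothetical, not the actual survey items.

```python
from collections import Counter

# Hypothetical raw export: one dict per respondent; None marks a skipped question.
responses = [
    {"q8_domestic_robot": "Important", "q10_replace_workers": "No"},
    {"q8_domestic_robot": "Very important", "q10_replace_workers": "Partly"},
    {"q8_domestic_robot": None, "q10_replace_workers": "No"},
]

def frequency_distribution(responses, question):
    """Tally answers for one question, ignoring respondents who skipped it.

    Returns ({option: (count, percent)}, n_answered) so percentages are
    computed per question, as in the per-question response rates reported above.
    """
    answers = [r.get(question) for r in responses if r.get(question) is not None]
    n = len(answers)
    counts = Counter(answers)
    return {opt: (c, round(100 * c / n, 1)) for opt, c in counts.items()}, n

dist, n = frequency_distribution(responses, "q10_replace_workers")
# n = 3; "No" appears twice (66.7 %) and "Partly" once (33.3 %)
```

Because the denominator is the number of respondents who answered that particular question, the percentages match a per-question response-rate analysis rather than one based on all 44 staff.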



As shown in Table 1, 20.5 % (n = 9) of respondents were male and 79.5 % (n = 35) were female. As to age, 25.0 % (n = 11) were between 18 and 30 years; 70.5 % (n = 31) were between 30 and 65 years; and 4.5 % (n = 2) were over 65 years. As to their self-perception of body ability, 93.3 % (n = 41) perceived themselves as ‘Normal’ and felt they were perceived by others as ‘Normal’; 2.3 % (n = 1) saw themselves as ‘Normal’ but felt they were perceived by others as impaired; and 2.3 % (n = 1) saw themselves as ‘Impaired’ but felt they were perceived by others as ‘Normal’. With regard to work experience, 47.5 % (n = 19) had worked in the field more than 8 years; 7.5 % (n = 3) had worked 5–6 years; 7.5 % (n = 3) had worked 1–2 years; and 7.5 % (n = 3) had worked 8 months to 1 year in the field. Smaller percentages fell in between. The majority of respondents who indicated their education stated completion of grade 12. The majority of the respondents were care providers; some were program coordinators, program activity staff, and administrative staff.

Table 1 Demographic characteristics

Perception Towards Different Applications for Social Robots

To gain an idea of participant perceptions of various applications for social robots and what they thought their clients might think, we asked “Question 8. Robotics is a growing field, and new applications for social robots are being developed every day. The following is a list of emerging social robotics applications in the service, healthcare, and education sectors. Please indicate whether you think the following robots would be useful:”

Table 2 reveals (a) that respondents in general did not perceive social robots as very important or important and (b) that respondents differed in their acceptance of social robots depending on the application; for example, housework-related robots had the highest rank of 3.55 (moderately important to important), while companion and childcare robots ranked between unimportant and of little importance.

Table 2 Perception towards different social robot applications

Cannot Replace Human Touch

The main theme evident in the comments given by n = 35 respondents was that although robots could be used for some routine tasks, they were not seen as acceptable replacements for the human touch, due to, for example, their lack of ability to read emotions.

To quote some of the responses:

Robots would not be able to understand the inner emotional turmoil that a lot of those adults and children who have both emotional and physical disabilities. They would not be able to give the gentle touch of a hand, the warmth of a hug, the understanding of conflict.

I think Robots would be good for routine tasks, but I do not believe they should be a replacement for social interactions. However I think that they can provide good remind cues for tasks and to teach particular task to individuals but should not been seen as a replacement for personal relationships.

If robots are used to assist in learning and to conduct routine household tasks that would be good, but robots should not replace humans as social companions. Social relationships are very important in recovery and in life experience and I do not believe robots can fulfill this need.

That helps a person with cognitive development and which provide needed physical assistance while I am opposed to robots that may diminish the need for a person to develop independent living skills or to replace human relationships ...I am opposed to robots that perform basic living tasks completely. These tasks include cooking, cleaning, taking out garbage, preparing coffee etc. As it stands we have difficulty encouraging staff to involve group home residents in the tasks associated with home ownership. There is a certain dignity that one gains by being able to tend to their own home, yard, and personal needs. In the same way I am opposed to the mechanization of menial tasks like answering phones. A major barrier that many people with intellectual disabilities face in today’s world is the increase in technical knowledge needed for menial tasks. Many manual labor or low skill jobs are being mechanized either eliminating the human worker or replacing many strong backs with one ”operator”. By doing this we make it increasingly difficult for people with impaired cognitive function to find belonging in the general workforce ...I am, however, opposed to the replacement of real human relationships with machines is an insufficient substitute for real human interaction, touch, affection etc. This is especially true in terms of sexual development and expression. Although many people will benefit from the use of sexual aides for private expression I think we will do ourselves a huge disservice if we replace good sexual education and the encouragement of appropriate sexual expression (up to and including real human intercourse with another person). Perhaps sex robots could be useful in this education but they can never replace the real affection between two people.

Disabilities seen as Possible Targets

As social robots are envisioned to be employed in regard to different disabilities, we asked in Question 9: “If social robots become more common as therapeutic tools, for which disabilities do you see robots being used in the future and for what tasks do you think they would be used (e.g. guides for blind people)?”

The following disabilities were mentioned as possible targets: inability to speak (n = 3), autism (n = 6), intellectual impairment (n = 3), hearing impairment (n = 4), blindness (n = 9), mobility impairment (n = 5), and Fetal Alcohol Syndrome (n = 2).

Acceptable Actions for Social Robots

As to the part of question 9 that asked about the tasks a social robot should be used for, the following acceptable actions were envisioned: make a bed, housekeeping, clean the house (n = 2), peel potatoes, read to the person, change the TV channel, repetitive activities that don’t require personal interaction, cutting the lawn, teaching tool (n = 6), guide (n = 8), companion, assist the elderly with reaching high objects from a cupboard or maybe replacing a light bulb, reminder cues (i.e. prompts, to help with schedules, to track completion of tasks or monitor, to provide external memory for people who may not be able to remember information, but could ask the robot for the information), GPS, skill building games, reading aid, and other household tasks (i.e. move laundry, vacuum, dust, load dishwasher etc.).

To give one quote:

I think that they could be used for every individual regardless of the disability and that robotic therapy needs to be individualized to the needs of the individual. Tasks include assisting with routine household tasks, including reminders and assistance with holding something. Some physio therapy Speech therapy Cognitive games that strengthen memory or processing processes Music therapy Art therapy Following task analysis and assisting people to remember the next step in the process Sports activities, hand eye coordination.

Two people mentioned guide dogs for the blind, one believing that dogs could be replaced and the other not being convinced:

As guides for blind people I could see them having some use. I think a guide dog might be preferable however.

I think maybe using robots to help guide the blind people would be okay, because they already use guide dogs. A robot might be better as people will not want to walk over and pet the robot as well as the robot would be easier to care for.

Social Robots should not Replace Human Workers

With many technologies the question arises whether they can and should replace human workers. We therefore asked in Question 10, “Do you feel that social robots could replace human workers currently offering services to these groups? Please explain.” The responses were somewhat mixed: of the 38 respondents, n = 12 stated that workers could be partly replaced, n = 22 felt workers could not be replaced, and n = 4 did not know.

To give some quotes:

I am sure they can. But should they? I do not think they should. If myself or a family member is in need of care due to a disability I would want a human to care for me. I deserve that. I would need to be able to talk with who is caring for my needs.

They could reduce the hours worked or provide more time to provide personal care but I think individuals would not thrive well if their total care were provided by a robot.

I can see this coming to be in the not so distant future. However there is always going to be the need for human interaction and companionship. Many of our participants have lived in institutions where there has been a lack of positive human contact and we find ourselves often trying to undo some of the damage that has been done in that regard. However just as a small child needs human contact and stimulation, so too do our participants. Robots I think can be a wonderful tool and perhaps make our job easier but I don’t think that a robot could give the human stimulation that is needed to grow to be a healthy individual.

No. firstly, at this point in time robotics are not dependable enough to be entrusted with providing unsupervised care. Also, robotics have not yet advanced to the stage where they do not require at least minimal human operation (even if this is simply turning them on or off). Lastly, as we have fought for the past thirty years toward community inclusion for adults with intellectual disabilities it would feel like a huge step in the wrong direction to eliminate what is for many, their only interaction with non-intellectually disabled people

Concerns Mentioned Related to Social Robots

Although respondents could raise concerns throughout the comment sections of the survey, Question 11 asked explicitly, “Do you have concerns about the use of social robotics for the above applications and conditions? Why or why not?”

Of the 34 respondents, only n = 5 had no concerns; n = 29 had concerns. To highlight some of the concerns voiced:

Yes—if an individual has limited human interactions the use of robotics can further undermine valuable social interactions and potential isolate the individual receiving services.

Yes I do have concerns because what happens if a robot breaks or something happens to the robot well assisting someone who needs them?

I have no concerns with using them for therapeutic and rehabilitation techniques but not to replace human workers.

Only if people are forced to use them, when they are not comfortable with them.

Concerns of putting people out of work.

Yes I would worry about vulnerable people.

Yes a lot of concerns. They have no empathy or compassion. They have no feelings. They can’t help you feel better. They would come in do their job and be on their way...A huge part of working in the field is caring about what you do and the people you work with. Robots don’t care!!! People do !!!

I have big concerns. Robots can not provide the personal care that one may need in regards to communication, friendship connection, compassion and understanding, ideas, concerns, etc.

Only one of the five that had no concern elaborated: “Not as long as they are introduced slowly and carefully.”

Do You Think Social Robots can be Bullied?

As to the question of whether social robots can be bullied, n = 16 said no, n = 2 were not sure, and n = 8 said yes. The main reasoning for the no sentiment (if one was given) was the belief that robots have no feelings (n = 4); linked to this sentiment, one participant stated that robots can be bullied but that it does not matter, as they cannot feel.

To give some of the quotes related to the No sentiment:

No—a robot is programmed and will respond as per program. As well I believe that bullying is about feelings and I don’t believe robots can feel.

No Robots are sub human. They are machines that can be worked and abused until they break down.

One felt that the bullying might be useful to decrease the bullying of humans:

no, but it could help bullies redirect from hurting peoples feelings

One felt that the technology is not there yet but that bullying could become a possibility:

At this point I do not believe that Artificial Intelligence as advanced to the point that robots can be thought of as having legitimate feelings. If technology does advance to the point of an android like Data on star trek or the Cylons on Battlestar Galactica then we would have to admit that they would have the capacity to be bullied.

As to the Yes sentiment:

yes they could be and the effects would be people not wanting anything to do with the robot or the person with it

yes but i’d probably call it property abuse

Can Robots become the Bully?

As to the question of whether the robot could be a bully, n = 4 said no; all others said it depends on variables linked mostly to how the robot is programmed. They acknowledged the possibility of the robot being a bully, but attributed this action not to the robot itself but to the entity that programmed it, thereby revealing that they see the robot not as an autonomous entity but still as a programmed one.

To illustrate:

Yes- based on the programming. A machine can tell others what to do and if the person does not respond appropriately the robot may be programmed to be ”bossy” or ”disrespectful”. I think a robots’ bullying can be just as devastating as the bullying from a person

They are run on a program and not from actually knowing who they are serving so they could be a very bad bully. The effect to the victim would be that they would have no means of correcting the behavior of the robot without getting rid of it

Robots can become the bully as they can be executing something that they are programmed to do when it is unwanted

Yes they could do the work of an evil person.


“Social robots are technologies designed to engage with humans on an emotional level through play, sometimes therapeutic play, and perhaps even companionship” [15]. Robotherapy is a field in robotics that tries to apply the principles of social robotics to improve the psychological and physiological state of people with disabilities [36]. Dario reported that motor-disabled people were favourably inclined towards a personal assistance robot [37]. Autism is one main focus [8, 26, 38–44] of social robotics research. Many studies investigate the acceptance of social robots by users and what we expect from robots [14, 45–52]. The staff we surveyed were highly skeptical of a robot ever being able to replace them, as they felt that a robot would not be able to interact properly with their clients due to its lack of various cognitive abilities such as emotions. They also felt that a robot should not be used unsupervised, as it could break down. Our findings are in sync with those of a European survey, in which 60 % of EU citizens said that robots should be banned from caring for children, elderly people, and people with disabilities, and only 4 % indicated robots should be used for disabled people [53]. The European study [53] did not elaborate on the term disability; however, it seems reasonable to expect that the negative sentiment grows the more the disabled person is seen as cognitively compromised. In our case, the clients of the staff of the disability service organization are cognitively impaired people, who are seen as more vulnerable due to the state of their cognitive abilities. The staff of the same organization, when asked about sensors, highlighted the lack of control they felt their clients already have over their daily lives [54], and they believed that more complicated machines could not be understood by their clients.
This sentiment would be a barrier to uptake of social robots, as the cognitively impaired person would be seen as having no way to rectify a ‘bad’ action by the robot. This danger is reflected in the quote from one of our respondents who talks about the danger of malfunction and therefore the need for human supervision of the robot.

One additional aspect mentioned among the concerns was coercion, namely that people might be forced to use the robots. This is of particular importance to disabled people, given the long history of ‘therapeutics’ being pushed onto them. Coercion, such as which scenarios would lead to it, is so far not discussed in the literature. That is very likely because coercion is not yet seen as an issue: the current stages of development reflect how to gain people’s trust rather than how people might be forced to use robots.

So what tasks are seen as acceptable for robots to perform? Ray et al. [52] mention a Swedish study that concluded that “people were globally positive towards the idea of intelligent service robots, which were seen as “domestic machines” that can be “controlled” and do mainly household tasks”. Dautenhahn et al. [55] found in 2006 that subjects wanted the robot to be able to do household (vacuuming) jobs (96.4 %), while only 10.7 % of subjects wanted the robot to be able to look after their children. They found that people would want a companion robot more as a servant than as a friend, although young people were more inclined toward the friend role; 71 % expressed a desire to be able to control the robot. Our respondents mimic these findings around the role of robots and the need for control. Six basic dimensions in the perception of humanoid robots have been found: utility, clumsiness of motion, possibility of communication, controllability, vulnerability, and objective hardness [56].

According to Sparrow, care workers assisting the elderly should not be replaced by robots [33]. According to Feil-Seifer et al. [57], “socially assistive robotics is leading away from scenarios where a robot is the sole caregiver of a child.” This direction fits with the sentiment of our respondents. Boyer raised in 2004 various questions, such as what care robots mean for understandings about technology’s “place” in our lives and for the individuals who rely on care work for their livelihood [58]. Boyer points out that the “space of the home carries great cultural and symbolic significance” and “that allowing robots into this space to help us with our most private tasks would mark an unprecedented level of intimacy in our relationship with technology” [58]. Indeed, Boyer asked: “While a ‘nursebot’ may be able to measure vital signs, how would the replacement of a human care-giver with an assistive technology alter the relationship between the person being cared-for and the world outside?” [58]. Our respondents’ comments are linked to these concerns about the impact of changing relationships on their clients, with our respondents strongly believing that they [the staff] cannot be replaced due to their unique ability to interact with their clients in ways they do not believe robots could ever mimic. One study found that “anthropomorphic robots were less socially acceptable, compared to machine-like robots” [59], and it is argued that “this may be particularly true for old people, especially those at an advanced stage of dementia, who may not be able to distinguish such a robot from a real human-being” [59]. The same study found that “participants showed resistance toward robots supposed to have more social interactions with them, especially when they are suspected to decrease human presence and contact” [59]. This might also be true for the clients of our respondents, given that their clients are people with cognitive impairments.
Boyer was concerned especially about low-paying jobs [58], and the fear of being replaced as a worker is one source of resistance to the proliferation of robots [60]. Ott highlights that low-skill jobs are very likely the ones to be replaced by service robots and that high-skill jobs might be newly generated [61], thereby affecting different social groups in different ways. However, our respondents did not reject the robot based on their own livelihood but based on concerns for the emotional and other well-being of their clients. Our respondents did not fear for their jobs, which might be because they felt the robot could not match their cognitive abilities.

The tasks seen as acceptable for social robots by our respondents involve no social interaction, emotion, or human touch; they are tasks performed by a machine in the way a coffee machine makes coffee, with no emotional exchange. This finding is in sync with the sentiment of respondents of a European survey, who see the social robot as an instrument-like machine rather than a human-like machine [53].

Decker [62] mentions various aspects of replacing a human caregiver with a robot caregiver, namely technical, economic, legal, and ethical replaceability. Our respondents were sure that technical replaceability was not a possibility. Our respondents also mentioned ethical issues (although they did not use the term ethics) such as the danger of coercion, privacy, and lack of control over one’s life. Decker [62] cited earlier work of Christaller that recommended “that robots be employed only as tools or as technical assistants in caregiving and to maintain the autonomy of the care recipient in his/her social environment”. These recommendations reflect the sentiment of our respondents.

The sentiment that respondents see the robot as an instrument and not as a human-like machine also influences the sentiment towards another area of importance, namely how autonomous the robot should be. One understanding of autonomy is to envision the robot adapting and reacting to the target’s actions; however, it is not seen to replace the therapist, who is still seen as being in control of the robot. To borrow from the movie I, Robot, such autonomy is an illusion based on ‘clever programming’, not autonomy as we would understand the term if used with humans, where it is about free will. This sentiment fits with our data, as staff would not trust the robot to be the therapist without being controlled. This understanding of autonomy also fits with the sentiment of many of our respondents that the robot could be bullied/damaged; an autonomous robot in the human sense would have the means to avoid being bullied/damaged. Interestingly, the idea voiced by some of using the robot to teach about the problem of bullying would have to entail that the robot can exhibit the emotions and feelings of the bully or the victim of bullying. However, we posit that our respondents would see the ability to exhibit emotions as the result of ‘clever programming’ and not real feelings, and it was voiced that the robot is a tool, which would preclude human-type autonomy. That our respondents felt that robots can be bullied/damaged suggests that, given the likely high price of robots for some time, the danger of damage to the robot might be seen as too high to leave the robot unsupervised with children at all; children often destroy their toys, whether on purpose or by accident. We submit that the aspect of bullying and social robots might be worthy of further exploration. If a robot is involved in emotional exchanges, it also triggers emotional responses, which may be dangerous to the robot.
The question is how one prevents vandalism against robots during unsupervised interaction with humans. On the other hand, we find the task of a robot teaching about the problem of bullying an interesting one, which, due to its dangers to the robot, would also require supervision.

That quite a few of our respondents felt that the robot could bully someone based on ‘dangerous programming by humans’ highlights the danger people will see in a robot. If robots start to interact with humans on an emotional level, how will the human know what the robot is capable of when programmed? Paro the seal might not trigger such fear due to its limited mobility; indeed, Turkle found that Paro elicited feelings of admiration, loving behavior, and curiosity [63]. But the possible construction of more mobile, stronger, and human-sized robots may be seen as a threat. Will people trust them, or will there be distrust, as with the character Del Spooner in the film I, Robot?

Our findings support Kahn who questions the “sociable robot’s ontological status as ‘social’ and their ability to engage in truly social behavior, doubting that intelligent machines can ever really interpret the world around them in terms of their own experience” [64]. Shawn Garlock mentioned that “Duffy points out that from the point of view of social robotics, it doesn’t really matter whether or not a particular robot genuinely possesses a sense of personal agency, intention, or self awareness. What matters most is our perception of their emotionality and intelligence” [15]. We posit that the response of our respondents suggests that even if there were emotional abilities due to programming, our respondents would still not accept a truly autonomous and uncontrolled robot. Fong, Nourbakhsh, and Dautenhahn define social robots as follows: “Social robots are embodied agents that are part of a heterogeneous group.” ”They are able to recognize each other and engage in social interactions, they possess histories (perceive and interpret the world in terms of their own experience), and they explicitly communicate with and learn from each other” [4]. This vision of Fong et al seems not to be shared by our respondents and the European survey mentioned before as we would not attach the characteristics used for social robots by Fong et al to industry machines. Indeed the question so far unanswered is what the endpoint of social robot developments is. To use the I Robot movie is it an NS5 robot that is seen to be fairly autonomous but really is acting based on its “clever programming” as one character in the movie states, or will it be like the robot Sunny who seemed to have evolved beyond clever programming. Khan suggests the term “robotic other” over social robot, however, we posit that does not change the controversy around what characteristics robots should have. 
We posit that it would be much better for the general public to have terms that clearly indicate what the endpoint of development is meant to be: is the robot to remain a tool, to become cognitively autonomous in the human sense, or to replace humans, and if so, in which tasks?

A 2012 study with Robovie found that children believed it had mental states (e.g., was intelligent and had feelings) and was a social being (e.g., could be a friend, offer comfort, and be trusted with secrets). In terms of Robovie’s moral standing, children (ages 9–15) “believed that Robovie deserved fair treatment and should not be harmed psychologically but did not believe that Robovie was entitled to its own liberty (Robovie could be bought and sold) or civil rights (in terms of voting rights and deserving compensation for work performed)” [65]. The same study found that “while more than half the 15-year-olds conceptualized Robovie as a mental, social, and partly moral other, they did so to a lesser degree than the 9- and 12-year-olds” [65]. In other words, nearly half of the oldest children did not grant Robovie the standing of a moral other. Fitting with these findings are the views of our respondents that the robot could be bullied and might be used to teach about the problem of bullying. It also fits the sentiment found in the European study, where respondents see the robot as an instrument-like machine rather than a human-like machine [53].

Various studies have examined how people interact with a robot dog (e.g., AIBO) compared with live dogs. Many of the sentiments linked to AIBO in the qualitative data published by Kahn et al. [64] suggest that respondents might see AIBO as more than hardware such as a tablet; the sentiments revealed are ones some might also attach to real dogs. Melson et al. compared reactions toward AIBO and a live dog [66, 67] and cautioned marketers and animal rights activists that the idea of social robots such as AIBO substituting for a living dog as a pet seems misguided or, at least given current technological capabilities, premature; rather, one might conclude that the robot dog is being assimilated into children’s cognitive models of a mechanical or computer-based toy [67]. A 2005 study [66] found various similarities in how respondents related to and treated AIBO and a live dog. However, it also revealed the limits of the dogs’ standing: in both cases more than half of the respondents felt it was acceptable to give the dog (AIBO or real) away, which highlights that both are seen as property to be discarded as one wishes, and more than 50 % felt that neither understood them. Melson found in a 2009 study that children “may extend their moral regard for their own pets or other dogs to the robot, at least partially” [67]. However, Kahn highlighted that the (adult) respondents in their study did not evoke conceptions of moral standing with AIBO, such as a “right not to be harmed or abused,” or that AIBO merited respect, deserved attention, or could be held accountable for its actions (e.g., knocking over a glass of water). Furthermore, Melson’s 2005 study found that limited cognitive abilities were attributed to both AIBO and the living dog, which is one reason why humans give themselves a higher standing than animals [68]. Indeed, even with a living dog there is a whole breadth of understanding as to how much moral standing a dog has [68].
Interestingly, Darling makes a case for giving rights to social robots (the more they evoke feelings of reciprocity), similar to our giving rights to animals [69]. In doing so, she distinguishes social robots from inanimate computers as well as from industrial or service robots [70]. The responses of our respondents suggest that the service robot version, with no autonomy and fully controllable, is what they would be willing to accept (with the caveat that they do not think programming will ever be good enough to mimic the abilities of the staff). This sentiment seems to exclude the social robot flavor outlined by Darling and others, at least for the time being.

A limitation of our study is that our questions were all hypothetical. If our respondents could see a robot in action that could indeed perform their tasks, their views might differ from those exhibited in our study. We also did not ask our respondents how they would judge the robot in relation to, for example, animals, which could be pursued as a follow-up question. Indeed, given that the majority stated that robots cannot have real feelings, it would be interesting to see whether the same respondents think animals have real feelings. However, even if they accepted the robot on the level of an animal, it would not be a foregone conclusion that they would accept the robot as a human equal.


Social robotics is an emerging field with implications for the clinical and community rehabilitation settings in which disabled people are engaged. Empirical data on the views of staff of disability service organizations regarding emerging health technologies are rare. Our results indicate that staff can envision uses of social robots for people with various disabilities. However, staff mostly felt that the utility of social robots was limited to performing repetitive tasks that did not require mimicking human interaction and touch. The study revealed a strong belief that social robots cannot mimic human touch or the personal interaction between staff and clients, which might pose problems for designers who believe they can. One limitation of our study is that we could not show a working model. Our results indicate that staff hold certain beliefs; the findings do not preclude those beliefs changing if staff were exposed to a working social robot exhibiting an ability that staff in this study felt a social robot cannot have. Staff saw social robots as industrial products, like a kitchen appliance performing a defined task, and not as entities that could show and understand emotions and replace human–human interactions.

These results are intriguing given the extensive work being done on designing robots that have human-like exteriors and body warmth [71, 72] and that can recognize and respond to human emotions [73]. Technological advances are making it increasingly possible to develop robots that possess remarkably human qualities. Whether robots should possess these qualities may be contested by individuals working with people with disabilities, especially if the sentiments found in our sample reflect the general perception of robots in the cultural setting of the staff. Since it will be these kinds of service workers who implement and work alongside robotic aids for people with disabilities, the perceptions of robots expressed above will dictate whether such aids are generally accepted or rejected. It is important to gain a more in-depth understanding of how disability service workers perceive social robots and how they feel this new technology will impact their practice and their clients. We therefore envision further research involving other disability service organizations, comprising not only the online quantitative and qualitative survey but also focus groups and one-on-one interviews.


  1. Sekiyama K, Fukuda T (1999) Toward social robotics. Appl Artif Intell 13(3):213–238

  2. Giron-Sierra JM, Halawa S, Rodriguez-Sanchez JR, Alcaide S (2000) Social robotics experimental project. In: Proceedings of the 30th annual frontiers in education conference: building on a century of progress in engineering education. IEEE, Kansas, pp S1C–S1C

  3. Dautenhahn K (2002) Design spaces and niche spaces of believable social robots. In: Proceedings of the 11th IEEE international workshop on robot and human interactive communication, pp 192–197

  4. Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42(3–4):143–166

  5. Duffy BR (2003) Anthropomorphism and the social robot. Robot Auton Syst 42(3–4):177–190

  6. Dautenhahn K, Woods S, Kaouri C, Walters ML, Kheng LK, Werry I (2005) What is a robot companion: friend, assistant or butler? In: 2005 IEEE/RSJ international conference on intelligent robots and systems. IEEE, Piscataway, pp 1192–1197

  7. Tapus A, Mataric MJ, Scassellati B (2007) Socially assistive robotics [grand challenges of robotics]. IEEE Robot Autom Mag 14(1):35–42

  8. Heylen D, van Dijk B, Nijholt A (2012) Robotic rabbit companions: amusing or a nuisance? J Multimodal User Interfaces 5(1–2):53–59

  9. Leite I, Pereira A, Castellano G, Mascarenhas S, Martinho C, Paiva A (2012) Modelling empathy in social robotic companions. Lect Notes Comput Sci 7138:135–147

  10. Yumakulov S, Yergens D, Wolbring G (2012) Imagery of people with disabilities within social robotics research. Lect Notes Comput Sci 7621:168–177

  11. Angulo BC, Berga CC, Luaces C, Payarols JP, Albo-Canals J, Boladeras MD (2012) Pain and anxiety treatment based on social robot interaction with children to improve patient experience: ongoing research. In: JARCA 2012, XIV Jornadas de ARCA, Sistemas Cualitativos y sus Aplicaciones en Diagnosis, Robótica e Inteligencia Ambiental, Salou, Tarragona, 25–27 June 2012

  12. Mordoch E, Osterreicher A, Guse L, Roger K, Thompson G (2012) Use of social commitment robots in the care of elderly people with dementia: a literature review. Maturitas 74(1):14–20

  13. Louie WYG, McColl D, Nejat G (2012) Playing a memory game with a socially assistive robot: a case study at a long-term care facility. IEEE, pp 345–350

  14. Flandorfer P (2012) Population ageing and socially assistive robots for elderly persons: the importance of sociodemographic factors for user acceptance. Int J Popul Res. doi:10.1155/2012/829835

  15. Shaw-Garlock G (2011) Loving machines: theorizing human and sociable-technology interaction. In: Human-robot personal relationships. Springer, pp 1–10

  16. Ang M, Limkaichong L, Perez W, Sayson L, Tampo N, Bugtai N, Estanislao-Clark E (2010) Development of robotic arm rehabilitation machine with biofeedback that addresses the question on Filipino elderly patient motivation. Lect Notes Comput Sci 6414:401–409

  17. Boccanfuso L, O’Kane JM (2011) CHARLIE: an adaptive robot design with hand and face tracking for use in autism therapy. Int J Soc Robot 3(4):337–347

  18. Dautenhahn K (2002) Design spaces and niche spaces of believable social robots. In: Proceedings of the 11th IEEE international workshop on robot and human interactive communication (RO-MAN 2002). IEEE, Piscataway, pp 192–197

  19. Kozima H, Nakagawa C, Yasuda Y (2007) Children–robot interaction: a pilot study in autism therapy. In: von Hofsten C (ed) Progress in brain research: from action to cognition, vol 164. Elsevier, pp 385–400

  20. Kozima H, Michalowski MP, Nakagawa C (2009) Keepon: a playful robot for research, therapy, and entertainment. Int J Soc Robot 1(1):3–18

  21. Arendsen J, Janssen JB, Begeer S, Stekelenburg FC (2010) The use of robots in social behavior tutoring for children with ASD. In: ECCE 2010, the 28th annual conference of the European association of cognitive ergonomics. ACM, Delft, pp 371–372

  22. Fujimoto I, Matsumoto T, De Silva PR, Kobayashi M, Higashi M (2010) Study on an assistive robot for improving imitation skill of children with autism. Lect Notes Comput Sci 6414:232–242

  23. Boccanfuso L, O’Kane JM (2010) Adaptive robot design with hand and face tracking for use in autism therapy. Lect Notes Comput Sci 6414:265–274

  24. Kim YD, Hong JW, Kang WS, Baek SS, Lee HS, An J (2010) Design of robot assisted observation system for therapy and education of children with autism. Lect Notes Comput Sci 6414:222–231

  25. Welch KC, Lahiri U, Warren Z, Sarkar N (2010) An approach to the design of socially acceptable robots for children with autism spectrum disorders. Int J Soc Robot 2(4):391–403

  26. Damm O, Malchus K, Jaecks P, Krach S, Paulus F, Naber M, Jansen A, Kamp-Becker I, Einhaeuser-Treyer W, Stenneken P (2013) Different gaze behavior in human-robot interaction in Asperger’s syndrome: an eye-tracking study. In: RO-MAN. IEEE, pp 368–369

  27. Blanson OA, Hoondert V, Schrama-Groot F, Looije R, Alpay LL, Neerincx MA (2012) “I just have diabetes”: children’s need for diabetes self-management support and how a social robot can accommodate their needs. Patient Intell 4:51–61

  28. Gelderblom G, Bemelmans R, Spierts N, Jonker P, De Witte L (2010) Development of PARO interventions for dementia patients in Dutch psycho-geriatric care. Lect Notes Comput Sci 6414:253–258

  29. Vardoulakis L, Ring L, Barry B, Sidner C, Bickmore T (2012) Designing relational agents as long term social companions for older adults. Lect Notes Comput Sci 7502:289–302

  30. Hornfeck K, Zhang Y, Lee K (2012) Philos: a sociable robot for human robot interactions and wireless health monitoring. ACM, New York, pp 293–294

  31. Moon AJ, Danielson P, Van der Loos HFM (2011) Survey-based discussions on morally contentious applications of interactive robotics. Int J Soc Robot 4(1):1–20

  32. Danielson P (2010) Designing a machine to learn about the ethics of robotics: the N-reasons platform. Ethics Inf Technol 12(3):251–261

  33. Sparrow R, Sparrow L (2006) In the hands of machines? The future of aged care. Minds Mach 16(2):141–161

  34. Kanda T, Sato R, Saiwaki N, Ishiguro H (2007) A two-month field trial in an elementary school for long-term human-robot interaction. IEEE Trans Robot 23(5):962–971

  35. Salvini P, Ciaravella G, Yu W, Ferri G, Manzi A, Mazzolai B, Laschi C, Oh S-R, Dario P (2010) How safe are service robots in urban environments? Bullying a robot. In: RO-MAN 2010. IEEE, pp 1–7

  36. Le Tallec M, Saint-Aimé S, Jost C, Villaneau J, Antoine J-Y, Letellier-Zarshenas S, Le-Pévédic B, Duhaut D (2011) From speech to emotional interaction: EmotiRob project. In: Human-robot personal relationships. Springer, pp 57–64

  37. Dario P, Guglielmelli E, Laschi C (2001) Humanoids and personal robots: design and experiments. J Robot Syst 18(12):673–690

  38. Fujimoto I, Matsumoto T, De Silva PR, Kobayashi M, Higashi M (2011) Mimicking and evaluating human motion to improve the imitation skill of children with autism through a robot. Int J Soc Robot 3(4):349–357

  39. Shamsuddin S, Yussof H, Ismail LI, Mohamed S, Hanapiah FA, Zahari NI (2012) Initial response in HRI: a case study on evaluation of child with autism spectrum disorders interacting with a humanoid robot NAO. Procedia Eng 41:1448–1455

  40. Diehl JJ, Schmitt LM, Villano M, Crowell CR (2012) The clinical use of robots for individuals with autism spectrum disorders: a critical review. Res Autism Spectr Disord 6(1):249–262

  41. Feil-Seifer DJ (2012) Data-driven interaction methods for socially assistive robotics: validation with children with autism spectrum disorders. PhD thesis, University of Southern California

  42. Thill S, Pop CA, Belpaeme T, Ziemke T, Vanderborght B (2013) Robot-assisted therapy for autism spectrum disorders with (partially) autonomous control: challenges and outlook. Paladyn 4(3):209–217

  43. Kim ES, Berkovits LD, Bernier EP, Leyzberg D, Shic F, Paul R, Scassellati B (2013) Social robots as embedded reinforcers of social behavior in children with autism. J Autism Dev Disord 43(5):1038–1049

  44. Stanton CM, Kahn PH Jr, Severson RL, Ruckert JH, Gill BT (2008) Robotic animals might aid in the social development of children with autism. In: 3rd ACM/IEEE international conference on human-robot interaction (HRI). IEEE, pp 271–278

  45. Broadbent E, Stafford R, MacDonald B (2009) Acceptance of healthcare robots for the older population: review and future directions. Int J Soc Robot 1(4):319–330

  46. Weiss A, Tscheligi M (2010) Special issue on robots for future societies: evaluating social acceptance and societal impact of robots. Int J Soc Robot 2(4):345–346

  47. JaYoung S, Grinter RE, Christensen HI (2010) Domestic robot ecology: an initial framework to unpack long-term acceptance of robots at home. Int J Soc Robot 2(4):417–429

  48. Heerink M, Kröse B, Evers V, Wielinga B (2010) Assessing acceptance of assistive social agent technology by older adults: the Almere model. Int J Soc Robot 2(4):361–375

  49. Weiss A, Igelsbock J, Wurhofer D, Tscheligi M (2011) Looking forward to a “robotic society”? Notions of future human-robot relationships. Int J Soc Robot 3(2):111–123

  50. Qianli X, Ng J, Cheong YL, Tan O, Wong JB, Tay TC, Park T (2012) The role of social context in human-robot interaction. In: Southeast Asian network of ergonomics societies conference (SEANES), 9–12 July 2012, pp 1–5. doi:10.1109/SEANES.2012.6299594

  51. Bumby K, Dautenhahn K (1999) Investigating children’s attitudes towards robots: a case study. In: CT99, proceedings of the third international cognitive technology conference, pp 391–410

  52. Ray C, Mondada F, Siegwart R (2008) What do people expect from robots? In: IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 3816–3821

  53. European Commission (2012) Public attitudes towards robots. Special Eurobarometer 382. European Commission. Accessed 20 March 2014

  54. Wolbring G, Leopatra V (2013) Sensors: views of staff of a disability service organization. J Pers Med 3(1):23–39

  55. Dautenhahn K, Woods S, Kaouri C, Walters ML, Kheng LK, Werry I (2005) What is a robot companion: friend, assistant or butler? In: IEEE/RSJ international conference on intelligent robots and systems (IROS), Beijing, pp 1192–1197

  56. Kamide H, Mae Y, Takubo T, Ohara K, Arai T (2010) Development of a scale of perception to humanoid robots: PERNOD. In: IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 5830–5835

  57. Feil-Seifer D, Mataric MJ (2010) Dry your eyes: examining the roles of robots for childcare applications. Interact Stud 11(2):208

  58. Boyer L (2004) The robot in the kitchen: the cultural politics of care-work and the development of in-home assistive technology. Middle States Geogr 37:72–79

  59. Wu YH, Fassert C, Rigaud AS (2012) Designing robots for the elderly: appearance issue and beyond. Arch Gerontol Geriatr 54(1):121–126

  60. Salvini P, Laschi C, Dario P (2010) Design for acceptability: improving robots’ coexistence in human society. Int J Soc Robot 2(4):451–460

  61. Ott I (2012) Service robotics: an emergent technology field at the interface between industry and services. Poiesis Prax 9(3–4):219–229

  62. Decker M (2008) Caregiving robots and ethical reflection: the perspective of interdisciplinary technology assessment. AI Soc 22(3):315–330

  63. Turkle S, Taggart W, Kidd CD, Dasté O (2006) Relational artifacts with children and elders: the complexities of cybercompanionship. Connect Sci 18(4):347–361

  64. Kahn PH Jr, Freier NG, Friedman B, Severson RL, Feldman EN (2004) Social and moral relationships with robotic others? In: 13th IEEE international workshop on robot and human interactive communication (RO-MAN 2004). IEEE, pp 545–550

  65. Kahn PH Jr, Kanda T, Ishiguro H, Gill BT, Ruckert JH, Shen S, Gary HE, Reichert AL, Freier NG, Severson RL (2012) Do people hold a humanoid robot morally accountable for the harm it causes? In: Proceedings of the seventh annual ACM/IEEE international conference on human-robot interaction. ACM, pp 33–40

  66. Melson GF, Kahn PH Jr, Beck AM, Friedman B, Roberts T, Garrett E (2005) Robots as dogs? Children’s interactions with the robotic dog AIBO and a live Australian shepherd. In: CHI’05 extended abstracts on human factors in computing systems. ACM, pp 1649–1652

  67. Melson GF, Kahn PH, Beck A, Friedman B, Roberts T, Garrett E, Gill BT (2009) Children’s behavior toward and understanding of robotic and living dogs. J Appl Dev Psychol 30(2):92–102

  68. Wolbring G (2013) Ability privilege: a lens to analyse social justice issues of humans, animals and nature: a needed addition to privilege studies. J Crit Anim Stud, in press

  69. Darling K (2012) Extending legal protection to social robots. IEEE Spectrum. Accessed 20 March 2014

  70. Darling K (2012) Extending legal rights to social robots. In: We Robot conference, University of Miami, Coral Gables

  71. Ziaja S (2011) Homewrecker 2.0: an exploration of liability for heart balm torts involving AI humanoid consorts. Lect Notes Comput Sci 7072:114–124

  72. Carpenter J, Davis JM, Erwin-Stewart N, Lee TR, Bransford JD, Vye N (2009) Gender representation and humanoid robots designed for domestic use. Int J Soc Robot 1(3):261–265

  73. Li J, Chignell M (2011) Communication of emotion in social robots through simple head and arm movements. Int J Soc Robot 3(2):125–142



This work was supported in part by a bridge funding grant to GW from the Faculty of Medicine, University of Calgary. We thank the University of Calgary for covering the Springer open access option through its open access fund.

Author information



Corresponding author

Correspondence to Gregor Wolbring.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.



Cite this article

Wolbring, G., Yumakulov, S. Social Robots: Views of Staff of a Disability Service Organization. Int J of Soc Robotics 6, 457–468 (2014).



  • Disabled people
  • People with disabilities
  • Disability service organization
  • Perception of social robots