Relating conversational expressiveness to social presence and acceptance of an assistive social robot
Exploring the relationship between social presence, conversational expressiveness, and robot acceptance, we set up an experiment with a robot in an eldercare institution, comparing a more social and a less social condition. Participants showed more expressiveness with the more social agent, and a higher score on expressiveness correlated with higher scores on social presence. Furthermore, scores on social presence correlated with scores on the intention to use the system in the near future. However, we found no correlation between conversational expressiveness and robot acceptance.
Keywords: Social presence · Technology acceptance · Social robots · Gerontechnology · Human–robot interaction
1 Introduction

In the last few years, a growing number of HRI research projects have concerned themselves with eldercare (Mynatt et al. 2000; Pollack 2005; Taggart et al. 2005). Indeed, the future of eldercare could be one of elders living independently for longer, supported by technology. Robotics could be an essential part of this, also because robots and screen agents with social abilities could function both as assistive technology and as social company (Forlizzi 2005). But will elders be willing to accept all this assistive technology, especially when it concerns interactive systems that could be perceived as autonomous and intelligent, such as robots and screen agents (Forlizzi et al. 2004)? These systems differ from other technologies because they are not always perceived as mere technology: a robot or screen agent can be (partly) perceived as a social actor, and interaction with it may follow the principles of inter-human communication rather than those of human–machine interaction. This should show in the behavior of people interacting with robots or screen agents (Bartneck and Forlizzi 2005).
Recent research with robots in an eldercare environment shows that elders can feel positive about robots (Kidd et al. 2006) and that robots can have a comforting effect comparable to the effect pets have (Beck and Katcher 1996; Beck et al. 2003; Pineau et al. 2003; Wada et al. 2003; Wada and Shibata 2006). Experiments focusing on the effects of the social behavior of robots and screen agents show that a more social or more caring condition has an effect comparable to that of humans behaving more sociably or more caringly (Heylen et al. 2002; Bickmore and Picard 2004; Heerink et al. 2006a).
The research presented here is part of a project on developing a methodology for predicting and explaining the acceptance of robots and screen agents by elderly users. It aims to enable researchers to identify the different factors that influence acceptance of robots and screen agents after a 3 min test. Our main instrument is a questionnaire with a rating scale, filled out by the subjects after their test session. Besides this, we want to explore the possibilities of user observation and develop an instrument that can be used in addition to our questionnaire. Since our questionnaire is designed to yield quantitative data, we are specifically interested in processing data on behavior that can be related to acceptance of a robot or screen agent as a conversational partner.
Earlier publications on our research (Heerink et al. 2006a, b, 2008) reported the results of experiments with the robot in eldercare institutions, which included conversational behavior analysis, but we did not relate this analysis to social presence or acceptance. In this paper, we present and discuss data from a new experiment with a robot, focusing on the development of a more thorough instrument for the analysis of conversational behavior data. The results of this analysis, obtained by observing video recordings of the 3 min sessions, will be linked to questionnaire results to explore the relationship between user behavior and both social presence and robot acceptance.
After a short review of related research, we describe the setup and instruments used; next, we present and interpret the results.
2 Robots in eldercare
Several projects have addressed the response of elderly users toward different types of robots that could serve different purposes, varying from simply being good company to physical support and giving advice. An example of a pet-like robot with no other functionality than being good company is Paro. Since 2002, a number of experiments with this seal-shaped robot have been carried out (Shibata et al. 2003; Wada et al. 2003; Wada and Shibata 2006). In early studies, it was placed in a group of elders who could interact with it, mainly by caressing and talking to it. The aim of these studies was to observe the use of a robot in a setting described as ‘robot assisted activity’ and to establish that elders felt more positive after a few sessions. This was done by measuring the moods of the participants, both with a face scale form and with the Profile of Mood States (POMS) questionnaire. More recently, research with Paro has focused on collecting physiological data on elders who have been exposed to the robot, to measure its effect on their well-being.
An example of a robot with more functionalities that was the subject of experiments in an eldercare institution is Pearl (Montemerlo et al. 2002; Pollack et al. 2002; Pineau et al. 2003). This robot was used in open-ended interactions, delivering candies and guiding elders through the building to the physiotherapy department. The experiments with Paro and Pearl both registered a high level of positive excitement on the part of the elders, suggesting that a robot would be accepted. In the case of Paro, it would mainly be valued as a pet (a study by Libin and Cohen-Mansfield (2006) shows that a robotic pet is preferred over a plush toy cat); in the case of Pearl, it would be used as an actual assistant.
A robot with advanced assistive functionalities to be applied in eldercare is the German Care-o-bot (Graf et al. 2004; Parlitz et al. 2007). It is intended to provide assistance in many ways, varying from being a walking aid to functioning as a butler.
Other projects focus on an assistive environment rather than on the development of a specific robot. An example of this is the Italian RoboCare project (Cesta and Pecora 2005, 2006), in which a robot is an interface to a smart home for older adults.
Research concerning experiments with screen agents for elders is reported by Bickmore and Picard (Bickmore and Picard 2004, 2005; Bickmore et al. 2005a, b). This research focuses on the acceptance of a relational agent (a screen agent that simulates a personal interest in the user) appearing on a computer screen and functioning as a health advisor for older adults. Findings (scores on questions related to affection, trust and acceptance) indicate that the agent was accepted by the participants as a conversational partner on health and health behavior issues, and that it rated high on trust and friendliness. It was also found to be successful as a health advisor. Other research with the same agent (Bickmore and Picard 2005) focuses on its ability to function in long-term relationships, in which social abilities also appear essential. This is linked to the notion of social presence (Lombard and Ditton 1997; Lee and Nass 2003) that people feel in interaction with systems, which can play a role in interpreting the responses of participants when they apparently perceive social abilities.
3 Social presence and conversational expressiveness
Since it is not unusual for humans to treat systems and devices as social beings (Reeves and Nass 1996), it seems likely that humans treat embodied agents as such. The extent to which they do so seems to be related to a factor often referred to as either ‘presence’ or, more specifically, ‘social presence’. Many research projects related to ours incorporate this concept (DiSalvo et al. 2002; Lee and Nass 2003; Bickmore and Schulman 2006).
The term presence originally refers to two different phenomena. First, it relates to the feeling of really being present in a virtual environment and can be defined as ‘the sense of being there’ (Witmer and Singer 1998). Second, it can relate to the feeling of being in the company of a social entity: ‘the perceptual illusion of nonmediation’ (Lombard and Ditton 1997). In our context, the second definition is relevant.
In an earlier study, we found a crucial role for social presence in the process of functional and conversational acceptance of embodied agent technology (Heerink et al. 2008). Therefore, we intend to incorporate measuring social presence when measuring acceptance of social assistive robots and screen agents.
The experience of the presence of a social entity usually shows in a higher rate and intensity of the expressions a speaker uses (Wagner and Smith 1991; Lee and Wagner 2002); it demonstrates the amount of conversational engagement one feels (Nakano and Nishida 2005). We call this conversational expressiveness: the amount and intensity of facial expressions and gestures when engaged in a conversation. We hypothesize that, also for our user group, a higher score on the construct of social presence will correlate with a higher score on conversational expressiveness. Since in earlier research (Heerink et al. 2008) we found a higher score on social presence to correlate with a higher score on acceptance (as indicated by the expressed intention to use the system), we also suspect conversational expressiveness to correlate with intention to use.
4 Robot acceptance
Defining user acceptance as “the demonstrable willingness within a user group to employ technology for the tasks it is designed to support” (Dillon 2001) brings the need to develop evaluation methodologies. Specifically for robots and screen agents, several methods have been used, varying from applying heuristics (Clarkson and Arkin 2007) or other usability-type tests (Yanco et al. 2004), classifying tests (Riek and Robinson 2008), and role-based evaluation (Scholtz 2004) to measuring physical responses (Dautenhahn and Werry 2002). Technology acceptance modeling has also been used (de Ruyter et al. 2005): a methodology that provides insight not only into the probability of acceptance of a specific technology, but also into the influences underlying acceptance tendencies.
However, technology acceptance models have not been developed for systems that can be perceived as a social entity, such as robots and screen agents, nor specifically for elderly users. Influences known to be important in the acceptance of a social entity have never been incorporated into a technology acceptance model, and neither have factors known to influence acceptance by elderly users.
Therefore, in our study, we researched the possibilities of using an acceptance model for quantitative research on acceptance of robots and screen agents by elderly users. We aimed to include specific influences representing social acceptance and the specific demands of elderly users.
The sole instrument used in technology acceptance methodology is traditionally a questionnaire with replies on a Likert scale. Relating conversational expressiveness to acceptance would add behavior analysis to the instrumentation and thus enrich acceptance methodology.
5 Experiment

By analyzing data from an experiment with elderly participants using a robot, we want to find out whether there are differences in measured conversational expressiveness between users in a more social and a less social condition. Furthermore, we want to know whether the conversational expressiveness of users can be related to their experience of social presence.
The participants were 40 elderly citizens living in an eldercare institution. Given the results of an earlier study (Heerink et al. 2007), we expected the more social condition to evoke more conversational expressiveness from the participants.
5.1 Experimental design
For the experiment, a specific interaction context was created in which the robot was used in a Wizard of Oz fashion: it was connected to a hidden operator who controlled its behavior. A Wizard of Oz setup made it possible to have a similar discourse pattern for all sessions, as all users had the same limited set of tasks. We considered this an advantage when comparing counted behavior in different sessions.
The two conditions differed as follows:
- The robot in the more social condition would gaze straight at the participant; the robot in the less social condition would look past the participant.
- The robot made mistakes, such as saying good morning in the afternoon or the other way round. When this was pointed out, the robot in the more social condition would apologize for the mistake; the robot in the less social condition would not. This feature was demonstrated to all participants in an introduction session.
- The robot in the more social condition would smile when appropriate and express cheerfulness in its facial expression; the robot in the other condition did not.
- The robot in the more social condition remembered the participant’s name and used it; the robot in the less social condition did not.
- The robot in the more social condition would support the conversation by nodding and blinking; the less social robot would not.
- The robot in the more social condition was better at turn taking, waiting until the conversation partner had finished speaking; the robot in the less social condition was less polite.
- The robot in the more social condition was more outgoing, both in facial expressiveness and in its use of voice (less monotonous, with pitch variation).
As stated, the intention was to create a more social condition, which means not all differences concern a more expressive robot.
Besides these behavioral differences, the functionalities and spoken texts were the same for both conditions.
5.2 Used robot
The iCat’s base contains two microphones to record the sounds it hears, and a loudspeaker is built in for sound and speech output. We used the iCat with a female voice, because this was the voice that the three pretest subjects felt most comfortable with.
5.3 Participants and procedure

Participants were elderly people (17 male, 23 female) between 65 and 96 years old, living in eldercare institutions in the cities of Lelystad and Loosdrecht in the Netherlands. They were divided between the two conditions as equally as possible (the more social condition featured one more male and one fewer female). They were first exposed to the robot in groups (two groups of eight participants and one group of four participants for each condition). After a short introduction by one of the researchers, the robot told them what its possibilities were: it could be used as an interface to domestic applications, for monitoring the user, companionship, providing information, keeping an agenda, and memorizing medication times and dates. They were told that for today’s experiment, the robot was only programmed to perform three tasks: setting an alarm, giving directions to the nearest supermarket, and giving the weather forecast for tomorrow. The experimenter subsequently demonstrated how to have a conversation with the robot in which it performed these tasks. After this group session, the participants were invited one by one to have a conversation with the robot, while the other group members waited in a different section of the room (separated by soundproof movable walls). The conversation was standardized as much as possible, and we asked the participants to have the robot perform the three simple tasks. Furthermore, we told them that the robot would be available for the next 5 days.
While engaged in conversation, the participants’ behavior was observed by a researcher and recorded on camera. The group session and the individual session each took about 5 min, so the maximum time spent with the robot was 10 min for each participant. To give an example, a typical conversation would start with the participant saying ‘Good morning iCat’. The robot, seemingly asleep until spoken to, would raise its head and respond with ‘Good morning, what can I do for you?’ Subsequently, the participant could ask what the iCat’s possibilities were, or go straight to a task like setting the alarm. In the latter case, iCat would ask for the time for the alarm to go off and for the sound to make at the given time (the choices were music, an alarm bell, and calling the participant’s name).
5.4 Behavior analysis methodology
Although participants were observed during the experiment, we based our analysis on observations of the videos afterward. During the analysis, non-verbal forms of conversational expressiveness were counted for each participant, such as greeting the robot, nodding or shaking the head, smiling, looking surprised or irritated (frowning), and moving toward or away from the robot. This list of items concerning conversational expressiveness was generated by listing classical feedback gestures (see Scherer 1987; Cerrato 2002; Axelrod and Hone 2005; Sidner and Lee 2005; Heylen et al. 2006) without categorizing them by specific communicative functions. The gestures are not specifically intentional or non-intentional, but they can be identified as conversational behavior.
To each counted item, the observers attributed two values: one for its strength (weight) and one for the certainty of the observer. Both could be one, two, or three points. So if an observer was sure of someone laughing very loudly, this would score two times three points (a weight of three and a certainty of three).
The observers were students who were trained to observe objectively but were unaware of the nature of the experiment. They watched the videos in which the camera was turned toward the participant, so the robot was not visible, and they were not made aware of the different conditions of the robot. We had two observers for each video and added their scores for each behavior.
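As an illustration of the scoring scheme described above (a minimal sketch, not the authors' actual tooling; the data values are hypothetical):

```python
# Sketch of the observation scoring scheme: each counted behavior gets a
# strength (weight) and an observer certainty, both rated 1-3; the item
# score is their product, and the scores of the two observers are added.

def item_score(weight: int, certainty: int) -> int:
    """Score one observed behavior as strength times certainty."""
    if not (1 <= weight <= 3 and 1 <= certainty <= 3):
        raise ValueError("weight and certainty must be 1, 2, or 3")
    return weight * certainty

def participant_score(observer_a, observer_b) -> int:
    """Add the item scores of two independent observers for one participant.

    Each argument is a list of (weight, certainty) pairs.
    """
    return (sum(item_score(w, c) for w, c in observer_a)
            + sum(item_score(w, c) for w, c in observer_b))

# An observer who is sure (certainty 3) of very loud laughing (weight 3)
# contributes 3 * 3 = 9 points for that item.
print(item_score(3, 3))  # 9
```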
5.5 Used questionnaire
Table 1: Statements used for intention to use and social presence

Intention to use:
- I’m thinking of using the robot the next few days
- I am planning to use the robot the next few days
- I am certainly going to use the robot the next few days

Social presence:
- When working with the robot, I felt like working with a real person
- I occasionally felt like the robot was actually looking at me
- I can imagine the robot as a living creature
- I often realized the robot is not a real person
- Sometimes it seemed as if the robot had real feelings
6 Results

The different types of expressive behavior shown by participants during their interaction with the robot were counted for each participant, added for each condition, and analyzed to measure conversational expressiveness. To account for inter-rater reliability, we calculated Lin’s concordance (Lin 1989, 2000), which was 0.94 on average.
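Lin's concordance correlation coefficient measures how closely two raters' scores agree along the identity line, penalizing both location and scale shifts. A minimal sketch of the sample formula, with hypothetical rating data:

```python
# Lin's concordance correlation coefficient (CCC) between two raters:
# CCC = 2*cov_xy / (var_x + var_y + (mean_x - mean_y)^2)

def lins_ccc(x, y):
    """Compute Lin's CCC between two equal-length lists of scores."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    var_x = sum((v - mean_x) ** 2 for v in x) / n
    var_y = sum((v - mean_y) ** 2 for v in y) / n
    cov_xy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y)) / n
    return 2 * cov_xy / (var_x + var_y + (mean_x - mean_y) ** 2)

# Hypothetical scores from two observers for the same participants:
rater_a = [9, 6, 4, 6, 2, 9]
rater_b = [9, 6, 3, 6, 2, 9]
print(round(lins_ccc(rater_a, rater_b), 2))
```

Perfect agreement gives a CCC of 1; unlike Pearson's r, a constant offset between raters lowers the coefficient.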
Table 2: Means and t scores on items of conversational expressiveness (items included the ‘don’t know’ gesture and suddenly moving away)
We categorized the behavior types as positive or negative (reflecting a positive or negative attitude toward the conversational partner) and looked at the total number of times a type of behavior occurred in the different conditions.
Table 3: Pearson correlation scores for constructs and categorized conversational expressiveness
Table 3 shows that there is a correlation between social presence and conversational expressiveness. There is no correlation, however, between intention to use and conversational expressiveness.
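The correlations in Table 3 are Pearson product-moment correlations; a minimal sketch of the computation, with hypothetical per-participant scores:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two score lists."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    num = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    den = sqrt(sum((a - mean_x) ** 2 for a in x)
               * sum((b - mean_y) ** 2 for b in y))
    return num / den

# Hypothetical per-participant scores: social presence (questionnaire
# average) versus counted positive conversational expressiveness.
presence = [2.0, 3.5, 4.0, 3.0, 4.5]
expressiveness = [5, 12, 14, 9, 16]
print(round(pearson_r(presence, expressiveness), 2))
```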
Table 4: t scores comparing the more and the less social condition on constructs and conversational expressiveness
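The t scores compare mean scores between the two conditions. An independent-samples t statistic with pooled variance (an illustrative sketch with hypothetical data, not the authors' analysis code) can be computed as:

```python
from math import sqrt

def t_score(group_a, group_b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    var_a = sum((v - mean_a) ** 2 for v in group_a) / (na - 1)
    var_b = sum((v - mean_b) ** 2 for v in group_b) / (nb - 1)
    pooled = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    return (mean_a - mean_b) / sqrt(pooled * (1 / na + 1 / nb))

# Hypothetical expressiveness totals per condition:
more_social = [14, 11, 16, 12, 15]
less_social = [8, 10, 7, 11, 9]
print(round(t_score(more_social, less_social), 2))
```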
7 Discussion and conclusions
There is a clear pattern of more conversational expressiveness (a higher frequency of non-verbal behaviors) among participants who were in conversation with the robot in the more social condition. This corresponds with a higher score on social presence, showing that users who experience a social entity do indeed respond to it. This may say something about the effect of what we understand as social presence on users, but although social presence corresponds with intention to use, conversational expressiveness only partly seems to be an indication of acceptance. Positive expressions may correspond with higher scores on acceptance, while an increasing number of negative expressions (head shakes, frowns, and taking distance) may indicate a lower acceptance rate.
Still, we believe this research shows that behavior observation can be an additional instrument for studies on robot acceptance. However, there are more possibilities to explore, considering both qualitative and quantitative instruments. A detailed discourse analysis, for example, could provide clues that can be related to acceptance, although a different (non-Wizard of Oz) setup would in that case be more appropriate.
Another item for further research could be the question of whether conversational expressions occurred in response to the same expressions by the robot (a smile in response to a smile, a frown in response to a frown). In that case, we would be speaking of imitative behavior. This would be an occurrence of a well-known phenomenon in psychology called the chameleon effect (Chartrand and Bargh 1999). It concerns imitative behavior between humans, which seems to occur naturally unless two people dislike each other. The occurrence of this behavior could very well be interpreted as a sign of acceptance (Kahn et al. 2006). However, during behavior analysis the observers just counted the number of behaviors, without looking at the robot behavior that evoked them; the camera was always directed toward the participant. In future research, this possibility of imitative behavior could be something to observe, also when comparing robots with different embodiments, since it could add interesting viewpoints to HRI theory on this aspect (Dautenhahn 1994; Dautenhahn and Nehaniv 2002).
Furthermore, there are factors like enjoyment, perceived ease of use, perceived usefulness, attitude, and anxiety that influence acceptance (Venkatesh et al. 2003; Heerink et al. 2008), and future results could explore their relation with conversational expressiveness and social presence. This would not only provide a more complete picture of the relationship between conversational behavior in human–robot interaction and acceptance, it would also tell us more about what enhances the sense of social presence for this particular user group.
We would like to express our gratitude to the staff and caretakers, as well as the test participants, of “De Uiterton” in Lelystad and “De Emtinckhof” in Loosdrecht for their cooperation. We would also like to thank Willem Notten, Bas Terwijn and Rogier Voors for their work on programming the system, and Rick van Midde, Albert van Breemen and Martin Saerbeck for their support.
This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
- Axelrod L, Hone K (2005) Identifying affectemes: transcribing conversational behaviour. In: Proceedings of the symposium on conversational informatics for supporting social intelligence and interaction
- Bartneck C, Forlizzi J (2005) A design-centred framework for social human–robot interaction. In: Proceedings of the IEEE international workshop on robot and human interactive communication
- Beck A, Katcher A (1996) Between pets and people. Purdue University Press, West Lafayette
- Beck A, Edwards N, Friedman B, Khan P (2003) Robotic pets and the elderly. Available online: http://www.ischool.washington.edu/robotpets/elderly/
- Bickmore T, Picard RW (2004) Towards caring machines. In: Proceedings of CHI, Vienna
- Bickmore T, Schulman D (2006) The comforting presence of relational agents. In: Proceedings of CHI
- Bickmore T, Caruso L, Clough-Gorr K (2005) Acceptance and usability of a relational agent interface by urban older adults. In: Proceedings of the ACM SIGCHI conference on human factors in computing systems
- Cerrato LA (2002) Comparison between feedback strategies in human-to-human and human–machine communication. In: Proceedings of ICSLP
- Cesta A, Pecora F (2005) The RoboCare project: intelligent systems for elder care. In: Proceedings of the AAAI fall symposium on “caring machines: AI in elder care”
- Cesta A, Pecora F (2006) Integrating intelligent systems for elder care in RoboCare. In: Proceedings of the international conference on aging, disability and independence (ICADI)
- Clarkson E, Arkin RC (2007) Applying heuristic evaluation to human–robot interaction systems. In: Proceedings of the FLAIRS conference, pp 44–49
- Dautenhahn K (1994) Trying to imitate—a step towards releasing robots from social isolation. In: Proceedings of the from perception to action conference, IEEE
- Dautenhahn K, Nehaniv CL (2002) An agent-based perspective on imitation. In: Dautenhahn K, Nehaniv CL (eds) Imitation in animals and artifacts. MIT Press, Cambridge, pp 1–40
- Dautenhahn K, Werry I (2002) A quantitative technique for analysing robot–human interactions. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, Lausanne, pp 1132–1138
- de Ruyter B, Saini P, Markopoulos P, van Breemen AJN (2005) Assessing the effects of building social intelligence in a robotic interface for the home. Interacting with Computers, special issue on the social impact of emerging technologies, 17:522–541
- Dillon A (2001) User acceptance of information technology. In: Karwowski W (ed) Encyclopedia of human factors and ergonomics. Taylor and Francis, London
- DiSalvo CF, Gemperle F, Forlizzi J, Kiesler S (2002) All robots are not created equal: the design and perception of humanoid robot heads. In: Proceedings of the conference on designing interactive systems: processes, practices, methods, and techniques
- Forlizzi J, DiSalvo C, Gemperle F (2004) Assistive robotics and an ecology of elders living independently in their homes. Human–Computer Interaction, special issue on human–robot interaction, 19:25–59
- Heerink M, Kröse B, Evers V, Wielinga B (2006a) Studying the acceptance of a robotic agent by elderly users. Int J Assist Robot Mechatron 7(3):33–43
- Heerink M, Kröse BJA, Wielinga BJ, Evers V (2006b) The influence of a robot’s social abilities on acceptance by elderly users. In: Proceedings of RO-MAN, Hertfordshire
- Heerink M, Kröse BJA, Wielinga BJ, Evers V (2007) Observing conversational expressiveness of elderly users interacting with a robot and screen agent. In: Proceedings of the international conference on rehabilitation robotics (ICORR), Noordwijk
- Heerink M, Kröse BJA, Wielinga BJ, Evers V (2008) The influence of social presence on acceptance of a companion robot by older people. J Phys Agents, special issue on human interaction with domestic robots, 2(2):33–40
- Heylen D, van Es I, Nijholt A, van Dijk B (2002) Experimenting with the gaze of a conversational agent. In: Proceedings of the international CLASS workshop on natural, intelligent and effective interaction in multimodal dialogue systems
- Heylen D, Nijholt A, Reidsma D (2006) Determining what people feel and think when interacting with humans and machines: notes on corpus collection and annotation. In: Proceedings of the 1st California conference on recent advances in engineering mechanics
- Kahn PH, Ishiguro H, Friedman B, Kanda T (2006) What is a human?—Toward psychological benchmarks in the field of human–robot interaction. In: Proceedings of RO-MAN
- Kidd CD, Taggart W, Turkle S (2006) A sociable robot to encourage social interaction among the elderly. In: Proceedings of the 2006 IEEE international conference on robotics and automation (ICRA), pp 3972–3976
- Lee KM, Nass C (2003) Designing social presence of social actors in human computer interaction. In: Proceedings of the SIGCHI conference on human factors in computing systems
- Lombard M, Ditton TB (1997) At the heart of it all: the concept of presence. J Comput-Mediat Commun 3(2). Available online: http://www.ascusc.org/jcmc/vol3/issue2/lombard.html
- Montemerlo M, Pineau J, Roy N, Thrun S, Verma V (2002) Experiences with a mobile robotic guide for the elderly. In: Proceedings of the AAAI national conference on artificial intelligence
- Mynatt ED, Essa I, Rogers W (2000) Increasing the opportunities for aging in place. In: Proceedings of the CUU 2000 conference on universal usability. ACM, New York
- Nakano YI, Nishida T (2005) Awareness of perceived world and conversational engagement by conversational agents. In: AISB 2005 symposium on conversational informatics for supporting social intelligence and interaction, England
- Pollack M (2005) Intelligent technology for an aging population: the use of AI to assist elders with cognitive impairment. AI Magazine, pp 9–24
- Pollack ME, Brown L, Colbry D, Orosz C, Peintner B, Ramakrishnan S, Engberg S, Matthews JT, Dunbar-Jacob J, McCarthy CE, Thrun S, Montemerlo M, Pineau J, Roy N (2002) Pearl: a mobile robotic assistant for the elderly. In: AAAI workshop on automation as eldercare
- Reeves B, Nass C (1996) The media equation: how people treat computers, televisions, and new media as real people and places. Cambridge University Press, New York
- Riek LD, Robinson P (2008) Robot, rabbit, or red herring? Societal acceptance as a function of classification ease. In: Proceedings of RO-MAN, Munich
- Scherer KR (1987) Toward a dynamic theory of emotion: the component process model of affective states. Geneva Stud Emotion Commun
- Scholtz JSC (2004) Toward a framework for evaluating ubiquitous computing applications. IEEE Pervasive Comput
- Shibata T, Wada K, Tanie K (2003) Statistical analysis and comparison of questionnaire results of subjective evaluations of seal robot in Japan and U.K. In: Proceedings of the 2003 IEEE international conference on robotics and automation
- Sidner CL, Lee C (2005) Engagement during dialogues with robots. In: AAAI spring symposia
- Taggart W, Turkle S, Kidd C (2005) An interactive robot in a nursing home: preliminary remarks. In: Toward social mechanisms of android science. Cognitive Science Society, Stresa, Italy, pp 56–61
- Venkatesh V, Morris MG, Davis GB, Davis FD (2003) User acceptance of information technology: toward a unified view. MIS Q 27(3):425–478
- Wada K, Shibata T (2006) Robot therapy in a care house—results of case studies. In: Proceedings of RO-MAN
- Wada K, Shibata T, Saito T, Tanie K (2003) Psychological and social effects of robot assisted activity to elderly people who stay at a health service facility for the aged. In: Proceedings of the 2003 IEEE/RSJ international conference on intelligent robots and systems, Las Vegas, Nevada, USA