Abstract
The field of robotics has grown exponentially over the years, especially its social aspect, which has enabled robots to interact with humans meaningfully. Robots are now used in many domains, such as manufacturing, healthcare, education, entertainment, and rehabilitation. Along with their widespread usage in many real-life environments, robots have been used as companions to humans. With the increased amount of research done on human–robot companionship (HRC), it is important to understand how this domain is developing, in which direction, and what the future might hold. There is also a need to understand the influencing factors and what kinds of empirical results exist in the literature. To address these questions, we conducted a systematic literature review and analyzed a final set of 134 relevant articles. The findings suggest that anthropomorphic and zoomorphic robots are more popular as human companions, while there is a lack of interest in functional and caricatured robots. Human-like and animal-like features are also implemented more often in companion robots. Studies rarely exploit the mobility available in these robots in companionship scenarios, especially in outdoor settings. In addition, co-existence and co-performance-based implementations with humans have rarely been observed. Based on the results, we propose a future research agenda that includes thematic, theoretical, methodological, and technological agendas. This study will help us understand the current state and usage of robotic companions, which will potentially aid in determining how HRC can be leveraged and integrated more seamlessly into human lives for better effectiveness.
1 Introduction
The integration of robots into our daily lives is happening at an unprecedented pace, leading to a rapid increase in human–robot interactions. While robots offer numerous functional benefits across various aspects of daily life and professional settings, the importance of social aspects in robotics is also gaining prominence in many facets of robotic interaction. Companion robots represent the forefront of social robotics, equipped with advanced capabilities that enable them to foster deep emotional connections with humans, transcending mere two-way interactions. These advanced capabilities have demonstrated their desirability and effectiveness in several contexts such as healthcare [139], wellbeing [34], gait rehabilitation [98], education [119], entertainment [77] and shopping [20], particularly among specific demographics like the elderly, children, students, and stroke survivors. As robots continue to expand their presence in our lives, their ability to serve as companions will become increasingly vital, supported by evolving visions of human–robot interaction [108].
Companion robots can be defined as robots that aim to establish emotional connections with humans and facilitate social interactions, while also assisting with specific tasks or activities, depending on the intended purpose or setting. They are capable of introducing social presence [97], empathy [95], contextual behavior [147], and co-existence [176]. Their capabilities often include responding to verbal commands, recognizing faces, interpreting emotions, and sometimes even providing physical support [7]. They have also proved to be effective social technologies in situations where people are forced to refrain from companionship with humans (e.g., during pandemics: [168]), or in contexts where people are socially isolated because of physical (e.g., aging: [3]) or mental (e.g., healthcare for depression patients: [54]) conditions. Overall, the impact of companion robots on society will likely be multifaceted, from reshaping how we use robots on a daily basis, to how we behave around them in different situations as if they were social beings [24]. It is also worth mentioning that companion robots, while they might be socially assistive [152], do not necessarily belong to the category of socially assistive robots [163]. However, the main objective of companion robots is not just to be social, but to reshape the dynamics between humans and robots by making interactions more personal and effective [142]. Additionally, unlike socially assistive robots, which focus on introducing social features into an assistive scenario [51], the value of companion robots lies in their capability to connect with humans on an emotional level. Through this connection, companion robots are able to influence interactions in an affective way, generating the feeling of companionship.
The already significant adoption of companion robots at such an early stage of development points to a future where they will likely be more common and diversified, touching the daily lives of humans in many different facets. Consequently, we need a strong understanding of their potential roles, their influences on humans, the distinct ways of designing and developing companion robots, and how we interact with them. Additionally, we need to understand the principles defining human–robot companionship to advance this field, and establish a vocabulary with which to structure informed decisions during design and development. However, previous work in the field has not addressed these concerns, and there is a lack of comprehensive knowledge in terms of how human–robot companionship (HRC) is formed, how effective HRC is in different scenarios, and how it influences human–robot interaction (HRI). Several long-term studies have attempted to investigate how HRC is created and maintained, for example, through embodiment [172], trust [105], and empathy [104]. To some extent, these studies have demonstrated how human–robot companionship might be formed, as well as the usefulness, effectiveness and suitability of robotic companions in different scenarios. However, these efforts remain scattered and the need for more holistic knowledge remains unfulfilled. Additionally, we still lack comprehensive information about the types of robots used for creating companionship, their characteristics, appearance, behavior, and their interaction modalities. As a result, in the academic community, efforts to design ideal robotic companions for specific scenarios are mostly speculative, reducing their effectiveness. Similarly, in the industrial community, the commercial development process of companion robots is not very well-informed, potentially making the products unsuccessful.
A comprehensive knowledge base offering a broad overview of what HRC is, detailing the domains in which HRC has been predominantly employed and which ones remain underexplored, which interaction modalities and robot behaviors are associated with companionship, and what types of user experiences are associated with the companionship aspects of robots, would immensely benefit both industry and academia by defining the state of the art and establishing the vocabulary and practices needed to form future directions.
To satisfy this need, this article aims to provide a comprehensive overview and structured knowledge of HRC, by addressing the aforementioned points through a systematic literature review. Additionally, we aim to identify research trends and themes to understand knowledge gaps in this field, pointing out future directions around HRC. We contribute to the field of human–robot interaction by answering the following research questions:
RQ1
What types of robots or robotic applications have been investigated in the corpus of human–robot companionship?
RQ2
What are the domains and scenarios in which human–robot companionship has been investigated?
RQ3
What are the factors that define and influence human–robot companionship in the literature?
RQ4
What kind of empirical results are in the corpus of human–robot companionship, and what are the current gaps and potential future research avenues?
The contributions of this study are threefold:
1. A comprehensive overview of the field regarding explored domains, deployment facilities, robots used, features of robots, roles of robots, interaction modalities, research methods, analysis methods, and measurement instruments.
2. Findings regarding the trends and gaps in the HRC field to date related to the above-mentioned areas.
3. Thematic, methodological, theoretical, and technological future agendas that help guide further studies in HRC.
We expect that these outcomes will be helpful for a wide array of HRC researchers from different fields by providing information regarding predominant domains, methods, interaction modalities, robot behaviors and types, and also by identifying gaps in these areas. This paper can be used both for kickstarting studies by presenting a broad understanding of the field (contribution 1), and as inspiration to conduct studies in underexplored areas signposted in the findings and agendas sections of this paper (contributions 2 and 3). Overall, all three contributions can also offer the companion robot industry a clear overview of researchers’ methods and empirical evidence relating to how companionship with robots is formed, and potential development areas for the future.
2 Background
2.1 Companionship
Companionship is the phenomenon of having someone or something as a friend or companion in different times and situations, creating a sense of fellowship in the process [7, 45]. Companionship can also be linked with social support, which contributes to psychological wellbeing [142] and helps humans lead an easier and more relaxed life [142, 143]. Companionship can vary across different scenarios of human life, such as the workplace and daily life. Previous research supports the notion that social interaction and companionship with peers can work as a supporting mechanism for reducing both work-related and day-to-day stress [26, 27]. Social support and companionship from partners and coworkers can also result in a better work-life balance, which positively impacts perceived work satisfaction [166].
The definitions and understandings of companionship leave room for non-human creatures and non-living things to become human companions as well. Animals or pets are other living agents that have been identified as mediators of supportive social activities that can result in bonding and companionship. According to Holbrook et al. [79], pets bring many positive aspects to human lives, such as opportunities to appreciate nature and wildlife, to find inspiration and learning, to be childlike and playful, to be altruistic and nurturant, to experience companionship, caring, comfort and/or calmness, to be a parent, and to strengthen bonds with other humans. A study by Dotson and Hyatt [47] focusing on dog–human companionship identified several beneficial dimensions, such as a symbiotic relationship, anthropomorphism, activity/youth, and boundaries or a willingness to adapt. According to Wells [175], companion animals have both short-term and long-term effects on both the physical and mental health of their owners. Evidence suggests that pet owners usually have lower incidences of high blood pressure and stress when in the company of a pet, and also tend to develop fewer chronic diseases compared to others [14, 58]. Companionship is therefore a very important aspect of both human and non-human lives, which has great significance in making human lives more meaningful.
2.2 Robots as Companions
Robot companions are robots capable of performing various tasks through their ability to interact physically, socially, emotionally, and safely with humans [68]. Robots have been employed as companions or collaborators for humans, especially in social interaction contexts. They have been given the role of healthcare assistants [139], companions for people with dementia [34], gait rehabilitation assistants [98], reading companions [119], exercise coaches [151], music and video listening companions [77], and shopping assistants [20]. Considering that robots are deployed in many scenarios with humans where social interaction is a priority, understanding their socio-technical aspects [17] is crucial to both the relationship, as well as the information flow.
Technology as a mode of companionship has been explored in various domains, especially in scenarios where it can aid people with various disabilities and disorders [94]. Companion robots have been used in dementia care to accompany patients in their day-to-day lives, improving their physical and mental state [94, 176]. Additionally, mobile robots have been widely used in gait rehabilitation for patients with several neurodevelopmental disorders, assisting patients in improving their walking ability as well as stability [57, 62]. Robots have also been used in education as mathematics tutors, and in assisting students to learn a new language [19, 122]. In these scenarios, social behavior and human factors help promote and create a sense of companionship.
Research studies have been conducted to understand the effects of several factors on human–robot relationships and companionship. According to these studies, empathy between humans and robots can create a positive sense of belonging [94], while social behavior [145] and collaboration [18] can play important roles in making the relationship more meaningful. Ethical issues such as vulnerability [38] and trust [111] are also mentioned in the literature as issues that need to be considered when designing for HRC.
The effect of embodiment, which refers to the physical form and presence of a robot, has also been investigated in human–robot companionship [141, 172]. These studies highlight that a robot's appearance influences the feelings humans develop towards it. Importantly, results indicate that lived experiences and feelings towards robots might differ significantly when humans interact with robots in real life, compared to seeing them through videos and pictures [104]. Trust is another fundamental factor influencing human–robot interaction on multiple levels, shaping user acceptance and fostering cooperation, collaboration, and a sense of dependency on robots [105]. Trust impacts performance by influencing task efficiency and the likelihood that users will follow robot instructions. Additionally, it contributes to the emotional connection between humans and robots, influencing overall satisfaction. Trust also influences how users perceive and recover from robot errors, affecting their openness to adopting new technologies [145]. Overall, companion robots have been developed with many qualities that can affect human lives in social, personal and spiritual ways, which indicates that the properties, features, technicalities and contexts of companion robots are highly varied, and studies have yet to define those aspects. In the presented study, apart from producing a detailed map of these aspects, we also put forth what companionship with robots can mean, and how this field can develop.
2.3 Human–Robot Co-existence
Co-existence can be considered as two entities sharing the same space at the same time [5]. Similarly, human–robot co-existence [176] can be defined as humans and robots sharing the same space at the same time. Co-existence can be both interactive and non-interactive. A human and a machine can stay in the same space without interacting, despite co-existing at that time. However, interactive co-existence might be more fulfilling for users if the interaction is meaningful [86]. Such interactions can be made meaningful if they come naturally to the involved parties and can create an exchange of ideas or emotions, taking the interaction to a specific goal [43]. Hence, HRC can be considered as a form of interactional co-existence as both parties share the same space while interacting with each other to achieve a common goal.
The idea of co-existence to achieve a common goal leads to the concept of co-performance [94], which considers intelligent and computational agents capable of learning and maintaining social practices along with humans. This can be related to daily chores, exercising, recreational activities, or having a conversation. As robots are becoming more and more common in daily life scenarios, it is essential to find the best implementation of co-existence and co-performance aspects. Several factors have been investigated to understand optimal co-existence, such as the interplay between humans and non-humans [57, 62], utilizing the capabilities of agents in collaboration scenarios [18], and the roles of different agents. Co-existence and co-performance are core concepts of companionship, as they help to conceptualize dynamic scenarios where humans and robots exist in the same space, recognizing each other’s existence, and where they co-perform to serve their purpose.
3 Method
We have conducted a systematic review of the literature on human–robot companionship (HRC). Systematic reviews can help identify, synthesize, and analyze the previous literature on a specific topic and help understand the current state of the art, which can then be used to identify possible future directions [174]. As we aim to identify key factors of HRC from the literature (including applications, domains, scenarios, and influencing factors), a systematic literature review appears to be a well-suited approach. As these types of reviews aim to gather a comprehensive set of information on the focus topic, they may also be helpful in identifying potential research gaps and future agendas in the field. The PRISMA statement [130] was followed for structuring the review process. The following subsections elaborate on the search strategy, article screening, as well as the inclusion criteria employed in the review.
3.1 Search Strategy
To obtain relevant literature, a search string was created with a view to covering all of the aspects of HRC. We were specifically interested in literature that has deployed robots as companions, although we did not limit the search to the keywords “robot” and “companion”. Because of Scopus’s extensive coverage of peer-reviewed literature, we utilized it as our primary database for finding relevant literature. The database was queried with the following search string:
TITLE-ABS-KEY (robot* AND companion*) AND (LIMIT-TO (DOCTYPE, “cp”) OR LIMIT-TO (DOCTYPE, “ar”) OR LIMIT-TO (DOCTYPE, “ch”)) AND (LIMIT-TO (LANGUAGE, “English”)).
The search scope was limited to conference papers, journal articles, and book chapters. Only literature published in the English language was considered in the process. Also, no start limit year was imposed on the database search as we aimed to cover all of the published literature available on the topic. The query was first performed on 16 July 2021 and returned 1731 documents. A second search was performed using this same query on 17 October 2023, returning 462 additional records. As a result, the total number of articles included in the initial screening was 2193.
3.2 Review Procedure
To create a clear selection process as well as a structured set of records, we set specific selection criteria (provided in Table 1).
The selection analysis consisted of several phases, and was carried out twice for the two separate searches. First, all 2193 (1731 + 462) articles were compiled into an Excel file which included information such as title, publication avenue, publication year, keywords, and abstracts. The files were checked for duplicates, and a total of 13 (12 + 1) articles were removed as a result. The resulting 2180 (1719 + 461) articles were then screened using their title and abstract, keeping the above-mentioned inclusion criteria in mind, resulting in the exclusion of 1748 (1376 + 372) articles. The remaining 432 (343 + 89) articles were included for further analysis.
We attempted to access the full texts of the 432 articles, of which 39 (23 + 16) were found to be inaccessible through online databases. We contacted the authors of these articles through email and ResearchGate, but none responded in time for them to be included in the further analysis. An additional 259 (222 + 37) articles were excluded after going through the full texts: 23 (8 + 15) of these were review or summary articles, 15 (11 + 4) were not peer-reviewed, and the remaining 221 (203 + 18) were deemed to be out of the scope of this review based on the inclusion criteria.
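As a sanity check, the screening arithmetic above can be tallied programmatically (a minimal sketch; the variable names are ours, and all numbers are taken directly from the counts reported in the text, with each pair summing the first (2021) and second (2023) search):

```python
# PRISMA-style tally of the screening counts reported above.
identified = 1731 + 462                 # records returned by the two Scopus queries
duplicates = 12 + 1                     # duplicate records removed
screened = identified - duplicates      # 2180 titles/abstracts screened
excluded_on_abstract = 1376 + 372       # excluded during title/abstract screening
full_text_sought = screened - excluded_on_abstract   # 432 full texts sought

inaccessible = 23 + 16                  # full texts that could not be obtained
reviews = 8 + 15                        # review or summary articles
not_peer_reviewed = 11 + 4              # articles without peer review
out_of_scope = 203 + 18                 # outside the inclusion criteria
excluded_on_full_text = reviews + not_peer_reviewed + out_of_scope

included = full_text_sought - inaccessible - excluded_on_full_text
assert included == 134                  # final corpus size reported in the text
```

The tally confirms that the full-text exclusion categories sum to 259 and that the final corpus contains 134 articles.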
After the selection process was concluded, we were left with a total of 134 (98 + 36) articles that were used for information extraction and analysis. The first author carried out the initial and full-text screening, while the remaining authors provided continuous discussions and guidance throughout the process. Table 2 contains a list of attributes that have been extracted from articles during the final coding and analysis phase.
The information extracted from the data based on the above-mentioned criteria was gathered through inductive coding [165], and has been reported in the following sections as the findings of this literature review. The whole process flow of searching, screening, eligibility, and inclusion is shown in Fig. 1.
4 Results
4.1 Publication Year
Figure 2 demonstrates the number of articles published each year on human–robot companionship or related topics, starting from 2004. It can be seen that from 2004 to 2010, there were some scattered efforts to explore the topic. Starting from 2011, there has been a significant increase in the number of publications. In 2020, the number of publications almost doubled compared to the previous year, which can be interpreted as an increase in interest and the possibilities around the topic. This could also have a connection with the social isolation restrictions seen all over the world during the COVID-19 pandemic, which increased the amount of research on robotic companions filling the void left by absent human companions. From 2021 onwards, the amount of literature has kept a steady pace as interest in companion robots has been sustained, although it did not approach or surpass the 2020 numbers. However, the year 2023 is likely to have more literature available than captured here, as this search was carried out on October 17, 2023.
Although recent numbers have not consistently increased from previous years, there is a clear indication that the amount of work on the topic has gradually increased over a longer period of time. One trend that can be observed (albeit not very strong, as we need more data) is that the number of publications has nearly doubled every ~ 5 years, starting from 2011. In the years 2011–2014, the number of publications ranged from 4 to 6; between 2015 and 2019, it ranged from 10 to 13; and in 2020 it increased to 20 publications. Overall, this indicates the emerging importance of the topic, and therefore a need for a comprehensive study through which to understand the state of the art and future directions.
4.2 Publication Avenue
As demonstrated in Table 3, out of the 134 finally selected publications, 52 (38.8%) were journal articles, 67 (50%) conference papers, and 15 (11.2%) book chapters.
The 134 publications belong to 78 unique venues. Popular venues were: International Conference on Human–Robot Interaction (HRI) (13 entries), Lecture Notes in Computer Science (11 entries), ACM CHI Conference (8 entries), International Conference on Robot and Human Interactive Communication (RO-MAN) (8 entries), International Journal of Social Robotics (5 entries), IEEE International Conference on Robotics and Automation (4 entries), International Conference on Intelligent Robots and Systems (IROS) (3 entries), and the Journal of the American Medical Directors Association (3 entries). The International Conference on Human-Agent Interaction (HAI), International Conference on Multimodal Interaction (ICMI), Applied Sciences (Switzerland), International Conference on Humanoid Robotics (Humanoids), Australasian Journal on Aging, and ACM Interaction Design and Children Conference (IDC) had 2 entries each. All of the other venues had 1 paper each in the corpus.
4.3 Domain Areas
The list of domain areas in which robots have been employed as companions is demonstrated in Table 4. Note that some of the studies seem to belong to multiple domains, however, the domain area categorization has been performed on the basis of the purpose of the robot. For example, if a robot uses games for the purpose of wellbeing, this robot has been categorized under wellbeing. Based on this principle, the studies were divided into domain areas. Appendix A provides a more detailed look at each study including their domain, deployment facility, used robots, robot features, participant age groups, and the interaction spans.
The most popular domain area for exploring the companionship between humans and robots is wellbeing. Studies have focused on understanding how robotic companions can make lives easier for users through psychological support [87, 140], emotional support [68, 90], stress reduction [48, 49], break-taking during long working hours [182], accompanying in isolation [99, 178] and depression management [154]. Robots have also been deployed as companions in supported living facilities to reduce depression [23].
Education is another domain that has frequently been explored in terms of robotic companions. The majority of these studies have investigated different reading activities in the presence of the robot, while the robot acts as either a motivator or mediator of users’ reading behavior [36, 44, 113, 118, 120, 181]. In other studies, robots have been given the role of peer tutors for mathematical problem-solving [16], tutors of a second language [64], and language teachers for children [81]. In addition to these, robots have been deployed as sign language tutors for children [171], as well as for children with autism spectrum disorder (ASD) [12].
There are a significant number of studies where robots have been used to promote or facilitate socialization. Robots have been studied as companions for reducing loneliness [32, 131, 138], combating social isolation [150, 169], facilitating outdoor socialization [60, 135, 136], and promoting social interaction in the elderly [46]. They have also been deployed as commensal or dining companions [91, 110].
The healthcare and disability assistance domains have been explored quite extensively. The majority of the studies in the healthcare domain focus on the effects of robotic companionship on people with dementia [3, 34, 96, 133, 167]. Domestic health assistance [67, 137] and medication adherence through companion robots have also been investigated [25]. Robotic companions have been introduced into the lives of people with disabilities, such as children with ASD [13, 114], people with neurodevelopmental disorders (NDD) [52, 61], children with anxiety [8], and down syndrome [100].
Robotic companions have been used as motivational agents in different situations, such as exercising [50, 116, 151] and jogging [53, 65]. The influence of companions has also been explored on the choice of food [59], shopping behavior [20], and persisting longer at different tasks [152].
Other areas where robotic companions have been introduced less frequently are rehabilitation, entertainment, navigation, guidance, and assistance. Robots have been used in stroke rehabilitation therapy [31, 117], gait rehabilitation [98], therapeutic support for people with dementia [72], and walking training [66]. In terms of entertainment, robots have been used as music-listening [71, 78], and video-watching companions [77]. In addition, robotic gaming companions [75, 80, 102, 103] have also been explored. There have been several studies where robots have been introduced as walking companions both indoors [132] and outdoors [149]. Navigation guidance through robots [6, 37] has been extensively explored in these studies. In the assistance area, robots have been deployed to assist humans in a socially active way in homes [161, 186], and in care facilities [115].
4.4 Types of Robots Used
A total of 89 different robots have been used to investigate companionship in the selected corpus. Here, they have been categorized based on the categorization proposed by Fong et al. [55], which divides robots into 4 main categories based on their appearance and visual features: anthropomorphic (human-like), zoomorphic (animal-like), functional (machine-like), and caricatured (object-like). Table 5 specifies the robots used, as well as their mobility. Almost half of the robots used were anthropomorphic, and a significant number were zoomorphic. This indicates that human- or animal-like robots are thought to be more suitable for companionship. The data show that a substantial portion of the anthropomorphic robots and most of the functional robots are mobile, while zoomorphic and caricatured robots are mostly stationary. Some robots were selected for studies more predominantly than others—for example, the Nao robot has been used in 12 studies, Paro in 10 studies, and Joy for All animal robots in 9 studies. However, the use of 89 different robots suggests a notable variety in the research domain.
Out of the 89 different robots, 38 were mobile robots and the other 51 did not have any mobility. 22 of 46 (47.8%) anthropomorphic robots were mobile. On the other hand, only 4 out of 20 zoomorphic robots have mobility. According to these numbers, mobility has been a more important factor for anthropomorphic robots compared to zoomorphic robots. For functional robots, the majority (11 out of 14) are mobile as their mobility is one of the more important functionalities for them to become companions, given that these types of robots have been used mostly in rehabilitation scenarios to help users move or walk. Finally, caricatured robots by definition look like real-world objects, and as objects do not move by themselves, most of these robots (7 out of 8) do not either.
4.5 Robots Used in Different Domains
We have divided the use of companion robots into 10 different domains (see Table 4), featuring 89 robots. Figure 3 demonstrates the usage of different types of robots in each domain, which offers an idea about current deployment trends and possible future directions.
For wellbeing scenarios, the use of zoomorphic robots (15 out of 33 instances) is more common than other types. Anthropomorphic robots have also been used frequently in wellbeing scenarios (9 out of 33 instances). Functional, and caricatured robots have been used with less frequency. Zoomorphic robots used in wellbeing scenarios are huggable or designed to be used on the lap such as Joy for All animal robots and Paro. These robots were found to have positive effects on the wellbeing of the subjects, especially the elderly. This can be justified based on previous research that animal companions (i.e., pets) can have a positive impact on the daily lives of humans, and for the same reason, zoomorphic robots are popular in wellbeing scenarios.
In education scenarios, anthropomorphic robots have been used most frequently (17 out of 24 instances). The use of zoomorphic, functional, and caricatured robots is also seen, but in rare circumstances. In general, educators or helpers in the education context are expected to be able to help with studying, give instructions when needed, and answer questions to some extent. This could be a reason for deploying anthropomorphic robots in education scenarios. A similar trend can be seen in socialization, where anthropomorphic robots are used more (9 out of 17 instances). However, zoomorphic robots are also popular in such scenarios (6 out of 17 instances). A few functional and caricatured robots have been deployed as conversation starters and prompters in socialization.
Both anthropomorphic and zoomorphic robots have been deployed in similar numbers in healthcare scenarios, but functional and caricatured robots have not been seen in such scenarios thus far. In disability assistance scenarios, anthropomorphic robots have been used more than the other types (7 out of 12 instances); however, zoomorphic and functional robots are not rare. There is a similar trend to be seen in the motivation and influence domain, as anthropomorphic robots (5 out of 10 instances) are more frequently used, but the other types can also be seen to some extent. For rehabilitation, anthropomorphic (4 out of 9 instances) and zoomorphic robots (3 out of 9 instances) are more commonly used. There is also one instance each of functional and caricatured robots being used in rehabilitation. For entertainment purposes, the majority of the implementations have adopted anthropomorphic robots (5), while zoomorphic (2) and caricatured (2) robots have also been used. However, functional robots have not been used in such scenarios. Among the small number of instances of navigation and guidance in general, anthropomorphic robots seem to be the popular choice (4 out of 5 instances). A functional robot was used in the other instance, meaning zoomorphic and caricatured robots have not been considered for the navigation and guidance domain. A similar trend can be seen in the assistance domain, where anthropomorphic (2 out of 3 instances) and functional robots (1 out of 3 instances) have been used.
To summarize, zoomorphic robots were most commonly used in wellbeing scenarios, while anthropomorphic robots were the most popular type of robot in every other domain. Especially in the education domain, anthropomorphic robots were utilized more than all three other types of robots combined. Notably, the healthcare domain did not utilize functional or caricatured robots at all, while zoomorphic and caricatured robots were not used in any navigation or assistance scenarios. As the number of functional and caricatured robots in general is significantly lower in HRC, it is understandable that some domains might not utilize them at all.
4.6 Roles of Robots
Companion robots in each domain area have been given different roles based on their purpose. The different roles of robots in different domains have been listed in Table 6.
In the related domains of wellbeing and healthcare, the co-existence or presence of robots with the users appears to be the most important role. For the studies in these two domains, robots have primarily been put into the same rooms as the subjects to observe their response to and acceptance of the robots. It has also been observed how robots can shape people’s daily lives and influence them to have a better quality of life. In wellbeing scenarios, another important role of robots has been as social isolation companions. These implementations were mostly observed during and after the COVID-19 pandemic (years 2020–2022). In the first search performed in 2021, 16 out of 98 (16.3%) articles were wellbeing-related; however, 17 out of 36 (47.2%) of the new articles derived from the second search in 2023 were in the wellbeing domain. Relatedly, robots have also been deployed as healthcare companions, equipped with functionalities like medication adherence, mood detection, fall detection, activity suggestions, and communication facilitation. A similar caregiver role has also been adopted, and another prominent role in wellbeing scenarios is that of robots as pets. This can be linked to the needs of people who do not have others around, especially the elderly, whose wellbeing can be improved through companionship. Other roles in the wellbeing domain include conversational companion, persuasive agent, gaming companion, psychology coach, huggable companion, and empathic companion.
In the education domain, the roles of companion robots have mainly revolved around reading, learning, and tutoring. The most common roles are learning companion and reading companion, along with tutoring roles such as language tutor, sign language tutor, mathematical logic tutor, and peer tutor. One other role that has been seen is that of a playing companion for preschool children, where the robot facilitates learning through play.
The most commonly applied role in the socialization domain is that of a companion against loneliness. In this role, the robots engage in meaningful interactions with the users by suggesting things to do, providing updates about the weather, reminding them to do specific tasks, and playing games with them. In most cases, they are deployed as social companions. There are some instances where robots have been deployed as outdoor companions, walking side by side with the user and making their presence felt to create a social situation.
In the disability assistance domain, robots have been given many different roles such as daily life companions, conversational companions, communication companions, social companions, and learning and playing companions. All of these roles are mainly aimed at being a companion in daily tasks, helping the user stay active, having conversations, facilitating communication with family members, and overall being a social agent in day-to-day life. One other disability-specific role has been reported: distracting children with autism when they are performing repetitive behaviors. In disability assistance scenarios, the difference between companion robots and assistive robots might be confusing. However, assistive robots are not necessarily companion robots, especially as not all assistive robots have social features and connections with their users [155, 156]. Companion robots, on the other hand, have features that can evoke emotional and social responses from their users while also being assistive. For this reason, many companion robots might appear to be assistive and be introduced in assistive scenarios, but their impact is much different in terms of social and emotional connection. While some of these roles appear assistive in nature and fall under the domain of disability assistance, the studies mentioned here deployed the robots as companions that accompany people with different disabilities in various tasks, not only as assistants.
Companion robots have been employed to motivate and influence people into performing specific tasks. The most common scenarios of motivation are exercising and jogging where robots are designed to help the users keep up with exercise routines through rewarding interactions. For jogging scenarios, companion robots have been deployed to stay near the user and give them a feeling of having company. Other roles in this domain include commensality and shopping.
For rehabilitation scenarios, robots have been used to provide therapeutic support as well as to facilitate walking after a stroke. There are a number of instances where robots have been given the role of entertainer or entertainment facilitator, playing the roles of gaming companions, music-listening, TV-watching and video-watching companions, as well as playing companions in digital games. Robots have also been developed as navigation companions where they work as walking and navigation guides.
4.7 Deployment Facilities
Robotic companions have been deployed in different domain areas for different purposes, which has led to them being used in different scenarios and facilities. Table 7 shows the deployment facilities where robots have been used as companions. Companion robots have been used most frequently in home or domestic facilities: out of 113 reported facilities in the selected corpus, 42 were home environments. Aged living/care facilities such as residential care facilities, long-term care facilities, rehabilitation centers, and elderly care facilities have also been engaged in companion robot deployment. Educational institutions are among the favored facilities, for example schools, university campuses, libraries, and dorm rooms. The strong inclination towards using companion robots for healthcare and wellbeing purposes explains the high number of care facilities and healthcare centers seen in the literature, while the educational facilities indicate an increasing interest and focus on research with robots in education. Among other facilities, there are some outdoor field implementations as well as some occupational facilities.
4.8 Robot Features
Table 8 demonstrates which features and functionalities have been implemented in robots, and with what frequency, to create human–robot companionship. We have divided the features into the categories of verbal, non-verbal, expressive, personalization, functional, navigation, and other features. Voice, sound, and other verbal communication functionalities have been used very frequently. In terms of non-verbal features, the movement of body parts has been the most used feature. Expressive features, consisting of emotional expressions through animated faces and eyes, were also among the common ways of communication. Verbal, non-verbal and expressive features are the most commonly used communication modes for humans, and they are understandably also used in companion robots, as humans might find these features familiar and thus easy to interact with.
Personalization features have also been explored, such as emotion recognition [67], song suggestions [77, 78], and person recognition and tracking [3, 89, 170] which allow the robot to provide personalized interaction and output to the user. However, these have not been extensively explored. Touchscreen interaction as a functional feature has been explored significantly. Although this requires a touch-enabled display or tablet, this mode of interaction with robots is easy to implement and easy to use. As half of the robots are mobile, navigation, path planning and following are quite common features of these robots. Lastly, interactive games and memory assistance are other features that have been added to robots.
4.9 Domain-Wise Features
Mapping the features of companion robots to different domains is important because it helps us to understand which features are most useful in which domains, giving a clear idea of the state of the art and of potentially unexplored domain-feature combinations. The different features used in different domains are shown in Table 9. The most adopted feature in the wellbeing domain is the movement of body parts, which is also popular in most other domains. A reason for its popularity could be that it helps to create a more human-like approach to interaction. Humans naturally move different parts of their body while interacting, such as their hands, head, and eyes, and the designs of companion robots seem to conform to this convention. Sound and haptic feedback are two other popular features that have been employed in wellbeing scenarios. For education scenarios, the movement of body parts has again been frequently employed. Apart from that, touchscreen display, voice, and facial expression are other popular features in this domain. Displays help to show lesson content, and touch interactivity makes the feedback process easier. The voice and facial expression features help the user to understand feedback and know how they are progressing.
For socialization, features like navigation, movement of body parts, voice-based conversation, and people tracking are more often used. These features indicate the use of robots in social situations where the robot could walk up to people, recognize them, use gestures by moving different body parts, and have voice-based conversations with them. In healthcare scenarios, there are several popular features, such as sound, the movement of body parts, voice communication, haptic feedback, display, and navigation, indicating that there are different types of healthcare applications. Again, in disability assistance scenarios, the movement of body parts, voice, and navigation are the more commonly used features. Similarly, the motivation and influence scenarios have most commonly adopted the same movement of body parts and voice-based features. In other scenarios such as rehabilitation and entertainment, voice-based communication, navigation, and the movement of body parts are frequently used.
4.10 Interaction Modalities
Different interaction modalities and techniques have been tested on different robotic companions for creating meaningful interactions between humans and robots. Figure 4 presents all of the interaction modalities used in the selected corpus, divided between input and output modalities. For both input and output, voice is the most popular modality [16, 169, 181], while different sounds have also been used quite frequently [68, 92]. Touchscreen displays are another commonly used modality that belongs to both the input and output categories. A reason for this could be that connecting a touch display to a robot is easier than implementing complex features [96, 115, 145], and it also gives the user a well-defined interaction method that is self-explanatory and easy to use. Notably, instead of building actual robot body parts or faces, many implementations use displays to show a digitally created face and visual features, which might be because building physical robot body parts is more complex than creating a digital avatar, or at least a digital face. Also, the communication between humans and robots seems to be limited to giving commands and receiving some kind of related feedback. This contradicts the notion of companionship, since the robots have the potential to do more than simply obey commands, and to become companions rather than servants that are only there for service functions.
Some of the robots take video data as input, which allows for deeper analysis and better personalization features, such as emotion or person recognition, and improved actuation based on these features [3, 89, 170]. Touching and handling the robot have also been introduced in different ways, such as through hugging, shaking, patting, squeezing, pushing, and snuggling [34, 110, 131], mimicking the ways humans usually interact with animal companions.
Similar to the input modalities, voice, display and sound are largely popular as output modalities. Apart from these, different types of gestures and the movement of body parts are very popular. These types of output modalities usually try to imitate human and animal characteristics. The use of LEDs has been frequent to show emotions as well as expressions. Animal-like output modes can also be seen, such as purring, simulated limping, frowning, and tail flipping [42, 67, 131].
4.11 Research Method
Out of the 134 articles reviewed, 39 employed quantitative research methods, 68 employed qualitative research methods, 19 were mixed-method studies with both qualitative and quantitative approaches, and 8 were design studies (Table 10). Among the 68 qualitative studies, ethnographic studies (n = 18) were the most common type, trying to understand the social and behavioral aspects of HRC. The second most common type was the field study or trial (n = 15). Such studies were conducted as trials or first field implementations of a designed application of robot companions, and 4 of them were pilot studies. Similarly, prototype and concept evaluation studies were also frequent (n = 8), as many of the studies were in their initial stages. Case studies (n = 9), focus group studies (n = 4), and phenomenological studies (n = 4) also featured among the research methods employed. The types of research methods indicate that many HRC-related studies are in their initial stages, with more in-depth studies in the works. Accordingly, there have also been 8 design studies where design implications and guidelines are laid out for the future development of HRC.
4.12 Variables
Two major types of independent variables are robot-related variables (where either the robot or the robotic application has been varied) and human-related variables (where human abilities, presence, and other factors have been varied). The robot-related independent variables are divided into categories like assistance, navigation and movement, behavior, roles and types, features, and presence. Table 11 demonstrates the independent variables used in the experiments. For studies in the assistance category, workout assistance and working assistance were used as variables to understand how effectively robotic assistants can function in these scenarios compared to alternatives. For the navigation of robots, the degree and types of obstacles and the types of distractions were used as variables to see how the robots react and how the accompanying human feels about that reaction. For movement, movement directions and different reactions were used as variables through manipulation, as well as through musical beats.
Different types of robot behaviors have been addressed in experiments, such as movement behavior, interactive behavior, and emotional behavior. For example, the robots would either stop if there were obstacles, or they would use obstacle avoidance in such situations. In addition to their movement, they would also communicate with surrounding people. As variable interactions, robots would act as being playful, randomly empathic, adaptively empathic, neutral, or serious. In some studies, the robots were designed to either express emotion or not.
Roles of robots have been used as variables to deploy them as assistants, psychology coaches, guides, compatriots, and peers, in order to understand their usage scenarios and effectiveness. Different types of robots have also been compared, such as robotic dogs and toy dogs. Robot features such as communication styles, appearance, lifelikeness, memory capabilities, and socialization have been used as variables to understand which features work better and how. However, the most common variable was the presence or existence of the robot itself through control groups.
The dependent variables indicate the things that have been measured in order to understand how they affect human–robot companionship or how they help to create and maintain it. Table 12 lists all of the dependent variables in different categories. We have divided the dependent variables into 3 main categories: robot-related, human-related, and interaction-related. Robots’ performance, intelligence, sociability, likeability, attractiveness, acceptability, and usability are the robot-related variables that have been measured. The human-related dependent variables are divided into behavior-related, mental perception-related, and wellbeing-related categories. Behavior-related dependent variables are reading behavior, readiness to change, intention to exercise, learning behavior, and persistence. Mental perception-related variables are perceived companionship, comfort, stress, trust, safety, intimacy, anxiety, and exertion. Two wellbeing-related variables were seen: psychological wellbeing and quality of life. Pleasantness, flow, engagement, enjoyment, eeriness, and rapport have been categorized as interaction-related variables because they mainly measure the quality of, or the perceived feeling created through, the interactions.
4.13 Measurement Instruments
For measuring different dependent variables, a total of 74 scales and questionnaires have been used in the literature. They are divided into 6 categories and are listed in Table 13. For measuring usability and acceptance-related variables, which can also be seen as robot-related variables, 16 scales were used. Among them, the Godspeed questionnaire (7), the Unified Theory of Acceptance and Use of Technology (UTAUT) (3), the Working Alliance Inventory-Short Revised (WAI-SR) (3), and the Almere Model (3) have been used most frequently. For measuring different mental states, emotions, and overall mental perceptions, 26 different scales have been used. Among them, the UCLA Loneliness Scale (4), the Geriatric Depression Scale (GDS-15) (3), the PHQ-9 (3), the Borg rating scale (2), the Geriatric Depression Scale (GDS) (2), the Brief Mood Introspection Scale (BMIS) (2), and the Physical Activity Enjoyment Scale (2) have been used more than once. For measuring quality of life and wellbeing-related variables, Ryff’s Psychological Wellbeing Scale (RPWS) (3), the Quality of Life in Alzheimer’s Disease (QoL-AD) scale (2), the FACES Pain Rating Scale (2), and the SF-12 scale (2) have been used multiple times. The Mini-IPIP (International Personality Item Pool) scale (3), the Cohen-Mansfield Agitation Inventory (2), the State–Trait Anxiety Inventory (STAI) (2), the Robot Attitude Scale (2), and the Medication Adherence Report Scale (MARS) (2) seem to be popular scales for measuring attitude and behavior-related variables. A small number of other scales measure interest and competence, but these are not commonly used.
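Most of the instruments above are Likert-style questionnaires whose totals combine regular and reverse-scored items. As a minimal, hedged illustration of how such scoring generally works (the item layout and numbers below are hypothetical, not the official scoring key of any of the scales named above):

```python
# Generic Likert-scale scoring sketch. The item layout is hypothetical and
# does not reproduce any specific published instrument.
def score_scale(responses, reverse_items, scale_min=1, scale_max=4):
    """Sum item responses, reverse-scoring the indicated items.

    responses: dict mapping item number -> response (scale_min..scale_max)
    reverse_items: set of item numbers whose scoring is inverted
    """
    total = 0
    for item, value in responses.items():
        if not scale_min <= value <= scale_max:
            raise ValueError(f"item {item}: response {value} out of range")
        if item in reverse_items:
            value = scale_max + scale_min - value  # e.g. 1<->4, 2<->3
        total += value
    return total

# Example: a hypothetical 5-item questionnaire with items 2 and 4 reversed.
answers = {1: 3, 2: 1, 3: 4, 4: 2, 5: 2}
print(score_scale(answers, reverse_items={2, 4}))  # -> 16
```

Reverse-scored items guard against acquiescence bias; the inversion `scale_max + scale_min - value` is the standard transformation for any symmetric response range.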
4.14 Analysis Methods
The data gathered through the studies has been analyzed in two main ways: qualitative and quantitative. Table 14 lists all of the data analysis methods, divided into qualitative and quantitative categories. Two types of quantitative analysis are seen in the literature: descriptive statistics have been used in 18 studies, while the others have adopted inferential statistics. The most commonly used inferential statistical methods are ANOVA (19), paired-sample t-tests (12), pairwise comparisons with the Bonferroni adjustment (6), the Mann–Whitney U test (6), the Wilcoxon signed-rank test (5), the Kruskal–Wallis test (5), and the Friedman test (4). Among qualitative analysis methods, thematic analysis (39) is the most commonly used, along with qualitative content analysis.
The distribution of the analysis methods employed in the HRC literature shows that thematic analysis is the most frequently used method. Inferential statistics are also popular, although not as popular as qualitative methods. This indicates that a significant proportion of the studies carried out around HRC so far are qualitative and explorative in nature. These types of studies usually involve speculative methods, trial and error, and phenomenological research, which require in-depth and explorative analysis. HRC is a developing field, as can be seen from the year-wise publication numbers (see Sect. 4.1), and explorative studies are to be expected while the domain and its proper purpose are still being understood. On the other hand, there are quantitative studies which, while fewer than the qualitative ones, also make up a significant portion of the corpus. These quantitative studies include research that aimed to compare HRC to other types of HRI or to a control condition in order to understand the actual effects of HRC. This also supports the fact that HRC lacks systematic and established quantitative measurement instruments; our paper can contribute to the development of such instruments, since we have extracted the different aspects of companionship seen across studies and formed a comprehensive list.
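To make the descriptive/inferential distinction concrete, the sketch below computes group means (descriptive) and a Mann–Whitney U statistic (inferential, one of the non-parametric tests listed above) for two sets of ratings. The data and variable names are invented for illustration, not taken from any reviewed study.

```python
# Descriptive vs. inferential statistics on hypothetical companionship
# ratings (robot-present group vs. control group). Illustration data only.
import statistics

def mann_whitney_u(a, b):
    """U statistic for sample a vs. b: count of pairs where a_i > b_j,
    with ties counted as 0.5 (the brute-force definition of U)."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

robot_group   = [6.1, 5.8, 6.4, 5.9, 6.7, 6.0, 6.3]
control_group = [5.2, 5.5, 4.9, 5.6, 5.1, 5.4, 5.0]

# Descriptive step: summarize each condition.
print("means:", round(statistics.mean(robot_group), 2),
      round(statistics.mean(control_group), 2))  # -> means: 6.17 5.24

# Inferential step: compare the groups via the U statistic (in practice, a
# p-value would then be computed from U or read from a table).
print("U =", mann_whitney_u(robot_group, control_group))  # -> U = 49.0
```

In practice a library such as `scipy.stats.mannwhitneyu` would be used to obtain the p-value as well; the pure-Python version above only shows what the statistic counts.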
5 Findings
We obtained the results shown in the previous section by coding the 134 articles. From these results, we then derived more specific findings (F1–F10) and implications through qualitative content analysis [82]. In this section, we answer the proposed research questions through these findings.
5.1 Frequency of Different Robots Used (F1)
From the robot types, features, and interaction modalities, it is evident that human- and animal-like robots and features are adopted by the majority. Many robots have been designed to imitate human- or animal-like behaviors, features, and interactions, such as verbal and non-verbal behaviors. Also, animal-specific interaction modes such as touching, hugging, and patting are seen quite frequently, similar to the interactions between humans and their pet animals. This suggests that anthropomorphic and zoomorphic robots are more frequently used as companions, and that the current knowledge trend considers human- and animal-like appearances to be more suitable for non-human companions. Although there are some works where functional or object-like robots have been introduced [74, 91], they are infrequent. An investigation of how functional-looking and caricatured robots can create a sense of companionship is needed, examining which features and behaviors apart from appearance affect the companionship between humans and robots; this calls for an increased amount of research on features and behavior regarding the companionship of robots. F1 answers RQ1 (What types of robots or robotic applications have been investigated in the corpus of human–robot companionship?).
5.2 Human- and Animal-like Features (F2)
According to our analysis, anthropomorphic and zoomorphic appearances have been used more frequently, which indicates that HRC developers choose appearances familiar to human users rather than innovating novel robotic expressions. Human-like and social features have been seen in all of the robots used in companionship scenarios, supporting this interpretation. Additionally, all the natural interaction modalities used or recognized by humans, such as voice, touch, and sound, were used quite frequently. This supports the fact that novel robotic expressions are underexplored, and there might be more to uncover when it comes to designing, measuring, and analyzing interactions between humans and robots supported by features that are less likely to exist in living companions. F2 partially answers RQ1 (What types of robots or robotic applications have been investigated in the corpus of human–robot companionship?).
5.3 Indoor versus Outdoor Interaction (F3)
Interaction methods that are common and self-explanatory to humans such as display, sound and voice commands have been used most frequently in the literature. These are some of the easier and less error-prone interaction techniques available, and thus making use of them ensures more seamless communication between humans and robots. However, a difference in interaction modes can be seen for different deployment facilities, especially indoors and outdoors. As mentioned as a limitation, the outdoor implementation of companion robots has been very limited, and most of the interaction modalities mentioned in this study have been applicable to indoor settings. For outdoor settings, the robots are less autonomous as of yet, and thus need to be controlled through wireless controllers via commands. While the robots used in the HRC literature have common features such as voice and sound, they are not utilized much in outdoor scenarios as the purposes in outdoor settings were different. Robots in outdoor settings were deployed mostly for navigation-related purposes [6, 37, 65, 135], and they lack autonomous features. This can be seen as a limitation of the current technology, and it would not be unreasonable to expect better and more natural interactions in the future with outdoor mobile robots. F3 partially answers RQ2 (What are the scenarios in which human–robot companionship has been investigated?) and RQ4 (What are the current gaps and potential future research avenues in human–robot companionship?).
5.4 Limitations in the Explored Domains (F4)
This study was conducted to gather comprehensive knowledge on robots that have been used or employed as human companions in different scenarios. The data suggests that robots have been deployed in many domains such as wellbeing, education, healthcare, assistance, and socialization. This denotes the variety of scenarios where robots can be used as companions. One trend that can be observed is that, so far, robots have been used as companions mostly in wellbeing, healthcare, or similar domains. In an age where many households already have autonomous robots (e.g., robot vacuum cleaners) [10, 128, 164], more exploration is needed on how robotic companions interact with humans in other contexts, for example indoors, outdoors, in day-to-day companionship, and in nature exploration. A minor difference can be seen in the types of robots deployed in different scenarios. For example, functional robots are used more in physical activity and outdoor settings [53, 65, 117], zoomorphic robots are more frequently used in elderly care settings [83, 138], and anthropomorphic robots are used more in social situations [31, 115, 145]. Also, the deployment facilities indicate that robots are mostly used as companions in domestic or indoor settings. This could be because robotic technology is still error-prone, making it risky to take robots outdoors while keeping them safe and effective [146, 157]. Another shortcoming of the technology might be that mobile robots still cannot navigate unusual and uneven terrain, making them unsuitable for outdoor implementations [21]. This is also supported by the fact that half of the robots used in the literature are mobile but have rarely been used outdoors. F4 partially answers RQ2 (What are the domains in which human–robot companionship has been investigated?) and RQ4 (What are the current gaps and potential future research avenues in human–robot companionship?).
5.5 Definition of Human–Robot Companionship (HRC) (F5)
The definition of human–robot companionship is very unclear from the current corpus. As a result, there is much confusion in the literature about when, how, and in which scenario a robot can be considered a companion. It is also unclear what characteristics turn a robot into a companion. So, what is human–robot companionship? An answer to this question might lie in the variables measured in the literature for companionship scenarios. In order to create companionship, robot-related, human-related, and interaction-related factors are all influential.
The robot-related factors mainly consist of robots’ abilities to create a positive impression on humans, as well as their usability and acceptability to humans. The robot needs to be efficient in performing what it is supposed to do, and it needs to apply intelligence while doing so. The robot also needs to be appealing, likable, and have social features in order to make sure users can relate to them, creating a bond in the process.
Human-related factors can be behavior-related, mental perception-related, and wellbeing-related. The robot is expected to be able to influence human behavior in order to improve it. For example, the robot should be able to influence humans’ behavior in exercising, learning, and in social conduct. Through interaction with companion robots, humans should feel comfortable, less stressed, and less anxious. They should also be able to trust the robots and feel safe while in contact with them. Other mental perceptions, such as depression, pain and loneliness should be positively influenced by a companion robot. All of the above-mentioned factors may lead to a better quality of life through psychological wellbeing.
In terms of the interactions between humans and robots, they should be pleasant, engaging, and enjoyable. Being able to create pleasantness in the interaction can lead to the creation of a bond or fellowship that can influence human lives for the better. According to the literature, these factors can create a sense of companionship. These are factors that should be considered when designing robotic companions in the future. It would also be important to investigate and compare how these factors can improve the process of designing robotic companions. Based on the evidence in the literature, HRC can be defined as the bond or fellowship created through:
-
The robot’s intelligence, usability, and ability to influence human behavior
-
The robot’s ability to influence the mental perception of humans that leads to wellbeing and a better quality of life
-
Pleasant, enjoyable and engaging interactions in daily life
F5 partially answers RQ3 (What are the factors that define and influence human–robot companionship in the literature?).
5.6 Roles of Robots (F6)
Robots in the HRC literature have been given diverse roles across the 10 domains discussed in this paper. More often than not, each role has been associated with specific tasks, and the robots’ success as companions depends on their ability to complete those tasks. While the role of a companion can differ between domains and situations, companionship cannot be created through assistance or task completion alone. Ensuring effective co-existence is important when it comes to human–robot interaction [148]. From the analysis in this study, we can see that the roles of robots vary in the form of different services. There are instances [32, 34, 106, 133, 140] where the robots do not provide any specific service and just co-exist, providing humans with a social context. However, service-related roles are also part of this co-existence, where the robots have specific tasks to do. What shapes this co-existence into companionship is the social elements of the interaction and the perceived mental state of the users. In the literature around HRC, co-existence has generally been explored on a very small scale, and service-type interactions have been given more priority. While the service provided is important, ensuring meaningful co-existence is imperative for creating human–robot companionship. F6 partially answers RQ4 (What kind of empirical results are there in the corpus of human–robot companionship?).
5.7 Lack of Co-existence and Co-performance (F7)
Natural interaction methods and scenarios of co-existence have been relatively underexplored. Here, co-existence means the natural process in which two entities occupy the same place and acknowledge each other’s existence. There is also a lack of more complex and thoughtfully programmed interactions such as context-aware conversations, voice and facial recognition, and mood recognition. Moreover, the interaction modalities used indicate that they are heavily oriented towards communication in which humans give commands and robots provide feedback. However, the concept of companionship might call for interactions that facilitate co-performance [80], rather than positioning robots as entities that obey commands. This fact also puts a question mark over the perceived meaning of companionship with robots. From the purpose of the reviewed studies, it is at times hard to infer whether the robots are meant to be companions or just another order-following agent that humans can control. F7 partially answers RQ4 (What kind of empirical results are there in the corpus of human–robot companionship, and what are the current gaps and potential future research avenues?).
5.8 Theoretical and Methodological Limitations (F8)
In some cases, it is unclear what makes a robot a companion in the literature. There are several works where robots have been declared companion robots but did not exhibit companion-like behavior. They have been deployed as assistants or caregivers in such scenarios, being given specific tasks with no social interaction. Such implementations can be seen to some extent in industrial environments [40, 73] and teaching [180], where task-based efficacy is considered more important than being a companion. This indicates a lack of a clear theoretical and contextual background for HRC. It is also evident from the corpus that the definition of HRC is not clear. The definition of HRC we provided in F5 can help resolve the confusion in future studies as to whether they are studying companionship or not.
Additionally, there is a clear absence of appropriate frameworks for designing robotic companions. Despite the existence of a number of recently created frameworks [11, 15] for developing social robots, the use of such frameworks has not been documented in the literature under consideration. All of the studies have either used an available commercial robot, or have developed a specific feature to test in companionship scenarios. Since very specific goals have been the focus of every study that has sought to construct robotic companions, the inherent features and companionship elements have all been developed with that specific goal in mind. In addition to the lack of a framework, there is no dedicated scale or measurement instrument to measure HRC. Human-related, robot-related, and interaction-related measures can therefore be seen, but they cannot be directly linked with companionship.
F8 partially answers RQ4 (What are the current gaps and potential future research avenues in human–robot companionship?).
5.9 Nobody Cares About the Robots (F9)
Specific measurements and analyses in the literature put the emphasis on human perceptions of robots: how humans feel after interacting with the robots, whether they enjoy it, whether they feel safe, and whether they can trust the robot. Although robots are non-living entities and thus do not have human-like feelings, when it comes to companionship, both parties need to be considered. The current corpus only focuses on understanding the interactions from a human perspective, which is obviously an important element in human-centered design and perhaps the only element we can appropriately study. However, none of the studies have approached these interaction designs from a robot’s perspective, or from the perspective of evaluating the different benefit relationships between humans and robots as suggested by human–machine symbiosis work [63]. Thinking from a robot’s perspective might help us to understand how humans naturally behave around robots, as well as what type of interaction would introduce a feeling of agency between humans and robots. Different levels of agency and the detailed consideration of machines in smart, autonomous, and connected systems have been important issues for more-than-human design and human–machine integration [41, 123]. In our case, it can be equally important to understand the underlying themes of a two-way interaction if companionship is the primary aspect of a study. The literature on HRC does not yet touch upon this perspective. F9 partially answers RQ4 (What kind of empirical results are there in the corpus of human–robot companionship?).
5.10 Companion Robots in Extraordinary or Unusual Social Circumstances (F10)
The year 2020 has been the most prolific year in terms of the number of papers published on and around HRC (see Sect. 4.1). This was the same year the global COVID-19 pandemic hit the world and sent everyone into social isolation. During the years 2019–2021, people were stuck at home, working from home, and isolating at home, either to avoid becoming infected with COVID-19 or to avoid spreading it. This led to the majority of the earth’s population spending a lot of time alone, which had severe social and psychological consequences. It also affected the adoption of, and research on, robots in social isolation scenarios. Robots were introduced as companions for older adults in social isolation [99], companion pets in depression and loneliness [54], and companions during lockdown [169, 178]. Although Thunberg and Ziemke [168] have suggested that caregiver presence was more important for older adults in care homes during the pandemic, companion robots have still been seen as one of the possible technological solutions in such scenarios. This was not the first pandemic for humanity, nor will it be the last [129], and with the expected ecological crisis [126] or with conflicting political situations between countries, societally extreme conditions can lead to situations of enforced social isolation, which might render the companionship value of robots even more important. Therefore, it can be concluded that companion robots have been one of the avenues of interest explored during extraordinary and unusual social circumstances such as the global pandemic, and this might also be the case in similar future circumstances. F10 partially answers RQ2 (What are the scenarios in which human–robot companionship has been investigated?).
6 Future Agenda
Based on the findings of our research, we have identified four broad types of agenda, namely thematic, theoretical, methodological, and technological. Each agenda consists of several directions, which are listed in Table 15 together with their connection to the findings presented previously.
6.1 Thematic Agenda: Next Phases of HRC
6.1.1 Diversity of Explored Domains and Scenarios (A1)
Direction 1. Investigate how companionship can be created outdoors and how the interaction with robotic companions differs between indoors and outdoors. The majority of implementations in the HRC domain have been in indoor settings. The analysis suggests that only 4 out of 113 facilities where companion robots were deployed were outdoor settings. However, there can be numerous applications for outdoor companion robots, a few of which have been explored in the literature, such as a walking [66] or jogging [65, 53] companion. While these studies have presented the concept and tried to reflect on its potential, more in-depth studies are needed to understand the domain and find effective design implications for such companion robots. There are several instances where companion robots have been employed in the entertainment domain in indoor facilities, which indicates a possible gap: the outdoor recreational use of such robots remains relatively unexplored. Additionally, the recreational value of outdoor activities such as spending time in nature or taking a walk outdoors is well known [1]. Social interaction and companions are an important part of this, because there is always inherent value in social connection [28], shared experiences [9, 186], and feeling secure against the uncertainties (e.g., wildlife, unknown paths) of exploring unknown outdoor territory.
Direction 2. Understand the domain-specific needs and characteristics of companion robots to help improve their effectiveness and efficiency. Thus far, robots have been deployed to perform very specific tasks in certain domains, and have been limited to those tasks. In terms of domain, wellbeing and education have seen the majority of implementations. However, other domains such as socialization, healthcare, influence, and entertainment show a lot of promise in terms of positive outcomes. There also seems to be a lack of knowledge on which features and types of robots are suitable for each domain. This points to a need to diversify the domains and purposes of companion robots, in order to make them more effective and efficient in diverse contexts.
6.1.2 Daily Companionship and Co-existence (A2)
Direction 3. Understand which roles are suitable for companion robots in day-to-day life. The current state of the research shows that the deployment of companion robots has been mostly explorative, and that several roles have overlapped between multiple domains. Also, the roles of companion robots have so far been decided based on the tasks they are supposed to complete, somewhat ignoring their intangible contributions such as agency, social presence, and psychological support. This is the usual nature of a developing field, and with more time and exploration, more concrete directions can be found. Introducing day-to-day companionship has the potential to uncover contexts and situations where robots accompany humans in their daily lives, much like other humans in a family. When a robot is deployed for a specific task in a specific scenario, the interaction becomes more reliant on task completion and the robot’s ability to do it efficiently. In contrast, having robots present in daily life scenarios will streamline their usage as companions. This will potentially help create a new perspective on having robots around, not just to perform specific tasks as a service, but to accompany humans in different activities. Instead of framing robots as servants that follow orders, the concept of companionship might anticipate interactions that facilitate co-performance.
Direction 4. Study how co-existence and co-performance affect the interaction between humans and robots in different situations. Co-existence is considered a vital element in the creation of companionship, as it represents two entities sharing the same space at the same time without the need to engage in direct interaction at all times [5, 176]. Although the function of a companion may vary depending on the context and domain, companionship cannot be established solely through help or task performance. It is crucial to ensure effective co-existence when it comes to interactions between humans and robots. Thus, it will be very important to explore scenarios where robots simply co-exist with humans, providing them with a social context while offering no particular service. Robots will only be fully perceived as companions if and when they are considered co-existing beings rather than entities deployed for specific tasks.
6.1.3 Diversity of Robot Types (A3)
Direction 5. Understand the effectiveness of robotic companions with different appearances in diverse scenarios. A total of 89 different robots were used in different companionship scenarios, each with a unique appearance. HRC is a very dynamic field, and each separate situation demands contextual decisions, making the choice of robots somewhat unpredictable. However, it is often unclear why a certain type of robot with a certain appearance is chosen for a companionship scenario. There is a lack of justification around the choice of robots, which also supports previous claims that the majority of the work done in the domain so far has been explorative. While explorative work expands the field and helps draw novel conclusions, it would still be important to have general guidelines and justifications for choosing different types of robots. This is especially needed by developers aiming their products at certain customers, and by researchers from other domains who need to decide which robots to adopt for their specific purposes.
Direction 6. Explore how perceived companionship is related to the appearance and type of robots. Analysis of the types of robots used in different domains implies that anthropomorphic and zoomorphic robots are more frequently utilized as companions, and that the present knowledge trend favors non-human companions with human- and animal-like appearances. Robots that are functional or resemble objects have been used in several works; however, this is not very common. Several previous studies have focused on understanding how the embodiment of robots might affect companionship [141, 172], and have found that embodiment has a positive influence on perceived companionship and agency. However, these findings relate to embodiment in general, while the reviewed studies only focus on specific types of embodiment, such as anthropomorphic or zoomorphic. To obtain a deeper understanding of how different types of embodiment and appearances of robots influence companionship, comparative studies need to be performed. These will also reveal the reasons behind the popularity of anthropomorphic and zoomorphic robots, as well as the shortcomings or advantages of using other types.
6.1.4 Effects of Robot Features (A4)
Direction 7. Understand how different features and their combinations affect the sense of HRC, in order to evaluate their relative effectiveness. Eighteen different types of high-level features have been explored in the extant HRC research. The most common features are the movement of body parts, voice/sound, gestures, and expressions. It is interesting to note that these features were not designed specifically for companion robots; rather, robots which happened to have these features were chosen as companions. This indicates a focus on choosing a type of robot instead of investigating how a specific feature of the robot affects HRC, and it creates a need to investigate different features and their combinations in different scenarios. Focusing on specific features, and even combinations of features, will play an important role in understanding how each feature can make interactions more meaningful, and how different combinations can have varied consequences. Additionally, it will help robot developers understand these effects and so create more efficient robots for specific scenarios.
Direction 8. Reflect on features other than human-like ones to create diversity in development and prevent uncanny interactions. Human-like features are commonly seen in the investigation of HRC and the development of companion robots. Humans can obviously relate to these features more deeply, creating an avenue for robots to be perceived as more human-like or ‘known’. However, whenever human-like behaviors are developed artificially, there is a chance of creating what is known as ‘the uncanny valley effect’ [173]: the unsettling or eerie feeling that people experience when interacting with something that attempts to imitate human qualities but fails to be realistic. This effect can occur when robots are given human-like features that are implemented poorly enough to appear uncanny or eerie. This does not mean that human-like features should be avoided, but they should be developed with caution, so that machines like robots do not try to replace humans but instead become alternatives with their own identity. Features that represent machines, such as robotic movements or voices, can draw this boundary and set proper expectations.
6.1.5 Building from a Robot’s Perspective (A5)
Direction 9. Explore designs from the robot’s perspective, in addition to creating distinct identities for robots, in order to better understand the inherent dynamics of HRC. The rapidly increasing number of robots introduced in scenarios involving humans indicates a future where robots will play an important role in society. Robots are now more frequently developed with artificial intelligence, which is supposed to make them more suitable for human companionship through meaningful interaction. We are bracing ourselves to welcome a posthuman or transhuman future [56, 85], where human–machine symbiosis will become a very important factor as humans and machines become dependent on each other to perform tasks [84]. Robots have been touted to become parts of humans, or even the other way around. Despite the fact that robots are non-living objects without emotions, both parties must be taken into account when looking at companionship. Companionship is a two-way street where both parties play certain roles and achieve their goals through co-performance [94]. In that regard, it would be crucial to understand the interactions that robots respond to and, if they were to have feelings, how those would change depending on how they were engaged with. Thinking and building from a robot’s perspective could also be fruitful for finding the right balance in the seamless integration of robots into human lives.
6.1.6 Designing for Extraordinary or Unusual Social Situations (A6)
Direction 10. Conduct speculative and exploratory studies with companion robots considering extraordinary and unforeseen social situations. The recent COVID-19 pandemic led people to become socially isolated inside their homes, and saw several instances of companion robot deployment [54, 144, 178] for reducing loneliness and related stress. This is a novel, emergent, and practical real-life scenario in which companion robots have been deployed effectively. In line with this finding, it would be beneficial to test the deployment of companion robots in future situations where another socially unusual scenario might occur. For this, an investigation into which unusual social situations might occur, and how we could design HRC to address them, might be of great value. This will help us prepare for such situations by already having some insight into how companion robots might help, instead of starting from scratch. It can also be linked to understanding the roles that robotic companions can take up in the future.
6.2 Theoretical Agenda: Defining HRC
6.2.1 Understanding the Definition of Companionship (A7)
Direction 11. Understand the determinants of how and when a robot becomes a companion to humans, in order to clarify the definition of HRC. Overall, the study provides a comprehensive set of information on robots that have been used as companions. However, the types of interactions suggest a commanding-and-obeying relationship between humans and robots, which contradicts companionship and thus needs to be defined more carefully. It is very important to define the roles of a robot considered as a companion, as command-following machines cannot be autonomous, whereas companions are supposed to be. Rather, the idea of companionship means facilitating interactions that create co-performance [94] scenarios, instead of positioning robots as mere command-following entities.
In some cases, it is unclear what makes a robot a companion. Several studies have employed similar robots in similar scenarios as assistive or social robots without touting them as companions. In such literature, robots have been considered assistants with no other interaction or social behavior, or have focused only on specific tasks. Such implementations are found to some extent in industrial environments [40, 73] and teaching [180], where task-based efficacy is considered more important than being a companion. Also, some of the studies employing the same type of robot for different purposes have been inconsistent as to whether they consider it a companion or not. This means that companionship is more related to the types of interactions with the robot than to the robot’s appearance or features. All of this indicates that a clearer view is needed of what a companion robot is, and in which scenarios. In particular, there needs to be a clear theoretical and contextual background for HRC studies, which will help us understand when and how companionship is formed. This leads us to the next theoretical agenda, which is the definition of HRC. Currently, there are no established definitions of HRC in the literature, which can lead to confusion about whether robots are companions or not. Defining or understanding how HRC is formed can help resolve any confusion in future studies as to whether they are studying companionship or not, and our study provides the first effort to define what HRC is (in F5), while opening a space for further definitions based on empirical findings.
6.3 Methodological Agenda: Designing and Evaluating the Evolutions of Companionship Between Humans and Robots
6.3.1 Measurement of HRC (A8)
Direction 12. Develop instruments for measuring and evaluating HRC in different scenarios. In the whole corpus of HRC, there is no dedicated scale or measurement instrument that may be used to measure companionship. Each study has approached the measurement of HRC differently, in most cases investigating specific factors of companionship, which we categorized as human-related, robot-related, and interaction-related. This makes sense, as HRC is a very diverse field where each study differs from the others in terms of motivation, scenario, and expected outcomes. It also makes it hard to evaluate companionship factors comprehensively. It is clear that different factors combine to form some sort of evaluation of the perceived companionship. All of the measurement instruments used in the corpus have specific purposes, such as usability and acceptance, mental states and emotions, quality of life and wellbeing, attitude and behavior, and prior or situational interest and competence. Established questionnaires have been used for these different measurements. While this is suitable and makes sense for such a diverse field, it might be beneficial to understand how these factors combine to form companionship. Consequently, an investigation into the measurement of HRC combining all of these factors might be of good value.
6.3.2 Guidelines for Designing and Evaluating Robotic Companions (A9)
Direction 13. Create comprehensive guidelines and design implications for robotic companions in different domains/scenarios. There is an evident lack of frameworks for designing robotic companions. Although there are several frameworks [11, 15] for designing social robots, there is no framework for designing companion robots, and hence no use of frameworks is documented in the considered literature. All of the studies that have attempted to design robotic companions have targeted very specific purposes, and thus the features and companionship factors have been designed according to that specific purpose. This is logical in the sense that each unique scenario demands different considerations, and it is hard to replicate guidelines from studies that belong to a different domain. As a result, in some cases, researchers have created customized user studies or surveys to understand design implications, but these have been employed on very small scales and thus cannot be considered sufficiently credible for future reference. While it is not feasible to develop a generalized guideline for designing companion robots due to the diversity of scenarios, it might be useful to prepare guidelines at least for specific domains where studies usually follow similar patterns.
6.4 Technological Agenda: Designing and Developing the New Era of Robotic Companions
6.4.1 Efficacy and Availability of Robotics Technology (A10)
Direction 14. Develop more robust and less error-prone robotic systems and make them accessible for deployment in diverse facilities, both indoors and outdoors. Robotic technology has improved a great deal in recent years; however, technological shortcomings are still evident in the literature. As mentioned in the analysis, the majority of implementations have been indoors. Outdoor adaptations are seen less frequently, mostly because the technology is not ready. Robots are still error-prone, especially those that are mobile or have navigation capabilities. Outdoor environments do not have consistently even surfaces, and expecting robots to navigate uneven terrain makes them even more error-prone than they already are. This is another reason for the lack of diversity in the deployment domains of companion robots. As a result, there is a need to improve the currently available robotic technology. The use of artificial intelligence and advanced data manipulation techniques can help robots become less error-prone, especially in outdoor settings. The Boston Dynamics Spot robot dog (Spot®—The Agile Mobile Robot | Boston Dynamics, 2022) is a prime example of such advancement. Although its long-term sustainability is yet to be tested, it is a step in the right direction.
Direction 15. Develop more affordable technology so that it is accessible to the general population. The fact that these technologies are very new and sophisticated makes it harder to make them widely affordable. In particular, very few people can afford a companion robot for personal use. This lack of affordability and accessibility prevents people from enjoying the benefits of robotic companionship. The resulting lack of adoption also creates a knowledge gap regarding the mass adoption of companion robots and their long-term effects. We need to develop affordable and cost-efficient robots so that we can learn to deliver ideal companionship experiences by learning from a large user base. Robots are technologies for the future, and if we are to reap the benefits of such cutting-edge technology, it is imperative to make them more easily accessible to the general population.
7 Limitations
A large number of studies have deployed robots in different human interaction scenarios; however, we were specifically interested in companionship scenarios for this study. For this reason, we only reviewed articles that used, or at least mentioned, robots as companions. We searched for literature that explicitly mentioned companionship with robots. This might have limited the span of the study, and the fact that HRC is often confused with other human–robot interaction scenarios has not helped. Even in the literature that we reviewed, there were often scenarios where confusion emerged about how a robot becomes a companion to humans. Thus, as a result of the search criteria we set, we may have missed some relevant literature.
As this study is a systematic literature review, the main objective was to understand what has been done so far in the domain of HRC, in terms of the robots used, influential aspects of the interactions, and research and analysis methods. While we have been able to summarize these factors and at the same time identify the gaps in the literature, it was not possible to derive guidelines, for example, based on hands-on practice. However, we have pointed out several future agendas and directions, each of which can be turned into standalone investigations that might produce specific guidelines. Following on from this, we expect and welcome the research community to attempt to cover these directions, so as to create a more comprehensive knowledge of the domain.
Another limitation of the study could be the fact that we have only searched for literature through the Scopus database. Querying multiple databases could have resulted in more literature. However, we chose the Scopus database because of its extensive coverage of peer-reviewed literature. Additionally, systematic literature reviews have the limitation of being restricted by the search criteria they employ. Our search string was created to provide access to all studies whose main focus is on robotic companionship in a way that either the abstract, the title, or the keywords include the words “robot*” and “companion*”. We acknowledge that we might have missed out on some literature because of the limitations of the systematic literature review process itself. In addition, we were not able to access all of the eligible articles as some of them were not open access.
8 Conclusion
This study reports a systematic review of 134 peer-reviewed articles to understand how robots have been used as human companions, as well as their usage scenarios, features, and interaction modalities. No previous studies have tried to capture the state of the art of human–robot companionship (HRC), including domains, robots used, deployment facilities, robot roles, features, research methods, analysis methods, and future avenues. In addition, there was a lack of comprehensive understanding of what HRC entails, and how it is developed, influenced, and leveraged; several studies have attempted to investigate only standalone factors such as embodiment, empathy, and trust. Our biggest contribution is to address this through the presented systematic literature review. This study has addressed the aforementioned gaps and contributed knowledge to the field in three concrete ways. We have presented a comprehensive overview of the field including explored domains, deployment facilities, robots used, features of robots, roles of robots, interaction modalities, research methods, analysis methods, and measurement instruments. We expect that these analyses will be helpful for a wide array of HRC researchers from different fields by providing information regarding existing predominant domains, methods, interaction modalities, robot behaviors and types, and also by identifying gaps in these areas. HRC researchers can use this paper as a starting point for designing studies, having a broad understanding of the field in terms of what has already been done and where knowledge gaps need to be addressed. The study can also serve as inspiration to conduct studies in the underexplored areas signposted in the findings and agendas sections of this paper, for example, diversifying the domains and understanding the effectiveness of robot features in different domains.
Furthermore, these contributions can also be helpful for the companion robot industry by way of giving a clear overview of the available companion robots, their features, and usage scenarios. This will help designers and developers better understand the scope of demand, and to develop more suitable robotic technologies as companions.
A major finding from the study is that robots with human-like and animal-like features have been used more frequently for companionship scenarios. We found that robots have been deployed as companions in many different scenarios, with wellbeing, education, socialization, and disability assistance being some of the prominent areas of deployment. The roles and features of companion robots have also varied in different domains. Similar to human and animal-like features, human and animal-like interaction modalities are often seen in the literature with companion robots.
In terms of research methods, a balanced mix of qualitative and quantitative studies has been carried out, and a significant number of mixed-method and design studies have also been reported. Study variables have revolved around both human-related and robot-related factors, indicating that both sides of the interaction have been taken into account for understanding companionship. We also compiled lists of the measurement instruments and analysis methods seen in the literature. The number of different instruments is particularly high, which results from attempts to measure a large variety of variables rather than companionship specifically. This also indicates that no dedicated measurement instrument for HRC is currently available.
The major findings of the review include a lack of outdoor deployment, the lack of a concrete understanding of HRC, and a limited variety of usage scenarios involving co-existence and co-performance. We conclude our analysis of the corpus with four broad agendas: thematic, theoretical, methodological, and technological. The thematic agenda includes diversifying the explored domains and robot types, introducing co-existence and co-performance, and understanding the effects of different features in different scenarios. The theoretical agenda focuses on building a better understanding of HRC and the factors that influence it, and on expanding the definition proposed in this paper with further empirical studies. The methodological agenda includes the creation of a framework for designing and evaluating companion robots, as well as a dedicated measurement instrument for HRC. Finally, the technological agenda calls for better availability and efficiency of companion-type robots in the future.
References
Aasetre J, Gundersen V (2012) Outdoor recreation research: different approaches, different values? Norsk Geografisk Tidsskrift - Norwegian J Geogr 66(4):193–203. https://doi.org/10.1080/00291951.2012.707987
Ab Aziz A, Ghanimi HMA (2020) Reading with robots: a personalized robot-based learning companion for solving cognitively demanding tasks. Int J Adv Sci Eng Inf Technol 10(4):1489
Abdollahi H, Mollahosseini A, Lane JT, Mahoor MH (2017) A pilot study on using an intelligent life-like robot as a companion for elderly individuals with dementia and depression. In: 2017 IEEE-RAS 17th international conference on humanoid robotics (Humanoids), 541–546. https://doi.org/10.1109/HUMANOIDS.2017.8246925
Abendschein B, Edwards A, Edwards C (2022) Novelty experience in prolonged interaction: a qualitative study of socially-isolated college students’ in-home use of a robot companion animal. Front Robot AI 9:733078. https://doi.org/10.3389/frobt.2022.733078
Abu-Nimer M (2001) Reconciliation, justice, and coexistence: theory and practice. Lexington Books, Maryland
Acosta Calderon CA, Mohan RE, Zhou C (2008) Robotic companion lead the way! In: 2008 10th international conference on control, automation, robotics and vision, 1504–1510. https://doi.org/10.1109/ICARCV.2008.4795747
Ahmed E, Buruk O, Hamari J (2022) Robots as human companions: a review. In: PACIS 2022 proceedings. https://aisel.aisnet.org/pacis2022/246
Arnold L (2016) EmobieTM: A robot companion for children with anxiety. In: 2016 11th ACM/IEEe international conference on human-robot interaction (HRI), 413–414. https://doi.org/10.1109/HRI.2016.7451782
Arts I, Fischer A, Duckett D, van der Wal R (2021) The Instagrammable outdoors – investigating the sharing of nature experiences through visual social media. People Nature 3(6):1244–1256. https://doi.org/10.1002/pan3.10239
Asafa TB, Afonja TM, Olaniyan EA, Alade HO (2018) Development of a vacuum cleaner robot. Alex Eng J 57(4):2911–2920. https://doi.org/10.1016/j.aej.2018.07.005
Axelsson M, Oliveira R, Racca M, Kyrki V (2021) Social robot co-design canvases: a participatory design framework. ACM Trans Human–Robot Interact 11(1):1–39. https://doi.org/10.1145/3472225
Axelsson M, Racca M, Weir D, Kyrki V (2019) A participatory design process of a robotic tutor of assistive sign language for children with autism. In: 2019 28th IEEE international conference on robot and human interactive communication (RO-MAN), 1–8. https://doi.org/10.1109/RO-MAN46459.2019.8956309
Bakracheva M, Chivarov N, Ivanov A (2020) Companion robotic assistants for improving the quality of life of people with disabilities. Int Conf Autom Inform (ICAI) 2020:1–6. https://doi.org/10.1109/ICAI50593.2020.9311320
Barker SB, Wolen AR (2008) The benefits of human-companion animal interaction: a review. J Vet Med Educ 35(4):487–495. https://doi.org/10.3138/jvme.35.4.487
Bartneck C, Forlizzi J (2004) A design-centred framework for social human-robot interaction. RO-MAN 2004. In: 13th IEEE international workshop on robot and human interactive communication (IEEE Catalog No.04TH8759), 591–594. https://doi.org/10.1109/ROMAN.2004.1374827
Bautista AN, Gerardo JR, Lallave H, Luigi P, Ong E, Cu J, Lapinid MR, Limjap A (2020) Towards the design of a robot peer-tutor to help children learn math problem-solving
Baxter G, Sommerville I (2011) Socio-technical systems: from design methods to systems engineering. Interact Comput 23(1):4–17. https://doi.org/10.1016/j.intcom.2010.07.003
Bellamy R, Andrist S, Bickmore T, Churchill E, Erickson T (2017) Human-agent collaboration: can an agent be a partner? 1289–1294. https://doi.org/10.1145/3027063.3051138
Belpaeme T, Kennedy J, Ramachandran A, Scassellati B, Tanaka F (2018) Social robots for education: a review. Sci Robot. https://doi.org/10.1126/scirobotics.aat5954
Bertacchini F, Bilotta E, Pantano P (2017) Shopping with a robotic companion. Comput Hum Behav 77:382–395. https://doi.org/10.1016/j.chb.2017.02.064
Biswal P, Mohanty PK (2021) Development of quadruped walking robots: a review. Ain Shams Eng J 12(2):2017–2031. https://doi.org/10.1016/j.asej.2020.11.005
Biswas M, Murray JC (2015) Towards an imperfect robot for long-term companionship: case studies using cognitive biases. IEEE/RSJ Int Conf Intell Robots Syst (IROS) 2015:5978–5983. https://doi.org/10.1109/IROS.2015.7354228
Bradwell HL, Winnington R, Thill S, Jones RB (2020) Longitudinal diary data: six months real-world implementation of affordable companion robots for older people in supported living. In: companion of the 2020 ACM/IEEE international conference on human-robot interaction, 148–150. https://doi.org/10.1145/3371382.3378256
Briggs G, Scheutz M (2014) How robots can affect human behavior: investigating the effects of robotic displays of protest and distress. Int J of Soc Robotics 6(3):343–355. https://doi.org/10.1007/s12369-014-0235-1
Broadbent E, Peri K, Kerse N, Jayawardena C, Kuo Ih, Datta C, MacDonald B (2014) Robots in older people’s homes to improve medication adherence and quality of life: a randomised cross-over trial. In: Beetz M, Johnston B, Williams M-A (eds) Social robotics. Springer International Publishing, Berlin, pp 64–73
Buunk BP, Peeters MCW (1994) Stress at work, social support and companionship: towards an event-contingent recording approach. Work Stress 8(2):177–190. https://doi.org/10.1080/02678379408259988
Buunk BP, Verhoeven K (1991) Companionship and support at work: a microanalysis of the stress-reducing features of social interaction. Basic Appl Soc Psychol 12(3):243–258. https://doi.org/10.1207/s15324834basp1203_1
Cacioppo JT, Patrick W (2008) Loneliness: human nature and the need for social connection. W. W. Norton & Company, New York
Cagiltay B, White NT, Ibtasar R, Mutlu B, Michaelis J (2022) Understanding factors that shape children’s long-term engagement with an in-home learning companion robot. Interaction design and children, 362–373. https://doi.org/10.1145/3501712.3529747
Caruana N, Moffat R, Miguel-Blanco A, Cross ES (2023) Perceptions of intelligence & sentience shape children’s interactions with robot reading companions. Sci Rep 13(1):7341. https://doi.org/10.1038/s41598-023-32104-7
Casas JA, Céspedes N, Cifuentes CA, Gutierrez LF, Rincón-Roncancio M, Múnera M (2019) Expectation versus reality: attitudes towards a socially assistive robot in cardiac rehabilitation. Appl Sci 9(21):4651
Casey D, Barrett E, Kovacic T, Sancarlo D, Ricciardi F, Murphy K, Koumpis A, Santorelli A, Gallagher N, Whelan S (2020) The perceptions of people with dementia and key stakeholders regarding the use and impact of the social robot MARIO. Int J Environ Res Public Health 17(22):E8621. https://doi.org/10.3390/ijerph17228621
Chau JM, Hilario J, Lopez E, Bejar E, Ramirez J, Moran A (2021) BINBOT: a low-cost robot companion that teaches basic concepts through binary questions. In: 2021 7th international conference on control, automation and robotics (ICCAR), 93–96. https://doi.org/10.1109/ICCAR52225.2021.9463449
Chen K, Lou VW, Tan KC, Wai M, Chan L (2020) Effects of a humanoid companion robot on dementia symptoms and caregiver distress for residents in long-term care. J Am Med Dir Assoc 21(11):1724-1728.e3. https://doi.org/10.1016/j.jamda.2020.05.036
Chen Y-C, Yeh S-L, Lin W, Yueh H-P, Fu L-C (2023) The effects of social presence and familiarity on children-robot interactions. Sensors 23(9):4231. https://doi.org/10.3390/s23094231
Chu J, Zhao G, Li Y, Fu Z, Zhu W, Song L (2019) Design and implementation of education companion robot for primary education. In: 2019 IEEE 5th international conference on computer and communications (ICCC), 1327–1331. https://doi.org/10.1109/ICCC47050.2019.9064253
Clotet E, Martínez D, Moreno J, Tresanchez M, Pallejà T, Font D, Teixidó M, Palacín J (2014) Outdoor robotic companion based on a google androidTM smartphone and GPS guidance. In: Omatu S, Bersini H, Corchado JM, Rodríguez S, Pawlewski P, Bucciarelli E (eds) Distributed computing and artificial intelligence, 11th international conference. Springer International Publishing, Berlin, pp 433–440
Coeckelbergh M (2011) Artificial companions: empathy and vulnerability mirroring in human–robot relations. Studies in ethics, law, and technology, 4(3). https://doi.org/10.2202/1941-6008.1126
Coghlan S, Waycott J, Lazar A, Barbosa Neves B (2021) Dignity, autonomy, and style of company: dimensions older adults consider for robot companions. Proc ACM Human–Comput Interact 5(CSCW1):1–25. https://doi.org/10.1145/3449178
Corteville B, Aertbelien E, Bruyninckx H, De Schutter J, Van Brussel H (2007) Human-inspired robot assistant for fast point-to-point movements. In: Proceedings 2007 IEEE international conference on robotics and automation, 3639–3644. https://doi.org/10.1109/ROBOT.2007.364036
Coulton P, Lindley JG (2019) More-than human centred design: considering other things. Des J 22(4):463–481. https://doi.org/10.1080/14606925.2019.1614320
de Graaf MMA, Allouch SB (2017) The influence of prior expectations of a robot’s lifelikeness on users’ intentions to treat a zoomorphic robot as a companion. Int J Soc Robot 9(1):17–32. https://doi.org/10.1007/s12369-016-0340-4
de Medeiros WG (2014) Meaningful interaction with products. Des Issues 30(3):16–28
Degiorgi M, Garzotto F, Gelsomini M, Leonardi G, Penati S, Ramuzat N, Silvestri J, Clasadonte F, Kinoe Y (2017) Puffy—an inflatable robotic companion for pre-schoolers. In: 2017 26th IEEE international symposium on robot and human interactive communication (RO-MAN), 35–41. https://doi.org/10.1109/ROMAN.2017.8172277
Dix A, Finlay J, Abowd GD, Beale R (2004) Human-computer interaction. Pearson Education, London
Döring N, Richter K, Gross H-M, Schröter C, Müller S, Volkhardt M, Scheidig A, Debes K (2016) Robotic companions for older people: a case study in the wild. Stud Health Technol Inform 219:147–152. https://doi.org/10.3233/978-1-61499-595-1-147
Dotson MJ, Hyatt EM (2008) Understanding dog–human companionship. J Bus Res 61(5):457–466. https://doi.org/10.1016/j.jbusres.2007.07.019
Edwards A, Edwards C, Abendschein B, Espinosa J, Scherger J, Vander Meer P (2020) Using robot animal companions in the academic library to mitigate student stress. Library Hi Tech. https://doi.org/10.1108/LHT-07-2020-0148
Engler S, Hunter J, Binsted K, Leung H (2018) Robotic companions for long term isolation space missions. In: 2018 15th international conference on ubiquitous robots (UR), 424–430. https://doi.org/10.1109/URAI.2018.8441838
Fasola J, Mataric MJ (2012) Using socially assistive human-robot interaction to motivate physical exercise for older adults. Proc IEEE 100(8):2512–2526. https://doi.org/10.1109/JPROC.2012.2200539
Feil-Seifer D, Matarić MJ (2011) Socially assistive robotics. IEEE Robot Autom Mag 18(1):24–31. https://doi.org/10.1109/MRA.2010.940150
Fisicaro D, Garzotto F, Gelsomini M, Pozzi F (2019) ELE - a conversational social robot for persons with neuro-developmental disorders. In: Lamas D, Loizides F, Nacke L, Petrie H, Winckler M, Zaphiris P (eds) Human-computer interaction – interact 2019. Springer International Publishing, Berlin, pp 134–152
Floyd Mueller F, Muirhead M (2015) Jogging with a quadcopter. In: proceedings of the 33rd annual ACM conference on human factors in computing systems, 2023–2032. https://doi.org/10.1145/2702123.2702472
Fogelson DM, Rutledge C, Zimbro KS (2022) The impact of robotic companion pets on depression and loneliness for older adults with dementia during the COVID-19 pandemic. J Holist Nurs 40(4):397–409. https://doi.org/10.1177/08980101211064605
Fong T, Nourbakhsh I, Dautenhahn K (2003) A survey of socially interactive robots. Robot Auton Syst 42:143–166. https://doi.org/10.1016/S0921-8890(02)00372-X
Forlano L (2017) Posthumanism and design. She Ji: J Des Econ Innov 3(1):16–29. https://doi.org/10.1016/j.sheji.2017.08.001
Forlizzi J, DiSalvo C (2006) Service robots in the domestic environment: a study of the roomba vacuum in the home. 258–265. https://doi.org/10.1145/1121241.1121286
Friedmann E, Son H (2009) The human-companion animal bond: how humans benefit. Vet Clin North Am: Small Anim Pr 39(2):293–326. https://doi.org/10.1016/j.cvsm.2008.10.015
Gallagher CP, Niewiadomski R, Bruijnes M, Huisman G, Mancini M (2020) Eating with an artificial commensal companion. In: companion publication of the 2020 international conference on multimodal interaction, 312–316. https://doi.org/10.1145/3395035.3425648
Garrell A, Sanfeliu A (2012) Cooperative social robots to accompany groups of people. Int J Robot Res 31(13):1675–1701. https://doi.org/10.1177/0278364912459278
Garzotto F, Gelsomini M, Kinoe Y (2017) Puffy: a mobile inflatable interactive companion for children with neurodevelopmental disorder (p. 492). https://doi.org/10.1007/978-3-319-67684-5_29
Gaver W, Sengers P, Kerridge T, Kaye J, Bowers J (2007) Enhancing ubiquitous computing with user interpretation: field testing the home health horoscope. In: proceedings of the SIGCHI conference on human factors in computing systems, 537–546. https://doi.org/10.1145/1240624.1240711
Gerber A, Derckx P, Döppner DA, Schoder D (2020) Conceptualization of the human-machine symbiosis – a literature review. Hawaii international conference on system sciences 2020 (HICSS-53). https://aisel.aisnet.org/hicss-53/cl/machines_as_teammates/5
Gordon G, Spaulding S, Westlund JK, Lee JJ, Plummer L, Martinez M, Das M, Breazeal C (n.d.) Affective personalization of a social robot tutor for children’s second language skills
Graether E, Mueller F (2012) Joggobot: a flying robot as jogging companion. CHI ’12 extended abstracts on human factors in computing systems, 1063–1066. https://doi.org/10.1145/2212776.2212386
Gross H-M, Meyer S, Scheidig A, Eisenbach M, Mueller S, Trinh TQ, Wengefeld T, Bley A, Martin C, Fricke C (2017) Mobile robot companion for walking training of stroke patients in clinical post-stroke rehabilitation. IEEE Int Conf Robot Autom (ICRA) 2017:1028–1035. https://doi.org/10.1109/ICRA.2017.7989124
Gross H-M, Mueller S, Schroeter C, Volkhardt M, Scheidig A, Debes K, Richter K, Doering N (2015) Robot companion for domestic health assistance: implementation, test and case study under everyday conditions in private apartments. IEEE/RSJ Int Conf Intell Robots Syst (IROS) 2015:5992–5999. https://doi.org/10.1109/IROS.2015.7354230
Gross H-M, Scheidig A, Muller S, Schutz B, Fricke C, Meyer S (2019) Living with a mobile companion robot in your own apartment—final implementation and results of a 20-weeks field study with 20 seniors. Int Conf Robot Autom (ICRA) 2019:2253–2259. https://doi.org/10.1109/ICRA.2019.8793693
Gross H-M, Schroeter C, Mueller S, Volkhardt M, Einhorn E, Bley A, Langner T, Merten M, Huijnen C, van den Heuvel H, van Berlo A (2012) Further progress towards a home robot companion for people with mild cognitive impairment. In: 2012 IEEE international conference on systems, man, and cybernetics (SMC), 637–644. https://doi.org/10.1109/ICSMC.2012.6377798
Hagio Y, Kamimura M, Hoshi Y, Kaneko Y, Yamamoto M (2022) TV-watching robot: toward enriching media experience and activating human communication. SMPTE Motion Imag J 131(4):50–58. https://doi.org/10.5594/JMI.2022.3160804
Hansika W, Nanayakkara L, Silva P, Gammanpila A (2020) AuDimo: a musical companion robot to switching audio tracks by recognizing the users engagement. https://doi.org/10.1007/978-3-030-60117-1_7
Hebesberger D, Koertner T, Gisinger C, Pripfl J, Dondrup C (2016) Lessons learned from the deployment of a long-term autonomous robot as companion in physical therapy for older adults with dementia a mixed methods study. In: 2016 11th ACM/IEEE international conference on human-robot interaction (HRI), 27–34. https://doi.org/10.1109/HRI.2016.7451730
Helms E, Schraft RD, Hagele M (2002) rob@work: Robot assistant in industrial environments. In: 11th IEEE international workshop on robot and human interactive communication proceedings, 399–404. https://doi.org/10.1109/ROMAN.2002.1045655
Hirokawa E, Suzuki K (2018) Design of a huggable social robot with affective expressions using projected images. Appl Sci 8(11):2298. https://doi.org/10.3390/app8112298
Hirose J, Hirokawa M, Suzuki K (2014) Robotic gaming companion to facilitate social interaction among children. In: the 23rd IEEE international symposium on robot and human interactive communication, 63–68. https://doi.org/10.1109/ROMAN.2014.6926231
Ho H-R, Cagiltay B, White NT, Hubbard EM, Mutlu B (2021) RoboMath: designing a learning companion robot to support children’s numerical skills. Interaction design and children, 283–293. https://doi.org/10.1145/3459990.3460709
Hoffman G, Bauman S, Vanunu K (2016) Robotic experience companionship in music listening and video watching. Pers Ubiquit Comput 20:1–13. https://doi.org/10.1007/s00779-015-0897-1
Hoffman G, Vanunu K (2013) Effects of robotic companionship on music enjoyment and agent perception. In: 2013 8th ACM/IEEE international conference on human-robot interaction (HRI), 317–324. https://doi.org/10.1109/HRI.2013.6483605
Holbrook MB, Stephens DL, Day E, Holbrook SM, Strazar G (2001) A collective stereographic photo essay on key aspects of animal companionship: the truth about dogs and cats. Academy of Marketing Science
Hosseini SMF, Lettinga D, Vasey E, Zheng Z, Jeon M, Park CH, Howard AM (2017) Both “look and feel” matter: essential factors for robotic companionship. In: 2017 26th IEEE international symposium on robot and human interactive communication (RO-MAN), 150–155. https://doi.org/10.1109/ROMAN.2017.8172294
Hsiao H-S, Chang C-S, Lin C-Y, Hsu H-L (2015) “iRobiQ”: the influence of bidirectional interaction on kindergarteners’ reading motivation, literacy, and behavior. Interact Learn Environ 23(3):269–292. https://doi.org/10.1080/10494820.2012.745435
Hsieh H-F, Shannon S (2005) Three approaches to qualitative content analysis. Qual Health Res 15:1277–1288. https://doi.org/10.1177/1049732305276687
Ihamäki P, Heljakka K (2021) Robot dog intervention with the golden pup: activating social and empathy experiences of elderly people as part of intergenerational interaction. Hawaii international conference on system sciences. https://doi.org/10.24251/HICSS.2021.230
Inga J, Ruess M, Robens JH, Nelius T, Rothfuß S, Kille S, Dahlinger P, Lindenmann A, Thomaschke R, Neumann G, Matthiesen S, Hohmann S, Kiesel A (2023) Human-machine symbiosis: a multivariate perspective for physically coupled human-machine systems. Int J Hum Comput Stud 170:102926. https://doi.org/10.1016/j.ijhcs.2022.102926
Jasanoff S (2016) Perfecting the human: posthuman imaginaries and technologies of reason. In: Hurlbut JB, Tirosh-Samuelson H (eds) Perfecting Human futures: transhuman visions and technological imaginations. Springer, Berlin, pp 73–95
Jawale PP, Ohol SS (2024) Improvisation in human-robot interaction using optimized multimodal operational techniques. In: Shaw RN, Siano P, Makhilef S, Ghosh A, Shimi SL (eds) Innovations in electrical and electronic engineering. Springer, Berlin, pp 403–413
Jeong S, Alghowinem S, Aymerich-Franch L, Arias K, Lapedriza A, Picard R, Park HW, Breazeal C (2020) A robotic positive psychology coach to improve college students’ wellbeing. In: 2020 29th IEEE international conference on robot and human interactive communication (RO-MAN), 187–194. https://doi.org/10.1109/RO-MAN47096.2020.9223588
Joglekar P, Kulkarni V (2018) Humanoid robot as a companion for the senior citizens. IEEE Punecon 2018:1–4. https://doi.org/10.1109/PUNECON.2018.8745399
Kahn PH, Friedman B, Perez-Granados DR, Freier NG (2004) Robotic pets in the lives of preschool children
Khosla R, Chu M-T (2013) Embodying care in Matilda: an affective communication robot for emotional wellbeing of older people in Australian residential care facilities. ACM Trans Manag Inf Syst 4(4):1–33. https://doi.org/10.1145/2544104
Khot RA, Arza ES, Kurra H, Wang Y (2019) FoBo: towards designing a robotic companion for solo dining. In: extended abstracts of the 2019 CHI conference on human factors in computing systems, 1–6. https://doi.org/10.1145/3290607.3313069
Kidd CD, Taggart W, Turkle S (2006) A sociable robot to encourage social interaction among the elderly. In: proceedings 2006 IEEE international conference on robotics and automation, 2006. ICRA 2006, 3972–3976. https://doi.org/10.1109/ROBOT.2006.1642311
Kohori T, Hirayama S, Hara T, Muramatsu M, Naganuma H, Yamano M, Ichikawa K, Matsumoto H, Uchiyama H (2018) Development and evaluation of an interactive therapy robot. In: Cheok AD, Inami M, Romão T (eds) Advances in computer entertainment technology. Springer International Publishing, Berlin, pp 66–83
Kuijer L, Giaccardi E (2018) Co-performance: conceptualizing the role of artificial agency in the design of everyday life. https://doi.org/10.1145/3173574.3173699
Kwak SS, Kim Y, Kim E, Shin C, Cho K (2013) What makes people empathize with an emotional robot?: the impact of agency and physical embodiment on human empathy for a robot. In: 2013 IEEE RO-MAN, pp. 180–185. https://doi.org/10.1109/ROMAN.2013.6628441
Law M, Sutherland C, Ahn HS, MacDonald BA, Peri K, Johanson DL, Vajsakovic D-S, Kerse N, Broadbent E (2019) Developing assistive robots for people with mild cognitive impairment and mild dementia: a qualitative study with older adults and experts in aged care. BMJ Open 9(9):e031937. https://doi.org/10.1136/bmjopen-2019-031937
Lee KM, Peng W, Jin S-A, Yan C (2006) Can robots manifest personality?: an empirical test of personality recognition, social responses, and social presence in human–robot interaction. J Commun 56(4):754–772. https://doi.org/10.1111/j.1460-2466.2006.00318.x
Lee H, Eizad A, Pyo S, Afzal MR, Oh M-K, Jang Y-J, Yoon J (2020) Development of a robotic companion to provide haptic force interaction for overground gait rehabilitation. IEEE Access 8:34888–34899. https://doi.org/10.1109/ACCESS.2020.2973672
Lee OE, Lee H, Park A, Choi NG (2022) My precious friend: human-robot interactions in home care for socially isolated older adults. Clin Gerontol 47:161–170. https://doi.org/10.1080/07317115.2022.2156829
Lehmann H, Iacono I, Dautenhahn K, Marti P, Robins B (2014) Robot companions for children with down syndrome: a case study. Interact Stud: Soc Behav Commun Biol Artif Syst 15(1):99–112
Lehmann H, Iacono I, Robins B, Marti P, Dautenhahn K (2011) “Make it move”: Playing cause and effect games with a robot companion for children with cognitive disabilities. In: proceedings of the 29th annual european conference on cognitive Ergonomics - ECCE ’11, 105. https://doi.org/10.1145/2074712.2074734
Leite I, Castellano G (2012) Modelling empathic behaviour in a robotic game companion for children: an ethnographic study in real-world settings
Leite I, Mascarenhas S, Pereira A, Martinho C, Prada R, Paiva A (2010) “Why can’t we be friends?” An empathic game companion for long-term interaction (p 321). https://doi.org/10.1007/978-3-642-15892-6_32
Leite I, Pereira A, Mascarenhas S, Martinho C, Prada R, Paiva A (2013) The influence of empathy in human–robot relations. Int J Hum Comput Stud 71(3):250–260. https://doi.org/10.1016/j.ijhcs.2012.09.005
Lewis M, Sycara K, Walker P (2018) The role of trust in human-robot interaction. In: Abbass HA, Scholz J, Reid DJ (eds) Foundations of trusted autonomy. Springer International Publishing, Berlin, pp 135–159
Liang A, Piroth I, Robinson H, MacDonald B, Fisher M, Nater UM, Skoluda N, Broadbent E (2017) A pilot randomized trial of a companion robot for people with dementia living in the community. J Am Med Dir Assoc 18(10):871–878. https://doi.org/10.1016/j.jamda.2017.05.019
Lu S-C, Blackwell N, Do E (2011) mediRobbi: An interactive companion for pediatric patients during hospital visit (p 556). https://doi.org/10.1007/978-3-642-21605-3_60
Mahdi H, Akgun SA, Saleh S, Dautenhahn K (2022) A survey on the design and evolution of social robots—past, present and future. Robot Auton Syst 156:104193. https://doi.org/10.1016/j.robot.2022.104193
Malik SA, Aburahmah L, Azuddin M (2022) An exploratory study on the use of social companion robot for adults with motor disabilities. In: Saeed F, Mohammed F, Ghaleb F (eds) Advances on intelligent informatics and computing. Springer International Publishing, Berlin, pp 616–629
Mancini M, Niewiadomski R, Huisman G, Bruijnes M, Gallagher CP (2020) Room for one more? - introducing artificial commensal companions. In: extended abstracts of the 2020 CHI conference on human factors in computing systems, 1–8. https://doi.org/10.1145/3334480.3383027
Martelaro N, Nneji VC, Ju W, Hinds P (2016) Tell me more: designing HRI to encourage more trust, disclosure, and companionship. In: 2016 11th ACM/IEEE international conference on human-robot interaction (HRI), 181–188. https://doi.org/10.1109/HRI.2016.7451750
Marti P, Iacono I (2011) Learning through play with a robot companion. In: Everyday Technology for Independence and Care, IOS Press, pp. 526–533. https://doi.org/10.3233/978-1-60750-814-4-526.
Mayadunne MMMS, Manawadu UA, Abeyratne KR, De RS, Silva P (2020) A robotic companion for children diagnosed with autism spectrum disorder. Int Conf Image Process Robot (ICIP) 2020:1–6. https://doi.org/10.1109/ICIP48927.2020.9367368
McGinn C, Bourke E, Murtagh A, Donovan C, Cullinan MF (2019) Meeting stevie: perceptions of a socially assistive robot by residents and staff in a long-term Care Facility. In: 2019 14th ACM/IEEE international conference on human-robot interaction (HRI), 602–603. https://doi.org/10.1109/HRI.2019.8673161
Menezes P, Rocha RP (2021) Promotion of active ageing through interactive artificial agents in a smart environment. SN Appl. Sci. 3(5):583. https://doi.org/10.1007/s42452-021-04567-8
Meyer S, Fricke Ch (2017) Robotic companions in stroke therapy: a user study on the efficacy of assistive robotics among 30 patients in neurological rehabilitation. In: 2017 26th IEEE international symposium on robot and human interactive communication (RO-MAN), 135–142. https://doi.org/10.1109/ROMAN.2017.8172292
Michaelis JE, Mutlu B (2017) Someone to read with: design of and experiences with an in-home learning companion robot for reading. In: proceedings of the 2017 CHI conference on human factors in computing systems, 301–312. https://doi.org/10.1145/3025453.3025499
Michaelis JE, Mutlu B (2018a) Reading socially: transforming the in-home reading experience with a learning-companion robot. Sci Robot. https://doi.org/10.1126/scirobotics.aat5999
Michaelis JE, Mutlu B (2018b) Social reading: field study with an in-home learning companion robot
Moyle W, Jones C, Sung B, Bramble M, O’Dwyer S, Blumenstein M, Estivill-Castro V (2016) What effect does an animal robot called cuddler have on the engagement and emotional response of older people with dementia? A pilot feasibility study. Int J Soc Robot 8(1):145–156. https://doi.org/10.1007/s12369-015-0326-7
Mubin O, Stevens CJ, Shahid S, Mahmud AA, Dong J-J (2013) A review of the applicability of robots in education. Technol Educ Learn 1(1):13. https://doi.org/10.2316/Journal.209.2013.1.209-0015
Mueller FF, Lopes P, Strohmeier P, Ju W, Seim C, Weigel M, Nanayakkara S, Obrist M, Li Z, Delfa J, Nishida J, Gerber EM, Svanaes D, Grudin J, Greuter S, Kunze K, Erickson T, Greenspan S, Inami M, Maes P (2020) Next steps for human-computer integration. In: proceedings of the 2020 CHI conference on human factors in computing systems, 1–15. https://doi.org/10.1145/3313831.3376242
Niewiadomski R, Bruijnes M, Huisman G, Gallagher CP, Mancini M (2022) Social robots as eating companions. Front Comput Sci 4:909844. https://doi.org/10.3389/fcomp.2022.909844
O’Brien C, O’Mara M, Issartel J, McGinn C (2021) Exploring the Design Space of Therapeutic Robot Companions for Children. In: proceedings of the 2021 ACM/IEEE international conference on human-robot interaction, 243–251. https://doi.org/10.1145/3434073.3444669
O’Connor B, Bojinski S, Röösli C, Schaepman ME (2020) Monitoring global changes in biodiversity and climate essential as ecological crisis intensifies. Eco Inform 55:101033. https://doi.org/10.1016/j.ecoinf.2019.101033
Okita SY (2013) Self–other’s perspective taking: the use of therapeutic robot companions as social agents for reducing pain and anxiety in pediatric patients. Cyberpsychol Behav Soc Netw 16(6):436–441. https://doi.org/10.1089/cyber.2012.0513
Orejana J, Macdonald B, Ahn H, Peri K, Broadbent E (2015) healthcare robots in homes of rural older adults (p. 521). https://doi.org/10.1007/978-3-319-25554-5_51
Osterholm MT (2020) Preparing for the next pandemic. In: The Covid-19 reader. Routledge, London
Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Shamseer L, Tetzlaff JM, Akl EA, Brennan SE, Chou R, Glanville J, Grimshaw JM, Hróbjartsson A, Lalu MM, Li T, Loder EW, Mayo-Wilson E, McDonald S, Moher D (2021) The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 372:n71. https://doi.org/10.1136/bmj.n71
Passler Bates D, Young JE (2020) SnuggleBot: a novel cuddly companion robot design. In: proceedings of the 8th international conference on human-agent interaction, 260–262. https://doi.org/10.1145/3406499.3418772
Piezzo C, Suzuki K (2017) Feasibility study of a socially assistive humanoid robot for guiding elderly individuals during walking. Future Internet 9(3):30. https://doi.org/10.3390/fi9030030
Pike J, Picking R, Cunningham S (2021) Robot companion cats for people at home with dementia: a qualitative case study on companotics. Dementia 20(4):1300–1318. https://doi.org/10.1177/1471301220932780
Randall N, Bennett CC, Šabanović S, Nagata S, Eldridge L, Collins S, Piatt JA (2019) More than just friends: in-home use and design recommendations for sensing socially assistive robots (SARs) by older adults with depression. Paladyn J Behav Robot 10(1):237–255. https://doi.org/10.1515/pjbr-2019-0020
Repiso E, Garrell A, Sanfeliu A (2020) Adaptive side-by-side social robot navigation to approach and interact with people. Int J Soc Robot 12(4):909–930. https://doi.org/10.1007/s12369-019-00559-2
Repiso E, Garrell A, Sanfeliu A (2020) People’s adaptive side-by-side model evolved to accompany groups of people by social robots. IEEE Robot Autom Lett 5(2):2387–2394. https://doi.org/10.1109/LRA.2020.2970676
Ritschel H, Janowski K, Seiderer A, Wagner S, André E (2019) Insights on usability and user feedback for an assistive robotic health companion with adaptive linguistic style. In: proceedings of the 12th ACM international conference on PErvasive technologies related to assistive environments, 319–320. https://doi.org/10.1145/3316782.3322737
Robinson H, Broadbent E, MacDonald B (2016) Group sessions with Paro in a nursing home: structure, observations and interviews. Australas J Ageing 35(2):106–112. https://doi.org/10.1111/ajag.12199
Robinson H, MacDonald B, Broadbent E (2014) The role of healthcare robots for older people at home: a review. Int J Soc Robot 6(4):575–591. https://doi.org/10.1007/s12369-014-0242-2
Robinson H, MacDonald B, Kerse N, Broadbent E (2013) The psychosocial effects of a companion robot: a randomized controlled trial. J Am Med Dir Assoc 14(9):661–667. https://doi.org/10.1016/j.jamda.2013.02.007
Roesler E, Manzey D, Onnasch L (2023) Embodiment matters in social HRI research: effectiveness of anthropomorphism on subjective and objective outcomes. ACM Trans Human-Robot Interact 12(1), Article 7. https://doi.org/10.1145/3555812
Rook KS (1987) Social support versus companionship: effects on life stress, loneliness, and evaluations by others. J Pers Soc Psychol 52(6):1132–1147. https://doi.org/10.1037/0022-3514.52.6.1132
Rook KS (1990) Social relationships as a source of companionship: implications for older adults’ psychological wellbeing. In: Social support: an interactional view. Wiley, pp 219–250
Ross GM (2023) Dancing with robots: acceptability of humanoid companions to reduce loneliness during COVID-19 (and beyond). AI & Soc. https://doi.org/10.1007/s00146-023-01738-6
Rossi A, Garcia F, Maya AC, Dautenhahn K, Koay KL, Walters ML, Pandey AK (2019) Investigating the effects of social interactive behaviours of a robot on people’s trust during a navigation task. In: Althoefer K, Konstantinova J, Zhang K (eds) Towards autonomous robotic systems. Springer International Publishing, Berlin, pp 349–361
Rubio F, Valero F, Llopis-Albert C (2019) A review of mobile robots: Concepts, methods, theoretical framework, and applications. Int J Adv Rob Syst 16(2):1729881419839596. https://doi.org/10.1177/1729881419839596
Salem M, Rohlfing K, Kopp S, Joublin F (2011) A friendly gesture: investigating the effect of multimodal robot behavior in human-robot interaction. In: 2011 RO-MAN, pp. 247–252. https://doi.org/10.1109/ROMAN.2011.6005285
Salvini P, Laschi C, Dario P (2010) Design for acceptability: improving robots’ coexistence in human society. Int J Soc Robot 2(4):451–460. https://doi.org/10.1007/s12369-010-0079-2
Sarabia M, Demiris Y (2013) A humanoid robot companion for wheelchair users, vol 8239. https://doi.org/10.1007/978-3-319-02675-6_43
Sarabia M, Young N, Canavan K, Edginton T, Demiris Y, Vizcaychipi MP (2018) Assistive robotic technology to combat social isolation in acute hospital settings. Int J Soc Robot 10(5):607–620. https://doi.org/10.1007/s12369-017-0421-z
Schneider S, Kummert F (2016) Exercising with a humanoid companion is more effective than exercising alone. In: 2016 IEEE-RAS 16th international conference on humanoid robots (Humanoids), 495–501. https://doi.org/10.1109/HUMANOIDS.2016.7803321
Schneider S, Kummert F (2016) Motivational effects of acknowledging feedback from a socially assistive robot. In: Agah A, Cabibihan J-J, Howard AM, Salichs MA, He H (eds) Social robotics. Springer, Berlin, pp 870–879
Schroeter Ch, Mueller S, Volkhardt M, Einhorn E, Huijnen C, van den Heuvel H, van Berlo A, Bley A, Gross H-M (2013) Realization and user evaluation of a companion robot for people with mild cognitive impairments. In: 2013 IEEE international conference on robotics and automation, pp 1153–1159. https://doi.org/10.1109/ICRA.2013.6630717
Shamsuddin S, Zulkifli W, Lim TH, Yussof H (2017) Animal robot as augmentative strategy to elevate mood: a preliminary study for post-stroke depression, p 218. https://doi.org/10.1007/978-3-319-66471-2_23
Sharma VK, Murthy LRD, Biswas P (2022) Enabling learning through play: inclusive gaze-controlled human-robot interface for joystick-based toys. In: Cavallo F, Cabibihan J-J, Fiorini L, Sorrentino A, He H, Liu X, Matsumoto Y, Ge SS (eds) Social robotics. Springer, Berlin, pp 452–461
Sharma V, Saluja K, Mollyn V, Biswas P (2020) Eye gaze controlled robotic arm for persons with severe speech and motor impairment, pp 1–9. https://doi.org/10.1145/3379155.3391324
So S, Lee N (2023) Pedagogical exploration and technological development of a humanoid robotic system for teaching to and learning in young children. Cogent Educ 10(1):2179181. https://doi.org/10.1080/2331186X.2023.2179181
Soleiman P, Salehi S, Mahmoudi M, Ghavami M, Moradi H, Pouretemad H (2014) RoboParrot: a robotic platform for human robot interaction, case of autistic children. In: 2014 second RSI/ISM international conference on robotics and mechatronics (ICRoM), pp 711–716. https://doi.org/10.1109/ICRoM.2014.6990987
Spot®—The Agile Mobile Robot | Boston Dynamics. (n.d.). Retrieved February 8, 2022, from https://www.bostondynamics.com/products/spot
Sundar SS, Jung EH, Waddell TF, Kim KJ (2017) Cheery companions or serious assistants? Role and demeanor congruity as predictors of robot attraction and use intentions among senior citizens. Int J Hum Comput Stud 97:88–97. https://doi.org/10.1016/j.ijhcs.2016.08.006
Talami F, Romero M, Borga G (2021) Edù, a robotic companion in pediatric protective isolation units. In: Malvezzi M, Alimisis D, Moro M (eds) Education in & with robotics to foster 21st-century skills. Springer International Publishing, Berlin, pp 103–107
Tan CKK, Lou VWQ, Cheng CYM, He PC, Mor YY (2023) Technology acceptance of a social robot (LOVOT) among single older adults in Hong Kong and Singapore: protocol for a multimethod study. JMIR Res Protoc 12:e48618. https://doi.org/10.2196/48618
Tapus A, Tapus C, Mataric MJ (2009) The use of socially assistive robots in the design of intelligent cognitive therapies for people with dementia. In: 2009 IEEE international conference on rehabilitation robotics, pp 924–929. https://doi.org/10.1109/ICORR.2009.5209501
Thomas DR (2006) A general inductive approach for analyzing qualitative evaluation data. Am J Eval 27(2):237–246. https://doi.org/10.1177/1098214005283748
Thompson M, Carlson D, Zivnuska S, Whitten G (2012) Support at work and home: the path to satisfaction through balance. J Vocat Behav 80:299–307. https://doi.org/10.1016/j.jvb.2012.01.001
Thunberg S, Rönnqvist L, Ziemke T (2020) Do robot pets decrease agitation in dementia patients? An ethnographic approach, pp 616–627. https://doi.org/10.1007/978-3-030-62056-1_51
Thunberg S, Ziemke T (2021) Pandemic effects on social companion robot use in care homes. In: 2021 30th IEEE international conference on robot & human interactive communication (RO-MAN), 983–988. https://doi.org/10.1109/RO-MAN50785.2021.9515465
Tsiourti C, Pillinger A, Weiss A (2020) Was Vector a companion during shutdown?: insights from an ethnographic study in Austria. In: proceedings of the 8th international conference on human-agent interaction, 269–271. https://doi.org/10.1145/3406499.3418767
Ullrich D, Diefenbach S, Butz A (2016) Murphy miserable robot: a companion to support children’s wellbeing in emotionally difficult situations. In: proceedings of the 2016 CHI conference extended abstracts on human factors in computing systems, 3234–3240. https://doi.org/10.1145/2851581.2892409
Uluer P, Akalın N, Köse H (2015) A new robotic platform for sign language tutoring: humanoid robots as assistive game companions for teaching sign language. Int J Soc Robot 7(5):571–585. https://doi.org/10.1007/s12369-015-0307-x
Wainer J, Feil-seifer DJ, Shell DA, Mataric MJ (2006) The role of physical embodiment in human-robot interaction. In: ROMAN 2006 - The 15th IEEE international symposium on robot and human interactive communication, 117–122. https://doi.org/10.1109/ROMAN.2006.314404
Walters ML, Syrdal DS, Dautenhahn K, te Boekhorst R, Koay KL (2008) Avoiding the uncanny valley: robot appearance, personality and consistency of behavior in an attention-seeking home scenario for a robot companion. Auton Robot 24(2):159–178. https://doi.org/10.1007/s10514-007-9058-3
Webster J, Watson RT (2002) Analyzing the past to prepare for the future: writing a literature review. MIS Q 26(2):xiii–xxiii
Wells DL (2009) The effects of animals on human health and wellbeing. J Soc Issues 65(3):523–543. https://doi.org/10.1111/j.1540-4560.2009.01612.x
Weng Y-H, Chen C-H, Sun C-T (2009) Toward the human-robot co-existence society: on safety intelligence for next generation robots. Int J Soc Robot 1(4):267. https://doi.org/10.1007/s12369-009-0019-1
Westlund JK, Breazeal C (2015) The interplay of robot language level with children’s language learning during storytelling. In: proceedings of the tenth annual ACM/IEEE international conference on human-robot interaction extended abstracts, 65–66. https://doi.org/10.1145/2701973.2701989
Wilson R, Keane I, Jones R (2022) Affective responses of older adults to the anthropomorphic GenieConnect companion robot during lockdown of the COVID-19 pandemic. In: 2022 17th ACM/IEEE international conference on human-robot interaction (HRI), 1095–1099. https://doi.org/10.1109/HRI53351.2022.9889480
Wu J, Zeng D, Yang B, Gen H, Takishima Y, Hagio Y, Kamimura M, Hoshi Y, Kaneko Y, Nishimoto Y (2021) TV-watching companion robot supported by open-domain chatbot “KACTUS.” In: 20th international conference on mobile and ubiquitous multimedia, 230–232. https://doi.org/10.1145/3490632.3497865
Yamazaki R, Nishio S, Nagata Y, Satake Y, Suzuki M, Kanemoto H, Yamakawa M, Figueroa D, Ishiguro H, Ikeda M (2023) Long-term effect of the absence of a companion robot on older adults: a preliminary pilot study. Front Comput Sci 5:1129506. https://doi.org/10.3389/fcomp.2023.1129506
Yueh H, Lin W, Wang S, Fu L (2020) Reading with robot and human companions in library literacy activities: a comparison study. Br J Edu Technol 51(5):1884–1900. https://doi.org/10.1111/bjet.13016
Zhang BJ, Quick R, Helmi A, Fitter NT (2020) Socially assistive robots at work: making break-taking interventions more pleasant, enjoyable, and engaging. In: 2020 IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 11292–11299. https://doi.org/10.1109/IROS45743.2020.9341291
Zhang X, Breazeal C, Park HW (2023) A social robot reading partner for explorative guidance. In: proceedings of the 2023 ACM/IEEE international conference on human-robot interaction, 341–349. https://doi.org/10.1145/3568162.3576968
Zhao Z, McEwen R (2022) “Let’s read a book together”: a long-term study on the usage of pre-school children with their home companion robot. In: 2022 17th ACM/IEEE international conference on human-robot interaction (HRI), 24–32. https://doi.org/10.1109/HRI53351.2022.9889672
Zinina A, Kotov A, Arinkin N, Zaidelman L (2023) Learning a foreign language vocabulary with a companion robot. Cogn Syst Res 77:110–114. https://doi.org/10.1016/j.cogsys.2022.10.007
Zsiga K, Tóth A, Pilissy T, Péter O, Dénes Z, Fazekas G (2018) Evaluation of a companion robot based on field tests with single older adults in their homes. Assist Technol 30(5):259–266. https://doi.org/10.1080/10400435.2017.1322158
Zylstra M (2018) Meaningful nature experiences: pathways for deepening connections between people and place, pp 40–57. https://doi.org/10.4324/9781315108186-3
Acknowledgements
This research has been conducted as part of employment with Tampere University, as a research study by the Gamification Group. This research is funded by the Academy of Finland Flagship Programme (337653, Forest-Human-Machine Interplay (UNITE)). Statement about expansion of previous work: This submission extends a paper accepted at the Pacific Asia Conference on Information Systems (PACIS) 2022, titled “Robots as Human Companions: A Review” (https://aisel.aisnet.org/pacis2022/246/). That conference paper is a systematic literature review that gives a comprehensive overview of robots used as human companions, analyzing factors such as usage domains, scenarios, robot types, facilities, and interaction modalities, and summarizing the implications drawn from these factors; its focus was on the more technical aspects of companion robots and their usage. In this extended version, we have substantially broadened the scope of the analysis to cover how human experience is shaped through companion robots and to derive design considerations for future research. A new literature search has also been conducted to include studies from the latter part of 2021 until October 2023. The extended version gives an extensive overview of what companionship between humans and robots entails by analyzing distinct facets such as the domains of use, usage scenarios, deployment facilities, roles of robots in different domains, and robot features adopted for different domains. In addition to these facets, we have analyzed and summarized the research methods, experimental variables, measurement instruments, and analysis methods in the literature.
We have also reshaped the findings to reflect the additional analysis. In this paper, we investigate and review the newly emerging area of robotic companions as well as their relationship to human practice. Moving away from the more technical, robot-centered focus of the conference paper, the extended version provides a comprehensive overview of the literature on human–robot companionship. We also examine the relationships and interactions between robots and humans, and how these are affected by different factors. Beyond this broader analysis, the paper proposes a set of practical future agendas (thematic, theoretical, methodological, and technological) for advancing research on human–robot companionship in general.
Funding
Open access funding provided by Tampere University (including Tampere University Hospital). This research is funded by the Academy of Finland Flagship Programme (Forest-Human–Machine Interplay (UNITE))—Grant number: 337653.
Author information
Authors and Affiliations
Contributions
The idea of the article was discussed and agreed upon by the authors by analyzing the significance and possible impact of the resultant study. The search keywords and inclusion criteria were decided through discussion with all the authors. The search and compilation of articles were done by the first author. The first author then performed the initial analysis while directions and comments were given by the other authors with a view to solidifying the analysis. The manuscript writing was initially done by the first author and it was then reviewed and improved by the second and third authors.
Corresponding author
Ethics declarations
Conflict of interest
All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this manuscript.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A: Corpus overview
Title | Domain | Robot used | Deployment facility | Prominent robot features (input) | Prominent robot features (output) | Participant age group and range | Interaction span |
---|---|---|---|---|---|---|---|
Promotion of active ageing through interactive artificial agents in a smart environment | Motivation & influence | GrowMu social robot | Home | Smart camera network inside the house that detects inactivity and signals the social robot to start the exercise routine; voice recognition to understand whether the user accepts the invitation | Synthesized voice; facial expression through LEDs; head movement; display | Elderly, NA | 25–30 min |
Robot companion cats for people at home with dementia: A qualitative case study on companotics | Healthcare | The ‘Ageless Innovation’ companion cat | Home | Light Touch | Mouth movement with sound Eye and head movements Rollover | NA | 3 months |
Robot dog intervention with the golden pup: Activating social and empathy experiences of elderly people as part of intergenerational interaction | Wellbeing | Joy for all Companion Pet Dog | Day activity center | Touch Sound | Voice Heartbeat | Elderly, 65–80 | 80 min |
Towards the design of a robot peer-tutor to help children learn math problem-solving | Education | Vi robot, developed on top of the NEC PaPeRo (Partner-type Personal Robot) robotic platform | School | Voice | LEDs to exhibit different behaviors such as speaking, smiling, listening, thinking, and processing information; nodding and head-shaking movements | Children, 6–8 | 30–60 min |
SnuggleBot: A novel cuddly companion robot design | Socialization | Snugglebot | Home | Snuggle Hug | Flips its tail when hugged Changing color of horn | NA | NA |
Was vector a companion during shutdown?: insights from an ethnographic study in Austria | Socialization | Anki Vector | Home | Voice | Voice, Movements, dance | NA | 8 months |
The perceptions of people with dementia and key stakeholders regarding the use and impact of the social robot MARIO | Socialization | Mario | Residential care facility | Voice and touchscreen | Tablet display | Elderly, 50–76 | 12 months |
Effects of a humanoid companion robot on dementia symptoms and caregiver distress for residents in long-term care | Healthcare | Kabochan | Long term care facility | Voice, striking, shaking | talking, singing, and head nodding | Elderly, 67–108 | 32 weeks |
Eating with an artificial commensal companion | Motivation & influence | FoBo | Chocolate testing facility | No input, simulated | Gaze, nod, happy and sad sounds | NA | 3 min |
Socially assistive robots at work: Making break-taking interventions more pleasant, enjoyable, and engaging | Wellbeing | Anki Cozmo | Office desk | Timer that triggers every 30 min Video recording | Display, head movement | Young, 19–30 | 3 h |
Companion robotic assistants for improving the quality of life of people with disabilities | Disability assistance | ROBOCO | Club for people with disabilities | Joystick, voice commands, gestures and facial expressions | Movement, grabbing | Young & Elderly, 25–80 | NA |
Reading with robot and human companions in library literacy activities: A comparison study | Education | Social Robot Julia | Library | Voice | Voice Display Winking gesture | NA | 60–80 min |
A robotic positive psychology coach to improve college students’ wellbeing | Wellbeing | Jibo robot on portable robot station | On-campus dormitory rooms | Camera Sound/voice Touch screen command | Voice | NA | 1–5 weeks |
Adaptive side-by-side social robot navigation to approach and interact with people | Socialization | Tibi Dabo | Outdoor Street | Commands through display Voice | Display Voice LEDs to create facial expression | Young & Elderly, 15–76 | NA |
Room for one more?—Introducing artificial commensal companions | Socialization | myKeepon | NA | Pat on the head Touch Sound Music | Sound Rotating Jump Gaze | NA | 20 min |
People’s adaptive side-by-side model evolved to accompany groups of people by social robots | Socialization | Tibi | Outdoor Street | Commands through display Voice | Display Voice LEDs to create facial expression | Young, NA | NA |
Longitudinal diary data: Six months real-world implementation of affordable companion robots for older people in supported living | Wellbeing | Joy for All (JfA) animals | Supported living and care facility | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 6 months |
A robotic companion for children diagnosed with autism spectrum disorder | Disability assistance | Developed a robot with mechanical instruments and arduino | Home | Mobile application | Lights Movement | Children, 7–10 | 20–30 min/week, for 4 weeks |
Do robot pets decrease agitation in dementia patients?: An ethnographic approach | Healthcare | Joy for All (JfA) animals | Home | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 9 months |
Using robot animal companions in the academic library to mitigate student stress | Wellbeing | Joy for All (JfA) animals | University library | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 1–10 min |
AuDimo: a musical companion robot to switching audio tracks by recognizing the users engagement | Entertainment | AudiMO | Home | Facial expression Emotion detection | Changes in songs | Young, 21–25 | 20–30 min |
Reading with robots: A personalized robot-based learning companion for solving cognitively demanding tasks | Education | IQRA | University | Voice | Head movement Facial expression with eyes Show learning materials on display | Young, 19–24 | 60 min |
Development of a robotic companion to provide haptic force interaction for overground gait rehabilitation | Rehabilitation & therapy | Developed a robot with Lidar | Rehabilitation center | From connected computer | Variation of speed Force | Young & Elderly, NA-60 | 5 min |
Design and implementation of education companion robot for primary education | Education | Designed a robot | NA | Voice | Voice | NA | NA |
Expectation vs. reality: Attitudes towards a socially assistive robot in cardiac rehabilitation | Rehabilitation & therapy | Nao | NA | Voice | Voice Gestures Movements LEDs | Elderly, 54 | 18 weeks |
A participatory design process of a robotic tutor of assistive sign language for children with autism | Education | InMoov | Therapy centre | Voice Interactive Display | Voice Display gestures | NA | NA |
Developing assistive robots for people with mild cognitive impairment and mild dementia: A qualitative study with older adults and experts in aged care | Healthcare | Silbot | University; retirement village | Voice Touch screen display | Voice Display Gestures | NA | 2 days |
Insights on usability and user feedback for an assistive robotic health companion with adaptive linguistic style | Healthcare | Developed a robot | NA | Voice Button control panel | Voice | NA | 1 week |
FObo: Towards designing a robotic companion for solo dining | Socialization | FoBo | NA | NA | Burping, gestures, sounds | NA | NA |
Living with a mobile companion robot in your own apartment—Final implementation and results of a 20-weeks field study with 20 seniors | Wellbeing | SYMPARTNER COMPANION ROBOT | Home | Touch Voice | Display Voice Gestures | Elderly, 62–94 | 6 months |
Meeting stevie: perceptions of a socially assistive robot by residents and staff in a long-term care facility | Assistance | Stevie | Long term care facility | Touchscreen | gesture, speech and facial expressions | NA | NA |
ELE—A conversational social robot for persons with neuro-developmental disorders | Disability assistance | ELE | Local therapeutic center | Voice Video | movements of ears, eyes and trunk (and associated sounds) Therapist controlled Voice | Young & Middle-aged, 25–43 | NA |
Investigating the effects of social interactive behaviours of a robot on people’s trust during a navigation task | Navigation & Guidance | Pepper | Softbank robotics development facility | Voice Touch sensitive display | Voice Gestures LEDs display | Young & Middle-aged, 22–40 | NA |
More than just friends: In-home use and design recommendations for sensing socially assistive robots (SARs) by older adults with depression | Rehabilitation & Therapy | Paro | Home | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 56–67 | 1 month |
Design of a huggable social robot with affective expressions using projected images | Wellbeing | Pepita | University | Positive touches such as hugs and strokes, as well as negative ones such as hitting or pushing | Expression using projected images | Young, NA | NA |
Humanoid robot as a companion for the senior citizens | Wellbeing | NAO | Old-age home | Voice | Voice Gestures Movements LEDs | Elderly, 60–90 | NA |
Assistive robotic technology to combat social isolation in acute hospital settings | Socialization | NAO | Hospital | Voice | Voice Gestures Movements LEDs | Young & Elderly, 18–100 | 8 h |
Evaluation of a companion robot based on field tests with single older adults in their homes | Assistance | Kompaï robot by Robosoft SA | Home | Voice Touch screen display | Voice | Elderly, 70–83 | 75–118 days |
Reading socially: Transforming the in-home reading experience with a learning-companion robot | Education | Minnie | Home | Voice Video feed RFID scanning | Voice Head movements | Children, 10–12 | 2 weeks |
Robotic companions for long term isolation space missions | Wellbeing | Pleo Romibo | HI-SEAS habitat—a dome for isolation for NASA astronauts | Voice, touch Sound, touch | Expressions, limping if leg touched, frown, smile emotive tones or pre-recorded spoken words | NA | 5 weeks |
Social reading: Field study with an in-home learning companion robot | Education | Minnie | Home | Voice Video feed RFID scanning | Voice Head movements | Children, 10–12 | 2 weeks |
Development and evaluation of an interactive therapy robot | Rehabilitation & Therapy | Developed a robot prototype | Care facility | Voice Camera video feed | Voice Gestures—nodding | NA | NA |
A pilot study on using an intelligent life-like robot as a companion for elderly individuals with dementia and depression | Healthcare | Ryan companionbot | Senior caring facility | Facial expression Voice Touch sensitive display | Voice Gestures Conversation Display | NA | 4–6 weeks |
Puffy—An inflatable robotic companion for preschoolers | Education | Puffy | School | Touch Voice | Color projection Voice Gestures | Children, 4–5 | 10–15 min |
Both look and feel matter: Essential factors for robotic companionship | Entertainment | Nao Darwin | NA | Voice | Voice Gestures Movements LEDs | Young & Middle-aged, 22–39 | 3 min |
Robotic companions in stroke therapy: A user study on the efficacy of assistive robotics among 30 patients in neurological rehabilitation | Rehabilitation & Therapy | ROREAS | Rehabilitation clinic | Voice Touch sensitive display | Voice display | NA | 14.5 h |
Shopping with a robotic companion | Motivation & Influence | Nao | NA | Voice | Voice Gestures Movements LEDs | NA | NA |
A pilot randomized trial of a companion robot for people with dementia living in the community | Healthcare | Paro | Dementia day care centers; participants’ homes | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 67–98 | 12 weeks |
Mobile robot companion for walking training of stroke patients in clinical post-stroke rehabilitation | Rehabilitation & Therapy | ROREAS | Clinic | Voice Touch sensitive display | Voice display | NA | 1 h |
Feasibility study of a socially assistive humanoid robot for Guiding elderly individuals during walking | Navigation & Guidance | Pepper | University; care facility | Voice Touch sensitive display | Voice Gestures LEDs display | Elderly, 73–92 | NA |
Someone to read with: Design of and experiences with an in-home learning companion robot for reading | Education | Minnie | Home | Voice Video feed RFID scanning | Voice Head movements | Children, 11–12 | 40 min |
Puffy: A mobile inflatable interactive companion for children with neurodevelopmental disorder | Disability Assistance | Puffy | School | Touch Voice | Color projection Voice Gestures | Children, NA | NA |
Animal robot as augmentative strategy to elevate mood: A preliminary study for post-stroke depression | Wellbeing | Paro | Rehabilitation center | Touching Patting | Haptic feedback Sound Eyelid movement | Middle-aged, 43 | 30 min |
The influence of prior expectations of a robot’s lifelikeness on users’ intentions to treat a zoomorphic robot as a companion | Motivation & Influence | Pleo | NA | Voice, touch | Expressions, limping if leg touched, frown, smile | Elderly, NA | NA |
Cheery companions or serious assistants? Role and demeanor congruity as predictors of robot attraction and use intentions among senior citizens | Assistance | HomeMate | Home | Remote control Voice | Sound Display | Elderly, NA | NA |
Exercising with a humanoid companion is more effective than exercising alone | Motivation & Influence | Nao | NA | Voice | Voice Gestures Movements LEDs | Young, NA | 45–60 min |
Group sessions with Paro in a nursing home: Structure, observations and interviews | Socialization | Paro robot | Aged-care facility | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, NA | 12 weeks |
Murphy Miserable robot—A companion to support children’s wellbeing in emotionally difficult situations | Disability assistance | NAO | Doctor’s chamber waiting area | NA | NA | Children, 5–12 | 1 h |
Lessons learned from the deployment of a long-term autonomous robot as companion in physical therapy for older adults with dementia: A mixed methods study | Rehabilitation & therapy | SCITOS G5 | Elder care facility | Voice Touch display Cameras | Navigation Eye movements Sound | Elderly, 74–95 | 1 month |
Emobie™: A robot companion for children with anxiety | Disability assistance | Emobie | NA | Listening to children’s stories | Gesture with eyes Sound Arm movement | NA | NA |
Robotic experience companionship in music listening and video watching | Entertainment | Travis | NA | Listening to beat and music and movement | Movement of body parts and gestures Music | Young, 17–31 | 4 min |
Affective personalization of a social robot tutor for children’s second language skills | Education | Tega | School | Camera video feed | Movements Gestures Voice | Children, 3–5 | 2 months |
Motivational effects of acknowledging feedback from a socially assistive robot | Motivation & influence | Nao | NA | Voice | Voice Gestures Movements LEDs | Young, NA | NA |
What effect does an animal robot called CuDDler have on the engagement and emotional response of older people with dementia? A pilot feasibility study | Wellbeing | Cuddler | Nursing home | Hit, touch, pat, stroke, squeeze | Body movements Gestures Brawl sound | NA | 150 min |
Towards an imperfect robot for long-term companionship: Case studies using cognitive biases | Wellbeing | ERWIN MyKeepon | NA | Mood detection, sound Pat on the head, Touch, sound, Music | Head movement, expression with eyes, sound Sound, Rotating, Jump, Gaze | NA | 10 min |
Robot companion for domestic health assistance: Implementation, test and case study under everyday conditions in private apartments | Healthcare | Max | Home | Touch, stroke Voice | Navigation purring, crying Emotions through eyes | Elderly, 68–92 | 3 days |
A new robotic platform for sign language tutoring: humanoid robots as assistive game companions for teaching sign language | Education | Robovie R3 | NA | Gestures Voice | Body parts movement LEDs Signing with fingers | Children & Young, 6–32 | NA |
Jogging with a quadcopter | Motivation & Influence | Quadcopter developed | Outdoor field | Autonomous | Flies following a predetermined path | Young & Middle-aged, 22–44 | 34 min |
The interplay of robot language level with children’s language learning during storytelling | Education | Dragonbot robot | School | Tablet | Tablet Facial expression | Children, 4–6 | 2 months |
Robotic companions for older people: a case study in the wild | Socialization | MetraLabs SCITOS G3 | Home | Touch, stroke Voice | Navigation purring, crying Emotions through eyes | Elderly, 67–85 | 30–40 min |
Healthcare robots in homes of rural older adults | Healthcare | iRobi | Home | Voice Display touch screen | Sound Display LEDs Gestures | Elderly, 75 + | 3–12 months |
Using socially assistive human–robot interaction to motivate physical exercise for older adults | Motivation & influence | Bandit | NA | Voice Movement recognition | Sound Expressions Gestures | Elderly, 68–92 | 20 min |
“iRobiQ”: the influence of bidirectional interaction on kindergarteners’ reading motivation, literacy, and behavior | Education | iRobiQ | School | Voice Display touch screen | Sound Display LEDs Gestures | Children, 2–3 | 2 months |
Physiological effects of a companion robot on blood pressure of older people in residential care facility: A pilot study | Wellbeing | Paro | Residential care facility | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 71–95 | 10 min |
Robotic gaming companion to facilitate social interaction among children | Entertainment | Nao | NA | Voice | Voice Gestures Movements LEDs | Children, 6–10 | NA |
RoboParrot: A robotic platform for human robot interaction, case of autistic children | Disability assistance | RoboParrot | NA | Voice Touch GUI | Sound Movements of body, eyes, head | NA | 8–12 min |
Robots in older people’s homes to improve medication adherence and quality of life: A randomised cross-over trial | Healthcare | iRobiQ Cafero | Hospital Residential homes | Voice, Display touch screen Touch screen, voice | Sound, Display, LEDs and Gestures Sound, videos | NA | 6 weeks |
Outdoor robotic companion based on a Google Android™ smartphone and GPS guidance | Navigation & guidance | Developed a robot | NA | Smartphone connected to the robot | Navigation | NA | NA |
Robot companions for children with Down syndrome: A case study | Disability assistance | KASPAR IROMEC | School for children with special needs | Remote control, imitation, tactile exploration Interactive display | Body parts movement like eyes, hands, head Display, navigation | Children, 8 | 3 months |
A humanoid robot companion for wheelchair users | Navigation & guidance | Nao | NA | Voice | Voice Gestures Movements LEDs | Young & Middle-aged, 21–38 | 5 min |
Embodying care in matilda: An affective communication robot for emotional wellbeing of older people in Australian residential care facilities | Wellbeing | Matilda | NA | Facial expression detection Voice | Facial expression Gestures Dance, body part movements | Elderly, 71–98 | 3 days |
Realization and user evaluation of a companion robot for people with mild cognitive impairments | Disability assistance | Scitos G3 | Home | Touch, stroke Voice Touch display | Navigation Sound Emotions through eyes Display | NA | 2 days |
Self-other’s perspective taking: The use of therapeutic robot companions as social agents for reducing pain and anxiety in pediatric patients | Rehabilitation & therapy | Paro | Hospital | Touching Patting | Haptic feedback Sound Eyelid movement | Children, 6–16 | 30 min |
Effects of robotic companionship on music enjoyment and agent perception | Entertainment | Travis | Office room | Listening to beat and music | Movement of body parts and gestures Music | NA | NA |
The psychosocial effects of a companion robot: a randomized controlled trial | Wellbeing | Paro | Residential care facility | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 55–100 | 12 weeks |
Further progress towards a home robot companion for people with mild cognitive impairment | Disability assistance | Companionable robot developed for the study | Home | Interactive display Voice | Display Expression with eyes Movement | NA | 4 days |
Cooperative social robots to accompany groups of people | Socialization | Tibi Dabo | Outdoor Street | Commands through display Voice | Display Voice LEDs to create facial expression | Young & Middle-aged, 20–40 | NA |
Joggobot: A flying robot as jogging companion | Motivation & influence | Parrot AR.Drone | NA | Video feed | NA | NA | 10 min |
Modelling empathic behaviour in a robotic game companion for children: An ethnographic study in real-world settings | Entertainment | iCat | School | Cameras for detecting chessboard movements Voice | Voice Facial expression | Children, 8–10 | NA |
‘Make it move’: Playing cause and effect games with a robot companion for children with cognitive disabilities | Disability assistance | KASPAR IROMEC | Special education school | Remote control, imitation, tactile exploration Interactive display | Body parts movement like eyes, hands, head Display, navigation | Children, NA | 6 months |
Living with a robot companion—Empirical study on the interaction with an artificial health advisor | Socialization | Karotz robot | Home | Voice | Sound LEDs Ear movement | Elderly, 50–76 | 10 days |
Learning through play with a robot companion | Education | IROMEC | Primary school | Interactive display | Display, navigation | Children, 6–11 | 3 months |
mediRobbi: An interactive companion for pediatric patients during hospital visit | Wellbeing | mediRobbi | Hospital | Touch, sound | Display, sound, movements | Children, 3–7 | NA |
“Why can’t we be friends?” an empathic game companion for long-term interaction | Entertainment | iCat | NA | Cameras for detecting chessboard movements Voice | Voice Facial expression | Young, 18–28 | NA |
Application of a Learning-Companion robot in learning environments | Education | Pleo | NA | Voice, touch | Expressions, limping if leg touched, frown, smile | NA | NA |
Robotic companion lead the way! | Navigation & guidance | Robbie | Office | Touch display Voice | Voice | NA | NA |
A sociable robot to encourage social interaction among the elderly | Healthcare | Paro | Residential care facility | Touching Patting | Haptic feedback Sound Eyelid movement | NA | 4 months |
Robotic pets in the lives of preschool children | Motivation & influence | Aibo | Home | Voice | Gestures Emotional expressions | Children, 3–6 | 45 min |
Technology acceptance of a social robot (LOVOT) among single older adults in Hong Kong and Singapore: protocol for a multimethod study | Wellbeing | LOVOT | Home | Voice Touch | Animated eyes Verbal Gestures | Elderly, 60–75 | 4–6 weeks |
The effects of social presence and familiarity on children–robot interactions | Socialization | RoBoHoN | Home | Voice | Voice Gestures | Children, 6–12 | 4 days |
Perceptions of intelligence & sentience shape children’s interactions with robot reading companions | Education | NAO Cozmo MiRo | School | Voice, touch | Sound Gestures and body movement | NA | NA |
Deploying a robotic positive psychology coach to improve college students’ psychological wellbeing | Wellbeing | Jibo | College campus Dorm rooms | Camera Sound/voice Touch screen command | Voice | Young, NA | 7 days |
A robotic companion for psychological wellbeing: A long-term investigation of companionship and therapeutic alliance | Wellbeing | Jibo | Home | Camera Sound/voice Touch screen command | Voice | NA | 8 weeks |
Robotic technologies and wellbeing for older adults living at home | Wellbeing | Paro | Home | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 74 | 7 days |
A social robot reading partner for explorative guidance | Education | Jibo | Home | Camera Sound/voice Touch screen command | Voice | NA | NA |
Socially assistive humanoid robots: effects on depression and health-related quality of life among low-income, socially isolated older adults in South Korea | Wellbeing | Hyodol | Home | Pressing ears holding hands | Voice Alarms to mobile app | NA | 3–6 months |
Lighting up wellbeing with Bulb | Wellbeing | Bulb | Home | Eye contact Touch | Sound Movement Different lights emitted | NA | 2 days |
Development of a learning companion robot with adaptive engagement enhancement | Education | Sota | Laboratory facility | Voice | Voice Gestures Head and hand movement | NA | 5 min |
Social robots as eating companions | Socialization | My Keepon | Chocolate testing facility | NA | NA | NA | NA |
Learning a foreign language vocabulary with a companion robot | Education | F-2 robot | Home | Camera Voice | Facial expression Gesture Voice Following user’s sound | Young, NA | NA |
The impact of robotic companion pets on depression and loneliness for older adults with dementia during the COVID-19 pandemic | Wellbeing | Joy for All pet animals | Residential care facility | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 3–6 weeks |
Novelty experience in prolonged interaction: a qualitative study of socially-isolated college students’ in-home use of a robot companion animal | Socialization | Joy for All robot kitten | Home | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 6 weeks |
Understanding factors that shape children’s long term engagement with an in-home learning companion robot | Education | Misty II | Home | Button on the body Reading AprilTags | Voice Display Emotions on display with animations | Children, NA | 4 weeks |
TV-watching robot: toward enriching media experience and activating human communication | Entertainment | CommU | Home | Voice User movement | Voice | NA | 6 h |
Long-term effect of the absence of a companion robot on older adults: A preliminary pilot study | Wellbeing | RoBoHoN | Home | Voice | Voice Gestures | NA | 1–4 months |
Zoomorphic robots and people with disabilities | Disability assistance | Aibo | Home | Voice | Gestures Emotional expressions | NA | NA |
Using robot animal companions in the academic library to mitigate student stress | Wellbeing | JFA pet animals | Library | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 1–10 min |
Pedagogical exploration and technological development of a humanoid robotic system for teaching to and learning in young children | Education | Nao | School | Voice | Voice Gestures Movements LEDs | NA | 5 min |
Affective responses of older adults to the anthropomorphic GenieConnect companion robot during lockdown of the COVID-19 pandemic | Wellbeing | GenieConnect | Aged care facility | Voice Touch Display | Voice Facial expression | Elderly, 80–92 | 4–30 days |
Comparison of in-home robotic companion pet use in South Korea and the United States: a case study | Wellbeing | JFA cat | Home | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | NA | 1 month |
A social robot for anxiety reduction via deep breathing | Wellbeing | Ommie | Wellness center | Touch Voice | Sound Haptics | Young & Middle-aged, 18–38 | 15–20 min |
AMIGUS: A robot companion for students | Wellbeing | AMIGUS | Home | Buttons Voice Gestures | Sound Voice Expressive eyes | Young, 17–28 | NA |
A collaborative elderly care system using a companion robot and a wearable device | Wellbeing | ASCC companion robot | Home | Touch display Voice Camera video feed | Expressive face on screen Sound | NA | NA |
My precious friend: human–robot interactions in home care for socially isolated older adults | Wellbeing | Hyodol | Home | Pressing ears holding hands | Voice Alarms to mobile app | Elderly, 72–93 | 18 months |
‘Let’s read a book together’: a long-term study on the usage of pre-school children with their home companion robot | Education | Luka | Home | Voice Person recognition | Voice Sound Conversation Music Games Expressive eyes | NA | 180 days |
Can the Paro be my Buddy? Meaningful experiences from the perspectives of older adults | Wellbeing | Paro | Nursing home | Touching Patting | Haptic feedback Sound Eyelid movement | Elderly, 68–91 | 8 weeks |
Pandemic effects on social companion robot use in care homes | Socialization | JFA pets | Care homes | Light Touch | Mouth movement with sound Eye and head movements Rollover Heartbeat | Middle-aged, NA | 30–60 min |
An exploratory study on the use of social companion robot for adults with motor disabilities | Disability assistance | Zenbo | Coffee shop | Voice Facial recognition camera | Voice Gestures Facial expression | Young & Middle-aged, 18–50 | NA |
RoboMath: designing a learning companion robot to support children’s numerical skills | Education | Misty II | Home | Button on the body Reading AprilTags | Voice Display Emotions on display with animations | Children, NA | 20–30 min/day, for 5 days |
TV-watching companion robot supported by open-domain chatbot “kACTUS” | Entertainment | CommU | Home | Voice User movement | Voice | NA | 6 h |
BINBOT: a low-cost robot companion that teaches basic concepts through binary questions | Education | Binbot | School | Interactive display Buttons | Display Sound | Children, 10–12 | NA |
Exploring the design space of therapeutic robot companions for children | Rehabilitation | TACO | School | Touch Hug Grab | Lights Haptics | Children, 6–9 | NA |
Edù, a robotic companion in pediatric protective isolation units | Healthcare | Edù | Hospital | Hug Voice | Facial expression Eyes expression Voice Movement Haptic feedback | NA | NA |
Dignity, autonomy, and style of company: dimensions older adults consider for robot companions | Socialization | ElliQ Vector Biscuit | Home | Sound, voice Voice Touch, speech | Head movement, lights, sound, voice, giving suggestions Voice, Movements, dance Body movement, sound | Elderly, 65–89 | NA |
Cite this article
Ahmed, E., Buruk, O. & Hamari, J. Human–Robot Companionship: Current Trends and Future Agenda. Int J of Soc Robotics 16, 1809–1860 (2024). https://doi.org/10.1007/s12369-024-01160-y