The development process started from users’ perspectives (older persons and care professionals), extracting the core issues identified with the previous setup and using various assistive technologies (a monitoring camera and communication robots, in particular). The gemba principle drove this process with frontline care professionals representing the voice of care recipients, particularly those who are not physically mobile enough to reach and press the nurse call button when they need assistance.
Settings and Participants
In Japan, there are different types of care facilities. The facility selected for this study is a special nursing home for older people with 30 residents (25 female, 5 male; \(86.8 \pm 6.8\) years old). From the group of care professionals, 4 senior staff members of the nursing home participated in the design process of the robot and in the data collection; their years of experience ranged from 7 to 20 (mean 11). To ensure objectivity, the analysis was conducted by three researchers who were not involved in the data collection. Ethics approval was granted by the Social Welfare Corporation Tokyo Sacramental Ethics Committee (TS 2019-004). Consent was obtained from each participant and their family. The research was conducted between October 2019 and March 2020.
User-Centered Design and Development in a Nursing Home
Most conventional communication robots have a hard shell, which gives a cold impression to both residents and staff. As suggested by previous research, the sense of touch is a critical element in older adults’ engagement with social robots [5, 6, 16, 24,25,26]. The hard shell also severely restricts physical deployment in the context of nursing homes, as it requires a solid surface, such as a bedside table. Because the physical distance between SARs and care recipients was not adjustable, residents had difficulty hearing and understanding what the SARs were saying. From the staff’s point of view, there had always been the issue of having to go to a resident’s room to see what had happened each time the nurse call button was pressed, particularly during nighttime. It was hoped that a work system aided by ATs could send warning signs or visual information to care professionals wherever they are, so that they could arrive in time before problems such as falls occur. In the past, the ICT environment also had weaknesses, such as time lags via the cloud and malfunctions.
User-centered design dictates that priority be placed on developing a communication robot that meets the needs of residents and staff. The overall concept, therefore, was to create a soft-shell SAR with an input/output device connected to the nurse call button and the monitoring camera. Figure 1 describes the overall concept and the development process.
The system requirements include the capability of (i) remotely monitoring and sending signals to care staff when help is needed; (ii) watching over older people around the bed; (iii) providing older people with a voice-controlled alternative to the nurse call; and (iv) being a small, compact, and audible robot with a soft and warm appearance that can be securely hung over the bed-rail without hindering the rail’s vertical movement.
The first two requirements ((i) and (ii)) were fulfilled by the existing devices, which monitor the person with a bedside infrared camera. In case of emergencies such as falls, alerts were sent to the central nursing station as well as to care professionals on duty. For the third requirement (iii), a voice-controlled alternative was considered at the beginning; however, the idea of developing and installing a two-way communication system in the robot had to be rejected for reasons of cost and time. Instead, the team decided to rely on the existing nurse call system, which was activated when care staff received an alert, so that the care recipient in need of help could communicate verbally. The main focus of this project was therefore narrowed down to fulfilling (iv).
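The alert path for requirements (i) and (ii) can be sketched as follows. This is a minimal illustration only; the class, method, and room names are hypothetical and not taken from the actual system described in this study:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class AlertRouter:
    """Sketch of the alert path: a camera event is fanned out to the
    central nursing station and to staff on duty; staff then activate
    the existing nurse call system for two-way voice contact."""
    station_log: List[str] = field(default_factory=list)
    staff_log: List[str] = field(default_factory=list)
    call_open: bool = False

    def on_camera_event(self, room: str, kind: str) -> None:
        msg = f"{kind} in {room}"
        self.station_log.append(msg)   # central nursing station
        self.staff_log.append(msg)     # devices of care staff on duty

    def staff_acknowledge(self) -> None:
        # Activating the existing nurse call system opens a
        # two-way voice link so the resident can speak to staff.
        self.call_open = True


router = AlertRouter()
router.on_camera_event("room_101", "fall")
router.staff_acknowledge()
```

The point of the sketch is the division of labor: detection and notification reuse the existing camera and nurse call infrastructure, so the robot itself only needs to cover requirement (iv).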
From the basic design to the development of the interface design, an iterative PDCA (plan, do, check, act) cycle was applied (Fig. 2). The processes for the robot (exterior) design and the internal functional design were separated; the latter proceeded with the development of verbal communication functions (e.g., system design and conversation scenarios) and non-verbal aspects.
Much attention was paid to the human–robot interface design, with consideration of its appearance and voice functionality. At each stage, discussions were held between manufacturers and frontline care professionals. As a result, softer plush fabrics were chosen, and the SAR was made more compact yet more stable on the bedrail by using larger paws; it was not feasible, however, to add a variety of voice types. Table 1 shows the changes made to the design and functions of the SAR after the iterative process of consultations.
Development of Conversation Scenarios
One of the key aspects of the SAR design was the set of conversation scenarios, which were developed based on daily care routines (see Table 2). Messages about meals, wake-up, and bedtime (the timing of wake-up and bedtime messages was set individually) were input for all residents. Conversation scenarios about recreation and toileting were tailored for each person, reflecting their lifestyles and daily patterns. In addition, the team enabled the scenario to be adjusted for each person, based on previous recordings captured by the monitoring camera. This was particularly important for conversations taking place during the night, when residents wake up. When the movement of a resident in their room is detected, the signal sent from the monitoring sensor (installed in each room) instructs the SAR’s server to speak; based on the created scenario, recorded voice and synthesized sound are transferred via the SIP server, and the SAR speaks to the resident. For example, the Mon-chan robot asks “What’s happened?” when the resident tries to get up. In response to the resident saying “I want to go to the toilet”, Mon-chan says “Staff will be here in a minute, so please wait a moment.”
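The scenario-triggered utterance selection described above can be sketched in a few lines. This is an illustrative sketch only, assuming a simple event-to-utterance mapping per resident; the function, event, and room names are hypothetical, and the real system involves the monitoring sensor, the SAR’s server, and the SIP server:

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class Scenario:
    """Per-resident conversation scenario keyed by detected event."""
    utterances: Dict[str, str] = field(default_factory=dict)


def handle_motion_event(resident_id: str, event: str,
                        scenarios: Dict[str, Scenario]) -> str:
    """Return the utterance the SAR should speak when the room sensor
    reports `event` for a resident; empty string means stay silent."""
    scenario = scenarios.get(resident_id)
    if scenario and event in scenario.utterances:
        return scenario.utterances[event]
    return ""  # no tailored line for this event


# Tailored scenario mirroring the nighttime example in the text.
scenarios = {
    "room_101": Scenario(utterances={
        "getting_up": "What's happened?",
        "toilet_request": "Staff will be here in a minute, "
                          "so please wait a moment.",
    })
}

print(handle_motion_event("room_101", "getting_up", scenarios))
# → What's happened?
```

The design choice worth noting is that the mapping is per resident, which is what allows the scenarios to be tailored to individual lifestyles and adjusted from camera recordings.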
Mon-chan, the communication robot produced this way (\(10 \times 7 \times 5\) cm), looks like a stuffed animal and contains an I/O device (Table 3, Fig. 3).