1 Introduction

We envision a time when sensor technology and smart healthcare systems are routinely used to capture subtle changes in health and activity that may represent early signs of illness and functional decline. Seniors receive early treatment while health problems are still small and manageable; healthcare becomes more effective and efficient because problems are caught early. Remote family members can easily stay in touch with their aging loved ones and, if the seniors allow access, can see pertinent information about their activity and health. Seniors are better able to manage their own chronic health conditions, with aid from automated technology, care coordinators, physical therapists, and family members. When needed, they use technology to connect with a healthcare professional from the comfort of their homes. As a result, seniors maintain a high level of functional ability, which promotes independence. Thus, aging in place is finally a reality: seniors have real options on where to live, including staying in their own homes.

As described above, our vision includes the operational use of various smart technologies across a range of housing options in urban, suburban, and rural settings. These include in-home sensors embedded in the environment, wearable sensors such as smart watches, sensors embedded in clothes or shoes, and smartphones for those who carry them. Intervention technology includes automation to help maintain good health and/or manage chronic health conditions, such as health change alerts, fall detection alerts, personalized models, tools to support personalized exercise and health coaching, socially assistive technology such as screen agents or robots, and other innovations that might be developed in the future. These capabilities would benefit anyone battling chronic health problems; however, older adults are particularly vulnerable and may lose their independence if their health problems are not managed appropriately.

This vision comes with a variety of challenges, many of which are related to how users interact with the technology. In this paper, we discuss these challenges in the context of our experience with two applications designed to support older adults. Section 2 first discusses background and related work. In Sect. 3, we consider an in-home sensor system used for clinical decision support to recognize very early health changes and manage chronic health conditions. In Sect. 4, we present an interactive remote physical therapy system designed to connect an older adult in the home with a therapist in the clinic. Section 5 then includes a discussion of challenges across these two applications.

2 Background and Related Work

Previous studies have shown the importance of the user interface when designing for consumers such as older adults and their family members. In particular, older adult users bring new challenges with regard to delivering useful information in easy-to-interpret formats. In addition to new models of healthcare that may be unfamiliar to consumers, there are difficulties due to typical aging effects on vision, hearing, and haptic sensing, as well as on attention and working memory [1]. In spite of these challenges, older adults report that they are willing to use technology as long as it provides a worthwhile function and the interface considers their sensory limitations [2].

There have been a number of studies investigating the effectiveness of graphical user interfaces and other new technologies for older adults [3–11]. Demiris et al. have conducted several investigations of the usability of data representation techniques for older adults [3, 5, 11]. In [11], experiments with four focus groups and 31 older participants examined the usability of smart home data visualizations for wellness assessment and the cognitive processing abilities of older adults. Their analysis showed that older adults prefer integrated data visualizations rather than raw smart home data. In addition, the group has conducted studies on the current status and future trends of patient-centered care and observed the growth and significance of informatics systems and electronic health records for patient-centered care models [4, 6].

Other research studies have explored technology acceptance by older adults in different fields [7–10]. In [8], Czaja et al. conducted experiments to determine how well older adults perform when using internet-based health care applications. Results show that older adults who received training performed the tasks with greater accuracy and efficiency. The study also indicates that a simpler interface design is necessary for improving performance.

Among the emerging technologies in healthcare and wellness, telemedicine is one of the fastest and most reliable ways to provide services [12]. A 2008 survey by the American Association of Retired Persons showed that three-fourths of respondents were ready to accept telemedicine as a part of their primary healthcare, for health diagnosis and monitoring [13].

3 In-Home Sensor System

The main function of our in-home sensor system is to recognize early signs of illness and functional decline so that early interventions can be offered when health problems are more manageable [14]. The system includes a collection of bed, motion, and depth sensors installed in the home, a data logger computer in the home, automated alert generation software, a database for storing data on a secure server, and a web interface for viewing the data and alerts [15]. Data are encrypted and sent to the server via Wi-Fi or wired network connection. Falls are detected on the home computer based on depth data; fall alerts are immediately sent to designated individuals [16, 17]. Health alerts are processed once a day on the server and sent to clinical care coordinators, as these do not require immediate attention. Specialized algorithms detect changes in sensor data patterns that might indicate impending health problems which may lead to serious health events if left unattended [15]. With proper interventions, small problems can be addressed before they become major health problems, thus maintaining a high level of functionality and a high quality of life for older adults as they age. The work thus far has relied on the clinical staff to determine whether an intervention is needed. In future work, we plan to investigate the older adults’ use of the system for managing their own chronic health conditions.
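The alert algorithms themselves are described in [15]; purely to illustrate the general idea of flagging a change in a daily sensor pattern against a recent baseline, a daily feature could be screened as sketched below. The function name, z-score approach, and threshold are hypothetical illustrations, not the system's actual method.

```python
from statistics import mean, stdev

def health_alert(history, today, z_threshold=2.0):
    """Flag a health alert when today's daily sensor feature (e.g.,
    nighttime restlessness) deviates strongly from a trailing baseline.
    `history` holds the feature value for each of the previous days.
    Illustrative sketch only, not the deployed algorithm."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return False
    return abs((today - mu) / sigma) > z_threshold

# Example: two weeks of stable values, then a sharp change.
baseline = [5.1, 4.8, 5.0, 5.3, 4.9, 5.2, 5.0,
            4.7, 5.1, 5.0, 4.9, 5.2, 5.1, 4.8]
print(health_alert(baseline, 5.2))  # typical day -> no alert
print(health_alert(baseline, 9.5))  # sharp change -> alert
```

A real system would use richer features and clinically tuned models, but the baseline-comparison idea is the same.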

A web interface includes several graphical visualizations of the sensor data, which were originally developed to support the research team and clinical staff [18, 19]. Figure 1 shows one view of motion and bed sensor data. Passive infrared (PIR) motion sensors are placed in each room to capture activity. Motion events are generated every seven seconds if there is continuous motion in the detection cone. In addition to the bar graphs shown in Fig. 1, motion sensor data are also shown as motion density (number of events from all motion sensors per unit time). An example of a motion density map showing overall activity patterns is shown in Fig. 2. During this period, the older adult slept through the night on most days, woke about 7 am, left the apartment (black regions) for activities and some meals, and went to sleep around 10 pm most days. Several nights show unusual bathroom visit activity (middle section of red bars). Three health alerts were generated over this period, shown as the red vertical bars in Fig. 2. Changes in daily patterns are often vividly illustrated in these activity maps [20, 21].
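To make the density-map construction concrete, the sketch below bins timestamped motion events into (day, time-of-day) cells; each column of the resulting grid corresponds to one day, and higher counts represent more activity. The function name and one-hour bin size are our own assumptions, not the interface's actual implementation.

```python
from collections import defaultdict
from datetime import datetime

def motion_density(events, bin_minutes=60):
    """Bin timestamped motion events into a (day, time-of-day) grid.
    Returns a dict mapping (ISO date, bin index) -> event count,
    mirroring the motion density map idea (illustrative sketch)."""
    grid = defaultdict(int)
    for ts in events:
        day = ts.date().isoformat()
        time_bin = (ts.hour * 60 + ts.minute) // bin_minutes
        grid[(day, time_bin)] += 1
    return dict(grid)

# Hypothetical events: two morning events and one evening event.
events = [
    datetime(2015, 3, 1, 7, 5),
    datetime(2015, 3, 1, 7, 40),
    datetime(2015, 3, 1, 22, 10),
    datetime(2015, 3, 2, 7, 15),
]
density = motion_density(events)
print(density[("2015-03-01", 7)])  # 2 events in the 7 am bin
```

Rendering the grid as a color map (days as columns, midnight at the top) yields a display like Fig. 2.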

Fig. 1.
figure 1

The Health tab of the web interface showing motion data from each room in the older adult's apartment (top) and pulse (in red), respiration (in blue), and restlessness computed from bed sensor data. Buttons on the right toggle options on/off. The bottom slider bar selects a date range. (Color figure online)

Fig. 2.
figure 2

Density tab showing a motion density map (top) and bathroom visit map (middle). Each column represents a day, starting from midnight at the top to 11 pm at the bottom. Density colors shown on the right indicate the level of activity (higher density represents more activity); black is out of the home, e.g., for meal times and other activities. The slider bar on the bottom selects the date range. The red vertical bars below the motion density map and on the bottom show health alert days. Hovering over the alert bar shows the type of alert. (Color figure online)

A depth sensor is used to compute gait parameters and detect falls (initially, the Microsoft Kinect). The depth sensor produces a sequence of images in which each pixel value stores the distance to the nearest object at that pixel. Individuals in the home are automatically segmented and tracked in the depth images. They appear as shadowy three-dimensional silhouettes, which users have found to be acceptable [22]. Detected falls result in immediate fall alerts with a link to a short depth video clip showing the fall [16, 17]. Examples of older adult falls detected in the home can be found at https://www.youtube.com/watch?v=TFB7YOUmHho. The embedded video link allows receivers to confirm that a fall did happen and to see what happened leading up to the fall. Gait parameters are estimated based on purposeful walks that are observed in the home. The height and other gait parameter estimates are used to distinguish a resident from visitors or one resident from another resident so that gait trends can be tracked [23]. The system sends alerts for changes in walking speed, stride time, and stride length [24]. Examples of older adult walks in the home can be found at https://www.youtube.com/watch?v=MF6yZyLuuII. Figure 3 shows the web interface Gait tab, in this case showing declines in gait speed and stride length.
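The actual gait estimation from depth imagery is detailed in [23, 24]. As a simplified sketch only, average walking speed could be derived from a tracked walk's ground-plane positions as follows; the input format and function name are hypothetical, not the system's implementation.

```python
import math

def walking_speed(track):
    """Estimate average walking speed (m/s) from a tracked walk,
    given as a list of (t_seconds, x_meters, y_meters) ground-plane
    positions of the person's centroid. Illustrative sketch only."""
    dist = 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        dist += math.hypot(x1 - x0, y1 - y0)
    duration = track[-1][0] - track[0][0]
    return dist / duration

# A straight 3 m walk taking 3 s averages 1.0 m/s.
walk = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0), (2.0, 2.0, 0.0), (3.0, 3.0, 0.0)]
print(walking_speed(walk))
```

Tracking declines in such estimates over weeks is what drives the gait alerts described above.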

Fig. 3.
figure 3

Gait tab showing walking speed, stride length, and stride time. The graphs show declines in walking speed and stride length.

The in-home sensor systems have been installed in TigerPlace apartments, an aging in place housing site, and in many assisted living apartments elsewhere. For these older adults, the clinical staff receive the alerts as email and use the web interface to determine whether an intervention is warranted. The health alerts include an embedded link to the web interface for that older adult, making it easy for clinical staff to view sensor data around the time of the alert. If further investigation is needed, staff talk to the older adults directly. We have also begun to use the system in private homes. Again, a clinical care coordinator receives the alerts. However, she uses FaceTime as a video conferencing tool to talk with the older adult remotely. In one case, the older adult user had difficulties managing the iPad and FaceTime and had trouble hearing the voice communication. The iPad was set up for her, with the Wi-Fi connection and applications. The older adult was trained in using the iPad and FaceTime but still had difficulties until she went to visit her children and was able to spend some concentrated, repetitive time practicing. After this visit, she was able to successfully have FaceTime calls. The sound is still a little troublesome for her, but she is able to make do. She seems motivated to learn and use FaceTime because she knows that she can communicate with her family with this technology. Without this motivation, she might not have wanted to learn the interface.

We have also begun to show the sensor data visualizations to the older adults being monitored and their family members as an initial step towards developing a more appropriate interface for consumers. As a start, the older adults and family members have seen motion data and depth videos of falls. Anecdotally, several older adults have shown little interest in seeing the data. The general feeling is that they are too old to bother with it, and they already have clinicians who are looking after them. A couple have expressed interest in seeing more motion data. They are all impressed by the fall sensing technology; the fall depth videos provide the easiest output for them to interpret, as the silhouettes rendered from the depth videos look like people. The floor plane is segmented and colored to help interpret the depth videos; segmented people are also colored in distinct colors. For viewing motion sensor data, older adults prefer the 24-hour motion density maps because they can clearly see their pattern of daily activity, and the bathroom density maps showing bathroom visits. The histogram graph of the motion sensor data is difficult for them to interpret, as the y axis metric (number of motion sensor events) is not meaningful.

Fig. 4.
figure 4

Interactive remote physical therapy system (a) Therapist interface for the clinic; (b) Home interface

4 Interactive Remote Physical Therapy System

The remote physical therapy (PT) system was developed to augment in-clinic time by connecting a therapist in the clinic with a PT client in the home for remote quantitative assessments [12]. The system has two different interfaces, one for the PT client in the home and another for the therapist; they are connected through a high-speed network connection. The interfaces were developed with iterative feedback provided by a therapist and five older adult participants acting as PT clients. Figure 4 shows the final interfaces, which use the Microsoft Kinect for video conferencing as well as for quantitative assessments. The therapist's interface has more detailed real-time and post-therapy data, whereas on the home side, less information is displayed. A challenge was deciding how much information to present to be productive while keeping the users focused on their own tasks.

For this work, two tasks were chosen that can both assess fall risk and serve as exercises to reduce it: (1) single leg stance, and (2) tandem walk. Figure 4 shows a single leg stance. In a tandem walk, the user walks in a straight line with the feet touching heel to toe. Both exercises test balance, which is measured quantitatively as body sway using the Kinect depth camera. Multiple trials are supported, each lasting 20 s. A (green) timer bar on the interface shows the time left in a trial.

Both the therapist and home interfaces have two video feedback elements, showing the local and remote views. In the therapist's interface, the remote video element is comparatively larger to help the therapist evaluate the PT client's performance more effectively. In the home interface, both video elements are the same size. To perform an activity trial, the PT client has to carefully observe the therapist's movements and repeat them in each trial. On the bottom right of both interfaces, a real-time skeletal model of the PT client is provided to visually confirm correct skeletal data. Both interfaces also have a sway indicator on the top right corner of the PT client video window. This displays the degree of sway in real time, in the anteroposterior (AP) and mediolateral (ML) directions, represented as green, yellow, and red to show the amount of sway. Green represents a safe zone, whereas red represents a fall potential. A real-time joint alignment module is included only in the therapist's interface, to provide feedback on the detailed posture of the client. Network strength is also shown to the therapist, to ensure network performance during remote communications. At the conclusion of the trials, the therapist can view traces of the body sway (Fig. 5) and see quantitative measurements of maximum sway in the AP and ML directions. These were not shown to the PT clients at home but could be used as feedback for them to track progress.
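To make the sway indicator concrete, the sketch below shows one plausible way to compute AP/ML sway from trunk-joint positions and map it to the green/yellow/red zones. The thresholds, units, and function names are illustrative assumptions, not the system's actual values.

```python
def ap_ml_sway(positions, current):
    """Sway of the current trunk-joint position relative to the mean
    of `positions`, a list of (x, z) coordinates: ML along x (side to
    side), AP along z (toward/away from the camera). Illustrative."""
    mean_x = sum(p[0] for p in positions) / len(positions)
    mean_z = sum(p[1] for p in positions) / len(positions)
    ml = abs(current[0] - mean_x)
    ap = abs(current[1] - mean_z)
    return ap, ml

def sway_zone(sway_cm, yellow=2.0, red=4.0):
    """Map a sway magnitude (cm) to the indicator colors described
    above; thresholds are hypothetical, not clinically validated."""
    if sway_cm < yellow:
        return "green"   # safe zone
    if sway_cm < red:
        return "yellow"
    return "red"         # fall potential

ap, ml = ap_ml_sway([(0.0, 0.0), (2.0, 0.0)], (1.0, 3.0))
print(sway_zone(ap), sway_zone(ml))
```

The maximum AP and ML values over a 20 s trial would then give the post-trial measurements the therapist reviews.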

Fig. 5.
figure 5

Traces showing body sway (a) Single leg stance trials, (b) tandem walk trials

The hardware platform used for both the therapist and the home consisted of a 32-inch television mounted on a 5-foot high stand with a computer mounted on the side. The Kinect was positioned directly below and in front of the television. This large monitor allowed easy viewing at the distances required to perform the exercises. A keyboard and mouse were required at the beginning of each session on both ends to set up the network connection. After this was established, video conferencing was activated to allow voice communication. A researcher brought the equipment into the home for each session and set up the network connection to begin the session; we did not expect the older adult user to manage this step. In addition, voice commands were implemented for the therapist to allow easy control through the different exercises and trials without requiring a keyboard or mouse.

The system was tested with five independent older adults located in Kansas City, MO. The Google Fiber network was used to connect the homes with a physical therapist located on the University of Missouri campus in Columbia, MO. Survey results were collected from the older adults in the home and from the therapist. The final session showed very high ratings from the therapist for video and audio quality, at 4.7/5; the older adult participants rated the video and audio quality at 4.0/5. The average user satisfaction rating from the therapist was 4.8/5, whereas the older adult participants rated satisfaction at 3.7/5. Lower scores were given by the older adults in the categories of (1) using the system with confidence, (2) system availability, and (3) ease of use. These lower ratings were most likely due to the complicated network set-up required at this stage of development. An average rating of 4.25/5 was given by participants for (1) accuracy of remote movements on the interface and (2) the older adult's ability to follow the remote activity demonstration. Minimal training time on the interface was apparently sufficient for the older adults to interpret the information on the PT interface. If they did experience problems, the therapist was there to answer questions and provide guidance.

5 Discussion

Although the in-home sensor system and the remote PT system represent different types of systems with much different functionalities, there are common HCI research challenges for both. We consider HCI challenges in the following areas:

  • Turning the system on. How should the system operation be initiated?

  • Methods for controlling operation and navigating through an interface. Hardware platforms are considered here also, as some control methods are dependent on the platform.

  • Output displays. This includes both the format and the level of detail displayed.

Other challenge areas include training (on how to use the system and how to act on the information displayed by the system) and personalization (should the system adapt to the user and will this make it easier to use or harder as it changes?). Although not considered here, these too should be investigated for older adult users.

The challenge of how to turn the system on was not addressed for older adults in either the in-home sensor interface or the remote PT interface. Thus far, we have avoided this issue by having a researcher start the system and show it to the older adult user. However, it will need to be addressed for a system to be accepted and used. Two main options are to require the user to initiate the system, e.g., by pressing a button, or to have the system automatically prompt the user. This was shown to be a challenge for one user even in the seemingly simple use of FaceTime, when the call was initiated by the care coordinator. The fact that more training solved the problem is consistent with the study in [8]. Researchers and clinical staff who currently use the web interface often open the system by clicking on a link in an email. However, for older adults who do not use email, this will be yet another barrier to using the system.

The challenge of what control methods to use is somewhat tied to the hardware platform. A tablet supports touch. If large buttons and large type are used, this could provide an effective control method as long as the choices are limited. Likewise, keyboard and mouse can be effective, especially for users already familiar with them, as long as the interface is not too busy. The current sensor data interface has many tabs, options, buttons, and charts that could easily overwhelm an older adult wanting to view their own health related information. A simpler control interface will be necessary for consumers, both older adults and their family members. The remote PT system did not require control by the older adult user; the system was driven completely by the therapist. Because the system was built on synchronous video conferencing, the older adult could talk to the therapist directly if there were any problems with the system.

Speech is another option for application control. In studies investigating interfaces to robots, older adults have stated their preference for speech [25]. However, speech recognition for older adults can be problematic if the recognition is not reliable. One study showed a 10% reduction in recognition rates for older adults compared to younger adults [26]. The speech interface for the remote PT system worked very well for the therapist but was not tried with the older adult users. Each command to the PT system began with "Mizzou Steps" so the system was not confused by the normal conversation between the therapist and the client in the home. A similar strategy could be used by the older adult in the home for other applications. For example, the user might provide a name for the system to support a command such as "Buddy, show me my gait for the last week." If it works reliably, this could be an effective control method.
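The wake-phrase idea can be sketched as a simple filter over recognized transcripts: utterances that do not begin with the phrase are treated as ordinary conversation and ignored. This is an illustrative reconstruction, not the PT system's actual recognizer logic.

```python
WAKE_PHRASE = "mizzou steps"

def parse_command(transcript):
    """Return the command following the wake phrase, or None if the
    utterance is ordinary conversation. Mirrors the idea of prefixing
    every command with "Mizzou Steps" (illustrative sketch)."""
    text = transcript.strip().lower()
    if not text.startswith(WAKE_PHRASE):
        return None
    command = text[len(WAKE_PHRASE):].strip()
    return command or None

print(parse_command("Mizzou Steps start trial"))  # "start trial"
print(parse_command("How was your weekend?"))     # None
```

The same pattern would support a user-chosen name such as "Buddy" in a home application.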

The third challenge is on how to display the output, that is, what format should be used and at what level of detail. In the remote PT system, we obtained feedback from users to determine how much detail to include. It was clear that the older adult users required less detail than the therapist. More investigation is needed to determine whether we really got it right. For the sensor data web interface, the current configuration offers too much detail for consumers. Many of the options and charts have been included to help ensure the system is operating properly. Viewing these charts gives the research team confidence in trusting the data, but many of them would likely confuse consumers. Our initial investigation with older adults was consistent with the study in [11]; older adult users preferred the motion density visualizations over the histograms of the raw motion sensor data.

Other format types are also available. Our research team has been exploring sensor data summaries in the form of textual descriptions. For example, a summary of bed sensor data could be “The bed restlessness tonight is a lot higher than most of the nights in the past two weeks” [27, 28]. This could be used as a text message sent to a family member or it could be used as an explanation of a graph, e.g., as part of training in how to use the system. It might also be used to start an automated discussion with the older adult to investigate further. For example, this could initiate a string of questions to find out if the user is in pain or is having problems with a chronic health condition. In any case, further study is needed to determine the appropriate level of detail in the message and the best time period (day, week, month, etc.).
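A minimal sketch of such a template-based summary is shown below, assuming a single nightly restlessness score and illustrative phrasing thresholds; the actual linguistic summarization approach of [27, 28] is more sophisticated than this.

```python
from statistics import mean

def restlessness_summary(history, tonight):
    """Generate a plain-language summary comparing tonight's bed
    restlessness to the past two weeks, in the spirit of the example
    above. Thresholds and phrasing are illustrative assumptions."""
    baseline = mean(history)
    if baseline == 0:
        return "No baseline restlessness data is available."
    ratio = tonight / baseline
    if ratio > 1.5:
        level = "a lot higher than"
    elif ratio > 1.15:
        level = "somewhat higher than"
    elif ratio < 0.85:
        level = "lower than"
    else:
        level = "about the same as"
    return (f"The bed restlessness tonight is {level} "
            f"most of the nights in the past two weeks.")

# Two weeks of typical scores, then a restless night.
history = [3.0, 2.8, 3.2, 3.1, 2.9, 3.0, 3.1,
           2.8, 3.0, 3.2, 2.9, 3.1, 3.0, 2.9]
print(restlessness_summary(history, 5.5))
```

Such a message could be sent as a text to a family member or used to open an automated follow-up dialogue with the older adult.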

In summary, we have reviewed two applications designed to help older adults age in place and have discussed HCI challenges with respect to the older adult user. Addressing these challenges will provide new opportunities to engage older adults with technologies that can help them better manage their own chronic health conditions, address health problems, maintain function, and ultimately retain independence, which is so crucial for aging in place. Getting the interface right will be essential to achieving this vision.