
1 Introduction

The increase in average lifespan across the world has been accompanied by an unprecedented upsurge in the occurrence of dementia, with high socio-economic costs that reached 818 billion US dollars worldwide in 2015. Moreover, its prevalence keeps rising: the number of people aged 65 and older with Alzheimer's disease may nearly triple by 2050, from 46.8 million to 131 million people around the world, the majority of whom will be living in an institution [1].

Assistive technologies could enhance clinicians' diagnosis and decision making in order to meet individual needs, and could also serve as an objective measure of a patient's cognitive status and disease progression. Furthermore, assistive technology is expected to play a critical role in improving patients' quality of life, on both a cognitive and a physical level, while reducing costs. A drawback of current health services is that they often evaluate single needs (e.g. pharmacological treatment) or detect problems solely via interviews, leading to generic interventions by clinicians. In contrast, remote monitoring of patients at home is a promising "patient-centered" management approach that provides specific and reliable data, enabling clinicians to follow patients' daily function and provide adaptive and personalized interventions.

Towards this direction, we propose Dem@Home, a holistic approach for context-aware monitoring and personalized care of dementia at home, prolonging independent living. To begin with, the system integrates a wide range of sensor modalities and high-level analytics, based on a service-oriented middleware [2], to support accurate monitoring of all aspects of daily life, including physical activity, sleep and activities of daily living (ADLs). After integrating them in a uniform knowledge representation format, Dem@Home employs semantic interpretation techniques to recognize complex activities from atomic events and to highlight clinical problems. Specifically, it follows a hybrid reasoning scheme, using DL reasoning for activity detection and SPARQL queries for clinical problem extraction. Ultimately, Dem@Home presents the derived information to applications tailored to clinicians and patients, endorsing technology-aided clinical interventions to improve care. Dem@Home has been deployed and evaluated in four home pilots, showing optimistic results with respect to accurate fusion and activity detection, as well as clinical value in care.

The rest of the paper is structured as follows: Sect. 2 presents relevant work, while Sect. 3 gives an overview of the framework. Sections 4 and 5 elaborate on data analytics, presenting the activity recognition and problem detection capabilities of Dem@Home, respectively. Section 6 describes the GUIs supported by the framework to provide feedback to clinical experts or patients, Sect. 7 presents the evaluation results and Sect. 8 concludes the paper.

2 Related Work

Pervasive technology solutions have already been employed in several ambient environments, either homes or clinics, but most of them focus on a single monitoring domain, using only one or a few devices. Such applications include wandering behavior prevention with geolocation devices, and the monitoring of physical activity, sleep, medication and performance in daily chores [3, 4].

In order to assess cognitive state, activity modelling and recognition appears to be a critical task, common amongst existing assistive technologies. OWL has been widely used for modelling human activity semantics, reducing complex activity definitions to the intersection of their constituent parts. In most cases, activity recognition involves the segmentation of data into snapshots of atomic events, fed to the ontology reasoner for classification. Common approaches for segmentation include time windows [5] and slices [6], which provide background knowledge about the order or duration [7] of activities. In this paradigm, ontologies are used to model domain information, whereas rules, widely embraced to compensate for OWL's expressive limitations, aggregate activities by describing the conditions that drive the derivation of complex activities, e.g. temporal relations.

Focusing on clinical care through sensing, the work in [8] deployed infrared motion sensors in clinics to monitor sleep disturbances, limited, though, to a single sensor. Similarly, the work in [9] presents a sensor network deployment in nursing homes in Taiwan that continuously monitors patients' vital signs using web-based technologies, verifying the system's accuracy, acceptance and usefulness. Nevertheless, it so far lacks the ability to fuse further sensor modalities, such as sleep and ambient sensing, and offers limited interoperability.

Other solutions involve smart home deployments of environmental sensors to observe and assess the activities of elderly and disabled people [10, 11]. The work in [12] monitors residents' physical activity and vital signs by using wearable sensors, door sensors to measure presence and "fully automated biomedical devices" in the bathroom, while the system presented in [13] provides security monitoring, with actuators to control doors, windows and curtains; however, none of the above records sleep. On the other hand, Dem@Home offers a unified view of many life aspects, including sleep and activities, to automatically assess disturbances and their causes, aiding clinical monitoring and interventions.

3 The Dem@Home Framework

Dem@Home proposes a multidisciplinary approach that combines the latest advances in sensor technologies addressing a multitude of complementary modalities, large-scale fusion and mining, knowledge representation and intelligent decision-making support. In detail, as depicted in Fig. 1, the framework integrates several heterogeneous sensing modalities, such as physical activity and sleep sensor measurements, combined with input from lifestyle sensors and higher-level image analytics, providing their unified semantic representation and interpretation.

Fig. 1. Dem@Home architecture, sensors and clinical applications

The current selection of sensors comprises proprietary, low-cost, ambient or wearable devices, originally intended for lifestyle monitoring and repurposed to a medical context. Ambient depth cameras collect both image and depth data. Plug sensors are attached to electronic devices, e.g. cooking appliances, to collect power consumption data. Tags are attached to objects of interest, e.g. a drug box or a watering can, capturing motion events, while Presence sensors are modified Tags that detect people's presence in a room via IR motion. A wearable Wristwatch measures physical activity levels in terms of steps, and a pressure-based Sleep sensor is placed underneath the mattress to record sleep duration and interruptions.

Each device is integrated through a dedicated module that wraps its API, retrieves data and processes them to generate atomic events from sensor observations, e.g. through aggregation. In the case of image data, computer vision techniques are employed to extract information about humans performing activities, such as opening the fridge, holding a cup or drinking [14]. All atomic events and observations are mapped to a uniform semantic representation for interoperability and stored in the system's Knowledge Base. Dem@Home then applies further semantic analysis, activity recognition and detection of problems, i.e. anomalies, and the derived information can be used by domain-specific applications offering a tailored view to different types of users.

4 Activity Recognition

To obtain a more comprehensive picture of an individual's condition and its progression, and to drive clinical interventions, Dem@Home employs semantic interpretation to perform intelligent fusion and aggregation of atomic sensor events into complex ones and to identify problematic situations, using a hybrid combination of OWL 2 reasoning and SPARQL queries.

Dem@Home provides a simple pattern for modelling the context of complex activities. First of all, sensor observations, including location, posture, object movement and actions, are integrated with complex activities in a uniform model, as types of events, extending the leo:Event class of LODE (Fig. 1). The agents of the events and the temporal context are captured using constructs from DUL and OWL Time, respectively.
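To make the representation more tangible, the sketch below shows, as a SPARQL update, how a single atomic observation could be asserted in the Knowledge Base under this model. It is an illustration rather than an excerpt from the actual Dem@Home ontology: the dem: namespace, the dem:TurnKettleOn class, the dem:observedBy property and the individual names are hypothetical, and the linking properties for agents and time (dul:involvesAgent from DUL and leo:atTime from LODE) are plausible choices rather than necessarily the ones used by the system.

PREFIX leo:  <http://linkedevents.org/ontology/>
PREFIX dul:  <http://www.ontologydesignpatterns.org/ont/dul/DUL.owl#>
PREFIX time: <http://www.w3.org/2006/time#>
PREFIX xsd:  <http://www.w3.org/2001/XMLSchema#>
PREFIX dem:  <http://example.org/demhome#>   # hypothetical application namespace

INSERT DATA {
  # An atomic observation from the kettle plug sensor, modelled as an
  # event (a subclass of leo:Event) with an agent and a temporal extent.
  dem:obs_0423 a dem:TurnKettleOn , leo:Event ;
      dem:observedBy    dem:KitchenPlugSensor ;   # hypothetical property
      dul:involvesAgent dem:participant01 ;
      leo:atTime [
          a time:Interval ;
          time:hasBeginning [ a time:Instant ;
                              time:inXSDDateTime "2015-07-14T08:02:11"^^xsd:dateTime ] ;
          time:hasEnd       [ a time:Instant ;
                              time:inXSDDateTime "2015-07-14T08:02:15"^^xsd:dateTime ]
      ] .
}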

Each activity context is described through class equivalence axioms that link it with lower-level observations of the domain models (Fig. 1). Instantiations of this pattern, which link each ADL with its constituent observations through context containment relations, are used by the underlying reasoner to classify context instances generated during the execution of the protocol as complex activities. For example, given that the activity PrepareTea involves the observations TurnKettleOn, CupMoved, KettleMoved, TeaBagMoved and TurnKettleOff, its semantics are defined as:

$$ \begin{aligned} PrepareTea \equiv {} & Context \sqcap \exists contains.TurnKettleOn \sqcap \exists contains.CupMoved \\ & \sqcap \exists contains.KettleMoved \sqcap \exists contains.TeaBagMoved \\ & \sqcap \exists contains.TurnKettleOff \end{aligned} $$

Other examples in our use case scenario are:

$$ \begin{aligned} Cooking \equiv {} & Context \sqcap \exists contains.TurnCookerOn \\ & \sqcap \exists contains.KitchenPresence \end{aligned} $$
$$ \begin{aligned} PrepareDrugBox \equiv {} & Context \sqcap \exists contains.DrugBoxMoved \\ & \sqcap \exists contains.DrugCabinetMoved \\ & \sqcap \exists contains.KitchenPresence \end{aligned} $$
$$ \begin{aligned} WatchTV \equiv {} & Context \sqcap \exists contains.TurnTvOn \\ & \sqcap \exists contains.RemoteControlMoved \\ & \sqcap \exists contains.LivingRoomPresence \end{aligned} $$
$$ \begin{aligned} BathroomVisit \equiv {} & Context \sqcap \exists contains.BathroomPresence \\ & \sqcap \exists contains.TurnBathroomLightsOn \end{aligned} $$

5 Problem Detection

According to the clinical experts involved in the development of Dem@Home, highlighting problematic situations next to the entire set of monitored activities and metrics would further facilitate and accelerate clinical assessment. Dem@Home uses a set of predefined rules, expressed in SPARQL, with numerical thresholds that clinicians can adjust through a GUI and personalize for each individual in their care. This way, thresholds can be attached to each participant or even to a particular period.

Furthermore, analysis is invoked for a period of time, allowing different thresholds for different intervals, e.g. before and after a clinical intervention. Problematic situations supported so far concern night sleep (short duration, many interruptions, taking too long to fall asleep), physical activity (low daily activity totals), missed activities (e.g. skipping daily lunch) and recurring problems (problems persisting over consecutive days). An instance of a short sleep duration problem in SPARQL is given in Fig. 2.

Fig. 2. SPARQL problem definition for short sleep duration
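As the figure itself is not reproduced here, the following query is a minimal sketch of how such a problem definition could be expressed, assuming a hypothetical dem: vocabulary in which each night is summarized with its total sleep duration in minutes and the clinician-adjusted threshold is stored per participant:

PREFIX dem: <http://example.org/demhome#>   # hypothetical namespace

# Flag every night whose total sleep duration falls below the
# threshold configured by the clinician for that participant.
SELECT ?participant ?night ?duration
WHERE {
  ?night a dem:NightSleepSummary ;
         dem:forParticipant    ?participant ;
         dem:totalSleepMinutes ?duration .
  ?participant dem:shortSleepThreshold ?threshold .
  FILTER (?duration < ?threshold)
}

In the actual system, matched nights could additionally be materialized as problem instances so that they surface in the problems section of the clinician interface.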

6 End-User Assessment Application

At the application level, Dem@Home provides a multitude of user interfaces to assist both clinical staff, by summarizing an individual's performance and highlighting abnormal situations, and patients, by offering a simplified view of measurements and educational material.

The clinician interface offers four different approaches to monitoring a patient, including Summary, Comparison and All Observations views, as well as four options for the time extent of the data, i.e. One-Day, Per Day, Per Week and Per Month. In One-Day Summary, sleep measurements are obtained from a single night and are categorized as Total Time in Bed but Awake, Total Time Shallow Sleep, Total Time Deep Sleep, Total Time Asleep, Number of Interruptions and Sleep Latency (Fig. 3). In Summary Per Day, the clinician can select a time interval between two dates, or a single date, to observe sleep stages, physical activity levels and other activities of daily living, derived from power consumption, moved objects and presence in rooms (Fig. 4).

Fig. 3. The clinician interface regarding sleep parameters, in one-day summary session.

Fig. 4. Sleep, daily activities and problems in summary per day session.

Moreover, the clinician can set specific thresholds for sleeping problems during the night, in which case a problems section is added (bottom of Fig. 4). In Comparison Per Day, different measurements of a particular time period can be combined in the same chart, allowing the clinician to check how observations affect each other, e.g. how physical activity affects sleep or how the usage of a device affects a daily activity (Fig. 5). Finally, the Comparison screen shows a scatterplot of two types of measurements, while All Observations shows all collected data in detail. The Per Week/Month options offer the above-mentioned functionalities summarized per week/month.

Fig. 5. Comparison Per Day chart between two activities

On the other hand, patients are introduced to an alternative interface, tailored to provide easy monitoring of their daily life and simple interaction with the clinicians. Accessed through a tablet device, it displays a limited view of the most important measurements, to avoid overwhelming the users or even stressing them out. The patient interface presents 3-day information regarding Physical Activity (daily steps and burned calories), Sleep, Usage of Appliances and Medication. In the Sleep section in particular, the patient is notified about how many sleep interruptions they had during the night. In addition to sensor readings, the patient interface is enhanced with educational material, such as recipes or step-by-step instructions for performing routine tasks, and the ability to exchange messages between end-users and clinicians. An example screen of what is shown to end-users can be seen in Fig. 6. Namely, end-users see a digested view of some metrics, presented in a manner that generates only positive thinking, avoiding burdening them with problems. Overall, the application is explicitly designed to help patients feel confident and secure with the system they are using, but also to encourage social interaction between users and clinicians.

Fig. 6. The patient interface, including message inbox (top), chart with sleep metrics for the last 3 days (middle) and alert notification about number of interruptions (bottom).

7 Evaluation

Dem@Home was evaluated in four home installations, in the residences of individuals living alone who were clinically diagnosed with mild cognitive impairment or mild dementia; each installation was maintained for four months. The sensors and the relevant home areas or devices of each installation (Table 1) were selected after a visit by the clinician to the participants. The majority of deployed sensors covered the kitchen, bathroom and bedroom, since these rooms are strongly linked with most daily activities.

Table 1. Sensors in home installation

Since the framework embodies an interdisciplinary approach, it was evaluated from both a research and a clinical perspective. Firstly, we evaluate the effectiveness of activity recognition through the fusion of sensor data and existing multimedia analytics. Secondly, we examine the clinical results, which vary across participants but add significant value to monitoring and interventions.

For the evaluation of the ontology-based fusion and activity recognition capabilities of Dem@Home, ground truth was obtained through annotation (performed once), based on images from the ambient cameras. We use precision and recall to evaluate performance with respect to ADLs recognized as performed. The clinical expert suggested the monitoring of five activities, namely drug box preparation, cooking, tea preparation, watching TV and bathroom visits. Table 2 depicts the pertinent context dependency models defined.
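For reference, precision and recall are computed per activity against the annotated ground truth: a detected activity instance that matches an annotation counts as a true positive (TP), a detection without a matching annotation as a false positive (FP), and an annotated activity that the system missed as a false negative (FN), giving

$$ Precision = \frac{TP}{TP + FP}, \qquad Recall = \frac{TP}{TP + FN} $$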

Table 2. Context dependency models for the evaluation

Dem@Home's ADL activity recognition performance has been evaluated on a dataset of 31 days, in July 2015. As observed in Table 3, the more atomic and continuous an activity is, the more accurate its detection. BathroomVisit, the most accurately detected activity, is never interleaved with other activities. On the contrary, cooking is a long-lasting activity interrupted by instances of other events (e.g. watching TV) and influenced by uncertainty and the openness of the environment. WatchTV and PrepareTea are fairly short in duration, involving less uncertainty and fewer interleaved events in between, which yields decent precision and recall rates. Therefore, the fact that the worst performance appears for the Cooking activity can be attributed to its being a highly interleaved and long-lasting activity, as opposed to the others.

Table 3. Precision and recall for activity recognition

On the other hand, the clinical evaluation of the framework concerns its capabilities and the fulfilment of clinical requirements. With Dem@Home supporting clinical interventions, significant improvement was found in the post-pilot clinical assessment in multiple domains, such as physical condition and sleep quality, ultimately bringing about positive change in mood and cognitive state, measured objectively by neuropsychological tests. In detail, the first participant overcame insomnia, lack of exercise and the neglect of daily chores. The second participant showed improvement in sleep and mood, while the other two users benefited with respect to sleep and medication. More elaborate details on the clinical value and outcome of the experiment are ongoing work.

8 Conclusion and Future Work

Dem@Home is an ambient assisted living framework integrating a variety of sensors, analytics and semantic interpretation, with a special focus on ambient dementia care. New, affordable sensors have been integrated seamlessly into the framework, along with a set of processing components ranging from sensor to image analytics. All knowledge is semantically interpreted for further fusion and detection of problematic behaviours, while tailored user interfaces support detailed monitoring and adaptive interventions. Evaluation of the framework has yielded valuable and optimistic results with respect to accurate fusion and activity detection, as well as clinical value in care.

Regarding future directions, Dem@Home could be extended for increased portability and ease of installation. Specifically, establishing an open-source, IoT-enabled semantic platform, following the latest advances in board computing, would allow the platform to be easily deployed in multiple locations. Combined with the infrastructure to push events to the cloud, the framework could constitute a powerful platform for telemedicine and mobile health, combining sensors with sophisticated ambient intelligence techniques such as computer vision.