1 Introduction

Medical staff have to manage complex situations [1] which involve: working with professionals from different fields; reaching agreement; performing multiple tasks; dealing with frequent contingencies that require constant adjustment of their actions because of the dynamic nature of their work [2]; and keeping in mind several pending tasks [3].

These conditions can be improved through context-aware [4] and ubiquitous [5] systems. Users can work without thinking about the system, which is responsible for adapting itself to the existing conditions at any time and under any circumstance. Generally speaking, context-aware systems adapt themselves to offer the functionality and information that users need based on the context. However, current technological possibilities provide the opportunity to take a step further. The context offers valuable information, such as the physical capabilities of the user, which can be used to determine the appropriate interaction mechanism. In this sense, the distribution of multiple interaction methods across the working environment represents a good opportunity to improve context-awareness.

The distribution of interaction mechanisms is directly related to multi-device environments. The possibility of interacting with multiple devices not only offers more ways of interacting, but also helps to further improve context-aware capabilities. Users can interact with specific devices according to their needs, such as the mobility conditions found in healthcare [6]. At the same time, a multi-device environment adds a new possibility of improvement for context-aware systems: the distribution of user interfaces. As users are provided with a wide range of devices, the system can adapt the interface to the hardware and also to the users’ context.

Taking all the above into account, the authors have developed a ubiquitous and context-aware comprehensive solution for deployment in healthcare centres, named Ubi4Health [7]. The solution supports many functions, from task management, an alert notification mechanism, and fall and fainting detection to a rehabilitation assistant. This paper describes how Ubi4Health generates a novel context-aware environment for users (employees, patients and residents). The main contribution of this work is the use of distributed interaction mechanisms, a multi-device environment and appropriate user interfaces to adapt the system to a user’s context. These elements provide users with an experience that is adapted to their needs at the information and functionality level, and also in the way they interact with the system. The related works studied [6, 8–10] offer solutions to specific fields, as Ubi4Health does within its modules. However, none of them considers context-awareness as more than adapting data and functionality capabilities.

The following section summarizes the way in which Ubi4Health improves healthcare conditions thanks to its support for a novel form of context-awareness. Finally, we present our conclusions and future work.

2 Ubi4Health: Improving and Making Progress with Traditional Context-Aware Systems

Ubi4Health is a novel, context-aware and ubiquitous comprehensive solution to enhance working conditions in healthcare environments. The solution has been developed modularly, and each module is related to a specific domain (Fig. 1): (1) a tasks module to manage tasks (alerts are tasks with a high priority); (2) a rehabilitation module to improve specific rehabilitation processes; and (3) the KFF (Kinect for Falls and Fainting) system to automatically detect anomalous situations in the environment.
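As a minimal illustration of the relationship between tasks and alerts described above (a sketch under our own assumptions, not the actual Ubi4Health implementation), tasks could be modelled as follows, with alerts simply being tasks of the highest priority:

```python
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List

class Priority(IntEnum):
    LOW = 0
    NORMAL = 1
    HIGH = 2  # alerts are modelled as tasks with the highest priority

@dataclass
class Task:
    description: str
    assignee: str
    priority: Priority = Priority.NORMAL
    done: bool = False

    @property
    def is_alert(self) -> bool:
        return self.priority == Priority.HIGH

@dataclass
class TaskQueue:
    tasks: List[Task] = field(default_factory=list)

    def pending(self) -> List[Task]:
        # pending work ordered so that alerts come first
        return sorted((t for t in self.tasks if not t.done),
                      key=lambda t: -t.priority)
```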

Fig. 1. Interfaces and interaction mechanisms of Ubi4Health

The context-awareness in Ubi4Health represents a significant contribution. The system offers functionality and information based on a user’s context. However, the objective has been to go a step further. To that end, three features and capabilities are used together: distributed multimodality, a multi-device environment and the distribution of user interfaces.

Firstly, Ubi4Health has a complete set of interaction mechanisms distributed over the healthcare environment (Fig. 1). On the one hand, the solution allows users to interact with traditional mechanisms (keyboard, mouse and touchscreen). These offer the possibility to work with the tasks and rehabilitation modules regardless of the circumstances. However, there are situations in which it is necessary to make use of other interaction methods to enhance the user experience. In the tasks module there is a role named operator. This role has to control the completion of tasks, adequate staff distribution and the assignment of alerts. As a result, the amount of information to be managed is very large. Therefore, Ubi4Health incorporates the user’s gaze as an interaction mechanism for the operator. In this way, the system knows where the user is looking and can identify loss of information on unattended displays. The system adapts its behaviour and appearance to the user’s context as it generates notifications to assist the perception of information [11].
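A minimal sketch of this unattended-display logic is shown below. The class, threshold and method names are illustrative assumptions, not taken from the Ubi4Health implementation: a display is flagged when new information has arrived on it and the operator’s gaze has not rested on it for longer than a given time.

```python
import time

UNATTENDED_SECONDS = 10.0  # assumed threshold for illustration

class GazeMonitor:
    def __init__(self, display_ids):
        now = time.time()
        self.last_gazed = {d: now for d in display_ids}
        self.pending_updates = {d: 0 for d in display_ids}

    def on_gaze(self, display_id):
        # the operator is looking at this display: record it and clear its backlog
        self.last_gazed[display_id] = time.time()
        self.pending_updates[display_id] = 0

    def on_new_information(self, display_id):
        # new content appeared on a display, whether or not it is being watched
        self.pending_updates[display_id] += 1

    def unattended_displays(self):
        # displays with unseen updates that have not been looked at recently
        now = time.time()
        return [d for d, t in self.last_gazed.items()
                if self.pending_updates[d] > 0 and now - t > UNATTENDED_SECONDS]
```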

Movement-based interaction is an essential element. In the rehabilitation module, this interaction allows patients to complete their therapies at home. The system analyses a patient’s movements and automatically guides and monitors the session. This mechanism also offers medical staff the possibility of defining the exercises to be performed by patients. In particular, medical staff must generate therapies, defined as repetitions of postures. The postures can be created through a 3D designer, and also from the movements of the medical staff themselves. Furthermore, in the KFF system, the solution uses movement interaction to continuously analyse a patient’s/resident’s postures. The objective is to detect falls and fainting, which is an important part of patient care.
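As an illustration of how a posture repetition might be checked against a therapist-defined target, the following sketch assumes postures are reduced to joint angles computed from skeleton points; the tolerance and joint representation are our own assumptions, not details given in the paper.

```python
import math

TOLERANCE_DEG = 15.0  # assumed acceptance tolerance per joint

def angle(a, b, c):
    """Angle at joint b (in degrees) formed by 3D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def matches(target_angles, current_angles, tol=TOLERANCE_DEG):
    """True if every tracked joint angle is within tolerance of the target posture."""
    return all(abs(target_angles[j] - current_angles[j]) <= tol
               for j in target_angles)
```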

The distributed multimodality offered by Ubi4Health is completed with the user’s voice. The solution provides the possibility of controlling some interfaces via voice. This feature helps in those situations in which using the hands or the whole body is difficult. An example arises during the rehabilitation process: while a patient is performing an exercise, voice is a convenient way of controlling the system.
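A minimal sketch of such hands-free control during an exercise session is given below; the command vocabulary is assumed for illustration, since the paper does not specify the actual grammar, and the recognized phrases could come from any speech recognizer.

```python
# Mapping from recognized phrases to session commands (illustrative only).
COMMANDS = {
    "pause": "PAUSE_EXERCISE",
    "continue": "RESUME_EXERCISE",
    "repeat": "REPEAT_INSTRUCTION",
    "finish": "END_SESSION",
}

def handle_utterance(utterance: str):
    """Return the session command for a recognized phrase, or None if unknown."""
    return COMMANDS.get(utterance.strip().lower())
```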

This multimodality is provided together with an important infrastructure which generates the second feature related to context-awareness improvement: the use of a wide set of devices. Ubi4Health offers PCs, laptops, tablets, PDAs and mobile phones, as well as Kinect cameras, web cameras and touchscreens. The result is the capacity to adapt the use of the solution to the appropriate device based on the user’s needs. For example, the tasks module considers two types of user with the role of employee who present different mobility needs. Firstly, there are users who are constantly on the move, such as doctors. They have to work with devices that are convenient to carry and can be used anywhere and at any time. Ubi4Health offers them mobile phones, tablets or PDAs. The other type of user has no mobility needs, and is named operator. They are offered a PC, but with one peculiarity, namely a multi-display setting. The main reason for this is that operators work with a large amount of information from different domains.
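The device-selection rule just described could be sketched as follows; the role names mirror the paper, while the selection logic itself is a simplified assumption made for illustration.

```python
MOBILE_DEVICES = ("mobile phone", "tablet", "PDA")

def select_device(role: str, on_the_move: bool) -> str:
    """Pick a device class from the user's role and mobility context."""
    if role == "operator":
        return "multi-display PC"      # stationary role, large information volume
    if on_the_move:
        return MOBILE_DEVICES[0]       # any carried device usable anywhere, any time
    return "PC or laptop"
```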

The last feature of Ubi4Health that is related to context-awareness is the distribution of interfaces (Fig. 1). Modularity has implied the use of different interfaces for the different components of the solution, which adds another level of customization to different needs and objectives. In addition, one of the interfaces is a distributed user interface, namely the one related to tasks and oriented towards operators. As these users have to manage large quantities of information, the related interface is deployed over three monitors to support three domains of analysis.
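A rough sketch of how such a distributed interface could bind the operator’s three domains (task completion, staff distribution and alert assignment, as listed earlier in this section) to the three monitors is shown below; the concrete monitor-to-domain mapping is an assumption for illustration.

```python
# Illustrative binding of analysis domains to the operator's monitors.
DISPLAY_LAYOUT = {
    0: "task completion overview",
    1: "staff distribution",
    2: "alert assignment",
}

def panel_for_display(display_index: int) -> str:
    """Return the analysis domain rendered on a given monitor."""
    return DISPLAY_LAYOUT[display_index]
```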

3 Conclusions and Future Work

Healthcare environments are an important research area in which to make efforts to improve working conditions. Medical staff have to be able to address complex situations in which mobility, multidisciplinarity and dynamism appear as constraints. Under these conditions, an appropriate way to help users is to offer a system for their daily duties with the capacity to adapt itself to their needs. For this purpose, this paper has presented Ubi4Health, a comprehensive solution for healthcare environments, focusing on the contribution related to context-awareness. The solution supports users with appropriate and necessary information and functionality on the basis of their current context. In addition, the use of multiple devices, distributed interaction mechanisms and distributed user interfaces has allowed us to go another step forward. A novel scenario appears as Ubi4Health is able to reach an improved level of user experience. Based on the user’s context, the solution provides employees and patients with the appropriate device, user interface and way of interacting with it at any given time.

There are interesting lines of future work connected with the presented context-aware scenario. Currently, the authors are considering the possibility of extending the set of supported devices. For instance, wearable devices offer an interesting way for Ubi4Health to increase its mobile capabilities and ways of interaction.