This paper presents a perspective on Biosignal-Adaptive Systems (BAS), which automatically adapt to user needs by continuously interpreting their biosignals and by providing transparent feedback, thereby keeping the user in the loop. The major hallmark of the described BAS is the low latency with which biosignals are processed, interpreted, and applied to perform rapid system adaptation, providing the user in the loop with immediate feedback on the BAS’s understanding of their condition. In contrast to explicit user input or the interpretation of observable behavior, this rapid system adaptation relies on biosignals, which in the context of a concrete application can be interpreted as implicit signals of user needs. Recently, great advances have been made in integrating sensors into smart devices, making it possible to collect vast amounts of multimodal biosignal data. Furthermore, powerful machine learning methods enable rapid processing of and learning from such data. We argue that the time has come to harness the full spectrum of low-latency biosignal processing to understand user needs and to deliver adaptive systems accordingly. However, this will just be the beginning: real-time signal processing, combined with ubiquitous devices that are always connected to vast processing and storage capacities, allows systems to provide users (and bystanders) with instant and transparent feedback and adaptations for the recognized needs. In the future, such systems could run 24/7 to assist users @home, @work, and @play from the cradle to the grave. Thus, BAS must be human-centric to curate data, archive information, learn from experience, and adapt to individual users throughout their lifespan. In this position paper, we present the concept of BAS with its key building blocks, provide selected examples of BAS from our research, and articulate selected challenges for future research.
Biosignals are generated by the human body without any intervention during physical, mental and social activities
BAS automatically adapt to user needs by continuously interpreting their biosignals in real-time
Examples of BAS are presented that have been demonstrated in field experiments to improve engagement, attention awareness, and attention management
Human behavior includes physical, mental, and social actions that emit a wide range of biosignals which can be captured by a variety of sensors. The processing and interpretation of such biosignals provides an inside perspective on human physical and mental activities, complementing traditional approaches of observing human behavior or collecting explicit input in the form of subjective data (e.g. through surveys). Biosignals represent a promising approach to better understand user needs, apply this information for rapid system adaptation, and provide users with instant and transparent feedback on the system’s understanding of their needs. As great strides have been made in integrating sensor technologies into ubiquitous devices and in machine learning methods for processing and learning from data, we argue that the time has come to harness the full spectrum of biosignals to understand user needs and to adapt systems accordingly.
Adaptive systems are studied in many research communities. Many adaptive systems rely on data collected through observing human behavior or through explicit human input commands via human–machine interfaces. For example, navigation systems are based on GPS sensor data enabling location synchronization, and recommender systems leverage information about the buying behavior and common interests of their users. These systems have advanced significantly, but they all depend on the availability of their users’ explicit input or their observable behavior. Complementary to this, biosignals offer the potential to adapt systems to the user, since in the context of a concrete application they can be interpreted as implicit signals of user needs. Thus, system adaptation to user needs can be performed even if the user does not provide explicit input or transmit observable behavioral data.
This position paper introduces Biosignal-Adaptive Systems (BAS), which continuously record biosignals of users to learn their current needs and adapt system components, models, and system output to them, with the goal of improving performance during the interaction with the user. The paper first describes a taxonomy of biosignals suitable for identifying user needs, introduces a key conceptualization of BAS, and then illustrates examples of BAS, ranging from biosignal-adaptive interaction systems that improve assistance by identifying mental states such as attention and engagement to biosignal-adaptive enterprise systems that aim to increase human worker productivity and well-being. Finally, we articulate future research challenges for the successful deployment of BAS to business and society.
2 Key concepts
2.1 Biosignals
Biosignals are autonomous signals produced by the living organism, energetically measurable in physical quantities using sensors. Biosignals are based on chemical and physical actions of the human body and serve to control, regulate, and transfer information in the human organism, thereby enabling orderly interaction in the overall human system. Depending on their origin, biosignals are measured in different quantities, i.e. in the form of electrical quantities (potential, current, resistance), mechanical quantities (force, pressure, movement), acoustic quantities (speech, non-verbal articulations, and body noises), thermal quantities (temperature, amount of heat), and chemical quantities (concentration, pH). Figure 1 depicts our biosignal taxonomy. It indicates that a biosignal is the result of a human activity captured by a particular sensor. Human behavior may encompass several activities; attention, for example, may manifest in brain activity, eye gaze, and facial expression. In such a case, several sensors are applied to simultaneously capture multiple human activities. We refer to the result as multimodal biosignals.
Today, the daily life of many people is interwoven with the use of digital devices. Modern devices are already well equipped with a large variety of sensors, many of which are always on and always connected to the internet. Modern smartphones, for example, hold (1) multiple microphones to capture acoustic biosignals in the form of speech and non-verbal articulation (laughing, breathing, snoring) from the user and bystanders; (2) multiple cameras to capture optical biosignals, such as the user’s face, facial expressions, and eye gaze; (3) inertial sensors to measure kinematic biosignals, such as acceleration and angular velocity of motion in 3D; (4) infrared sensors to measure body heat and to control other devices; and (5) laser sensors to measure distance, just to name a few. Furthermore, body-attached electrodes are used to measure electrical biosignals, some of which are known from medical examinations. Common examples of electrical biosignals are heart activity measured by electrocardiography (ECG), muscle activity recorded by electromyography (EMG), brain activity captured by electroencephalography (EEG), eye activity measured by electrooculography (EOG), and skin conductance measured by electrodermal activity (EDA). As a result of the widely integrated and available sensors, most BAS focus on acoustic, kinematic, optical, and electrical biosignals, while thermal and chemical biosignals are yet to be fully explored. Body temperature has been successfully used at airports since the SARS outbreak in 2003 for fever screening and detection systems based on infrared cameras (optical biosignals), but the differentiated adaptation of systems to thermal biosignals is less promising. Major reasons are that body temperature is subject to fluctuations due to changing environmental conditions and physical activity, and that the body temperature range offers little differentiation potential for BAS.
In addition to already deployed sensors that accompany and surround us in everyday life, a new generation of sensors is in the starting block that can be woven into clothing, printed on the skin, injected under the skin or implanted in the body.
Chemical biosignals are typically measured by taking blood samples, which prohibits their continuous application in adaptive systems. Recently, however, non-invasive sensors have been proposed, such as glucose sensors that determine the blood sugar level based on the glucose concentration in the sweat on the surface of the skin. Wearable sensor systems have become available that allow for non-invasive 24/7 measurement, which is extremely useful for the growing number of diabetes mellitus patients. Once such a sensor is connected to an insulin pump, the combination becomes a BAS based on chemical biosignals. Applications reach far beyond medical purposes; for example, lifestyle apps are possible that offer their users guidance in food choice decisions depending on their blood sugar level. Other chemical quantities such as pH are also conceivable as soon as non-invasive sensors become available.
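To make the idea of a chemical-biosignal BAS concrete, the following minimal Python sketch maps a continuous glucose reading to a lifestyle-app suggestion. The thresholds and the `advise` rule are purely illustrative assumptions, not clinical guidance and not a description of any existing product:

```python
# Illustrative sketch of a chemical-biosignal BAS loop: a non-invasive glucose
# reading drives a lifestyle recommendation. Thresholds are hypothetical
# examples only, not medical advice.

def advise(glucose_mg_dl: float) -> str:
    """Map a glucose reading (mg/dL) to a food-choice suggestion."""
    if glucose_mg_dl < 70:
        return "consider a fast-acting carbohydrate"
    if glucose_mg_dl > 180:
        return "prefer low-carbohydrate options"
    return "no dietary adjustment suggested"

for reading in (65.0, 110.0, 200.0):  # simulated sensor readings
    print(f"{reading:5.1f} mg/dL -> {advise(reading)}")
```

A real system would, of course, replace the simulated readings with a live sensor stream and present the suggestions as transparent feedback to the user.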
This new level of sensor integration holds enormous potential for BAS but also poses challenges and risks with regard to privacy, data protection, transparency, and user empowerment. These issues will be discussed in the final section of this paper.
2.2 Adaptive systems
In biology, adaptation is described as the process of change by which an organism or species becomes better suited to its environment. The ability to adapt is crucial for the survival of living beings. The concept of adaptation has also been transferred and successfully applied to information technologies (IT). The design of adaptive systems is investigated in many different research communities in computer science, e.g. for hypertext/hypermedia systems, for speech communication systems that adapt to speakers, domains, and languages, for human-computer interaction (HCI) [8,9,10], robotics, ubiquitous computing, software engineering [13, 14], systems for clinical support, nursing care, and everyday assistance at home, to name a few. In particular, the field of artificial intelligence (AI) has coined the notion of intelligent agents that perceive their environment through sensors and act upon that environment through actuators. Intelligent agents are characterized by the ability to learn and act autonomously. Thus, we argue that an intelligent agent represents one form of an adaptive system. In the past, intelligent agents were mainly based on predefined prior knowledge (e.g. in the form of lookup tables or condition-action rules). With the availability of growing amounts of contextual data and recent advances in machine learning, it has become possible to design intelligent agents that gain experience over time and extend prior knowledge by collecting and processing context information perceived through sensors. Today, contemporary adaptive systems leverage context to automatically perform system-driven adaptations with varying degrees of intelligence. They range from the simple application of predefined knowledge (e.g. in the form of rules) to the ability to learn continuously in order to expand the predefined knowledge.
Context in that sense is any information that can be used to characterize the situation of an entity, where an entity can be a person, place, or object that is considered relevant to the interaction between a user and an adaptive system . As mentioned above, biosignals capture user context by providing an inside perspective on human physical, mental, and social activities.
2.3 Building blocks of biosignal-adaptive systems
Biosignal-adaptive systems (BAS) are able to react to changing user needs in varying tasks and environments. Conceptually, BAS build on control theory. User needs and changes thereof are predicted by continuously measuring, processing, and interpreting the biosignals emitted by the user. The predicted result is provided to the technical system, which is equipped with the ability to adapt its behavior to the user needs, for example through audible or visual output in the graphical user interface, through changes in reaction time, or through changes in solution strategies. Similar models have been proposed in the field of self-adaptive software, e.g. the MAPE-K model, the observer and controller model, or the sense-plan-act model.
In contrast to these models, our proposed BAS conceptualization combines several features in an innovative way, which discriminates it from existing solutions. Firstly, our BAS closes the human–machine interaction loop by giving the user continuous and timely feedback about its interpretation of the user needs based on instant processing of multimodal biosignals. According to our concept of transparent BAS, the user receives this feedback in the form of the interpreted needs, the system adjustments made, and the resulting biosignal response. Secondly, BAS focus on the human as the system under control and observation, where biosignals are transformed into a control input for providing real-time adaptation following a continuous loop approach. This way, the user who implicitly produces these biosignals can change the behavior and outcome of the technical system without explicit command and control. Thirdly, since BAS perform in real-time without perceivable latency between the biosignal input and the system’s feedback, the user and the technical system form something like an oscillating circuit: not only does the system adapt to the user, but the user also tunes to the system by moderating their biosignals. This opens avenues for radically new methods and applications of co-adapting interfaces. First examples come from the field of biosignal-based spoken communication, in which we use speech-related biosignals beyond acoustics, stemming from articulatory muscle activity, neural pathways, and the brain itself, and convert them directly into audible speech that is played back to users with such low latency that they can listen to themselves thinking.
Building on existing work in the field of physiological computing, with its concept of the biocybernetic loop, and in human-computer interaction [9, 10], we argue that such a BAS should consist of four interconnected building blocks, as depicted in Fig. 2.
The BAS process is initiated (see Fig. 2, clockwise from the block labeled Human) when a user engages in physical, mental, or social activities with or through a technical system. Consequently, s/he emits multi-dimensional biosignals, which are measured by a sensor-equipped Biosignals Recording device. In case multimodal biosignals are captured, the signals have to be time-synchronized between devices prior to signal transmission. The subsequent Processing and Classification component processes the received signal by performing artifact removal and normalization and by extracting features relevant for the classification or recognition task. If multimodal biosignals were measured, fusion strategies are applied to best complement the information. We refer to classification if a single sample is classified into one-out-of-n classes, while recognition finds the most likely class label sequence for time series signals. Here, the system interprets, for example, user traits (e.g. identity, age, personality) and user states (e.g. emotion, engagement, attention, work load). The result is transmitted continuously to the Adaptation component, which decides whether the technical system should adapt at the given moment in time. If so, the adaptation process is performed according to the implemented adaptation strategy. Such a strategy could consist of a simple set of if-then-else rules or any complex behavior modeling. Adaptation can be performed once, after batches of signals, or continuously to adapt dynamically to changes. Furthermore, adaptation might apply supervised learning strategies, which require some grounding by supervision or interaction with users. Alternatively, an unsupervised learning mode is applied, in which the predicted class labels are treated as ground truth. The resulting adapted technical system provides an output via the graphical user interface to the user, who reacts by generating new biosignals, thereby closing the human-system interaction loop.
Since the biosignal generation is an implicit process, the user influences the system behavior without having to perform explicit directed actions or system inputs.
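The closed loop described above can be sketched in compact Python. All function names, the energy feature, and the if-then-else adaptation rule are illustrative placeholders for the four building blocks, not the pipeline of any specific system presented later:

```python
import random

def read_biosignal_window():
    """Biosignals Recording: return one window of (simulated) samples."""
    return [random.gauss(0.0, 1.0) for _ in range(256)]

def preprocess(window):
    """Artifact removal and normalization (here: simple mean-centering)."""
    mean = sum(window) / len(window)
    return [x - mean for x in window]

def extract_features(window):
    """Reduce the window to task-relevant features (here: signal energy)."""
    return {"energy": sum(x * x for x in window) / len(window)}

def classify_state(features):
    """Map features to a user state; a real BAS would use a trained model."""
    return "high_load" if features["energy"] > 1.2 else "low_load"

def adapt(system_state, user_state):
    """Adaptation via a simple if-then-else rule, as described in the text."""
    system_state["notifications"] = (
        "deferred" if user_state == "high_load" else "delivered"
    )
    return system_state

system_state = {"notifications": "delivered"}
for _ in range(3):  # a deployed BAS would run this loop continuously
    window = preprocess(read_biosignal_window())
    state = classify_state(extract_features(window))
    system_state = adapt(system_state, state)
    # Transparent feedback: show the interpreted state and the adaptation made.
    print(f"interpreted state: {state} -> notifications {system_state['notifications']}")
```

Printing the interpreted state and the resulting adjustment corresponds to the transparent feedback that closes the human-system interaction loop.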
3 Examples of biosignal-adaptive systems
We present several examples of BAS to illustrate which biosignals are useful for adapting systems to user needs. The examples showcase which biosignals are captured, processed, and interpreted, and how the system components, and thus the system behavior, are adapted and feedback is given to the user in real-time. The first class of systems concerns Biosignal-Adaptive Interaction Systems for Assistance and Activation. One application describes SmartHelm, an attention-sensitive smart helmet for driver assistance. The other application features I-CARE, an activation system for people with dementia that selects activation content based on the engagement of its users. The second class of systems demonstrates the potential of designing adaptive workplaces that take the productivity and well-being of human workers into account. Biosignal-adaptive enterprise systems process biosignals of human workers and adapt the workplace accordingly. Two applications of such systems, focusing on the user states of attention and flow, demonstrate the potential of processing biosignals and providing corresponding workplace adaptations in real-time.
3.1 Biosignal-adaptive interaction systems for assistance and activation
Two applications, one for user assistance (SmartHelm) and one for user activation (I-CARE), are presented. Both measure multiple modalities (e.g. speech, facial expressions, eye gaze, brain activity) and fuse the resulting biosignals to reliably discriminate user states such as attention, distraction, work load, stress, and engagement. The prototypical systems feature low-latency signal processing and fast machine learning methods to provide immediate feedback and an adapted system response to their users. The applications were evaluated and validated in field studies.
3.1.1 Attention-aware driver assistance: SmartHelm
SmartHelm is an attention-sensitive smart helmet that integrates non-invasive brain and eye activity detection with hands-free augmented reality components in a speech-enabled outdoor assistance system. It is designed for cargo bikers, who close the last mile in city logistics by delivering goods from a transportation hub to the final destination. Since cargo bikers typically navigate busy city roads to deliver goods under time pressure, their job requires full attention and constant adaptation to a wide variety of situations and distractions. They can therefore use any technical support that keeps them on track, reduces their stress level, and increases their safety on the road.
SmartHelm continuously tracks the activity of the eyes and brain of the bikers to interpret attention and distraction in the driving context. The interpreted user states are then applied to adapt eye- and hands-free assistance services such as navigation, task planning, and communication to the biker’s needs; e.g. relevant task information is presented in a context-sensitive and least disruptive manner. Figure 3 shows the helmet prototype (top right) and its look-through view (top left). The bottom panel displays information derived for the expert during development: center, annotations of objects in the path; bottom right, the EEG and eye-tracking biosignals of the biker, both used to train the AI system; bottom left, a heat map of the biker’s GPS trace along with the automatically identified attention level. The critical task of SmartHelm is to find the sweet spot of providing useful and timely assistance without overloading the cyclist.
3.1.2 Engagement-aware activation systems: I-CARE
I-CARE is a hand-held activation system that allows professional and informal caregivers to cognitively and socially activate people with dementia in joint activation sessions without special training or expertise. It is suitable for activation in ad-hoc group sessions (see left-hand side of Fig. 4) and in individual tandem sessions (see right-hand side of Fig. 4). I-CARE consists of an easy-to-use tablet application that presents activation content and a server-based backend system that securely manages the contents and events of activation sessions.
After requesting permission, I-CARE uses the microphone and camera of the tablet to record acoustic and optical biosignals. It also stores keyboard interactions and integrates an E4 wristband to measure further biosignals, such as blood volume pulse (BVP) and electrodermal activity (EDA). I-CARE uses these multimodal biosignals of explicit and implicit user interaction to estimate which content is successful in activating individual users. Over the course of use, I-CARE’s recommendation system learns about the individual needs and resources of its users and automatically personalizes the activation content. In particular, it identifies the engagement of individual users to present them with content that they find interesting, thereby keeping users on track to increase activation time and intensity, which correlates with outcome. In addition, information about past sessions can be retrieved such that the activation items seamlessly build on previous sessions, while eligible stakeholders are informed about the current state of care and the daily condition of their protégés.
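A recommender that learns from an engagement signal can be sketched as a simple epsilon-greedy bandit. The content items, the engagement score in [0, 1], and the incremental-mean update below are illustrative assumptions and do not describe I-CARE's actual algorithm:

```python
import random

class EngagementRecommender:
    """Epsilon-greedy sketch of a content recommender that learns from a
    per-item engagement score estimated from biosignals (hypothetical)."""

    def __init__(self, items, epsilon=0.1):
        self.epsilon = epsilon
        self.value = {item: 0.0 for item in items}  # running engagement estimates
        self.count = {item: 0 for item in items}

    def select(self):
        if random.random() < self.epsilon:           # occasionally explore new content
            return random.choice(list(self.value))
        return max(self.value, key=self.value.get)   # otherwise exploit best-known item

    def update(self, item, engagement):
        """Fold an observed engagement score into the item's running mean."""
        self.count[item] += 1
        self.value[item] += (engagement - self.value[item]) / self.count[item]

rec = EngagementRecommender(["music", "photos", "quiz"])
rec.update("music", 0.9)  # engagement inferred from face, voice, EDA, ...
rec.update("quiz", 0.2)
```

Over repeated sessions, such a scheme would increasingly select the content with the highest estimated engagement, mirroring the personalization behavior described above.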
3.2 Biosignal-adaptive enterprise systems
Biosignal-adaptive enterprise systems define a class of information systems in organizations in which adaptive digital workplaces for human workers are provided based on monitoring, analyzing, and responding to biosignals in real-time. In the following, we present two biosignal-adaptive enterprise systems for the digital workplace with a specific focus on two psychological user states: (i) attention and (ii) flow. Specifically, in (i) we capture eye gaze using eye-tracking technology to recognize the visual attention of human workers when working with information dashboards for decision-making. In (ii) we monitor heart activity with surface electrodes to capture electrical biosignals (specifically ECG signals) and on this basis recognize flow states. Building on the recognized flow states, we provide flow-adaptive notifications at the digital workplace for human workers.
3.2.1 Attentive information dashboards
Information dashboards are a critical capability in contemporary business intelligence and analytics systems supporting decision-making in organizations. Despite their strong potential to support better decision-making, the massive amount of information they provide challenges users performing data exploration tasks.
Accordingly, information dashboard users face difficulties in managing their limited attentional resources when processing the information presented on dashboards. Attentive information dashboards leverage eye-tracking in real-time and provide individualized visual attention feedback (VAF) to human workers. Specifically, we measure fixation duration and the number of fixations on predefined areas of interest of the dashboard. The underlying idea is that providing quantified information about human workers’ visual attention will improve their attentional resource allocation as well as their resource management.
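The two metrics named above can be computed from a fixation log with a few lines of Python. The area-of-interest (AOI) layout and the fixation data format are illustrative assumptions, not the actual dashboard implementation:

```python
# Per-AOI fixation duration and fixation count from a gaze fixation log.
AOIS = {  # named areas of interest: (x_min, y_min, x_max, y_max) in pixels
    "revenue_chart": (0, 0, 400, 300),
    "kpi_table": (400, 0, 800, 300),
}

def aoi_of(x, y):
    """Return the name of the AOI containing (x, y), or None."""
    for name, (x0, y0, x1, y1) in AOIS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def attention_metrics(fixations):
    """fixations: list of (x, y, duration_ms) tuples.
    Returns total fixation duration and fixation count per AOI."""
    metrics = {name: {"duration_ms": 0, "count": 0} for name in AOIS}
    for x, y, dur in fixations:
        name = aoi_of(x, y)
        if name is not None:
            metrics[name]["duration_ms"] += dur
            metrics[name]["count"] += 1
    return metrics

log = [(120, 80, 250), (150, 90, 180), (600, 100, 400)]
print(attention_metrics(log))
```

In an attentive dashboard, these per-AOI aggregates would be rendered back to the user as the visual attention feedback overlay.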
Figure 5 depicts the basic idea of attentive information dashboards at the workplace. Specifically, we use the Tobii EyeTracker 4C to collect eye gaze and present attention feedback as an overlay on the information dashboard. Our research has demonstrated the positive effects of attentive information dashboards on users’ attentional resource allocation and resource management. Attention-aware adaptive systems at the workplace are not only relevant for processing huge amounts of information on dashboards; in a related research project, we have shown their potential for improving attention management in virtual team meetings.
3.2.2 Flow-adaptive notification management systems
Flow refers to the holistic sensation that people feel when they act with total involvement. Promoting flow in the context of work is desirable because it increases workers’ well-being and performance.
However, with the increasing number of interruptions at the workplace, it is becoming more difficult to achieve the desirable flow state. Therefore, as part of the research project Kern, funded by the German Federal Ministry of Labour and Social Affairs (BMAS), we first aimed to detect flow states automatically in real-time using biosignals and supervised machine learning. Subsequently, we designed different forms of adaptation that intelligently protect human workers from notifications while they are in flow states.
Figure 6 depicts the workplace as well as two participants of the field study carried out as part of the Kern project. We designed and deployed a flow-adaptive notification management system using the Polar H10 device for collecting ECG signals, as depicted at the bottom right of Fig. 6. The device was connected via Bluetooth to the corresponding computer. We provided a notification management plugin for the operating system that allowed participants to manually activate or deactivate the connection. Specifically, we leverage cardiac features collected in the form of ECG signals to train a flow state classifier. In a first step, we train this classifier using labeled data collected through an experience sampling method (ESM) procedure. In a second step, the flow-adaptive notification system leverages this classifier; specifically, we implement it as a plugin for the collaboration tool Slack. More detailed information about the study and the evaluation results is provided in the corresponding publication.
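The two-step pipeline can be sketched as follows: derive simple heart rate variability (HRV) features from RR intervals in the ECG stream, then gate notifications with a flow classifier. The feature choice (SDNN, RMSSD) reflects standard cardiac measures, but the classifier stub and its threshold are made-up placeholders; the actual classifier in the study was trained on ESM-labeled data:

```python
import statistics

def hrv_features(rr_intervals_ms):
    """Compute SDNN and RMSSD, two standard HRV features, from RR intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    sdnn = statistics.pstdev(rr_intervals_ms)
    rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    return {"sdnn": sdnn, "rmssd": rmssd}

def in_flow(features):
    """Stub for the trained flow classifier (threshold is hypothetical)."""
    return features["rmssd"] < 30.0

def deliver_notification(features, message):
    """Flow-adaptive gating: defer messages while the worker is in flow."""
    if in_flow(features):
        return f"deferred: {message}"
    return f"delivered: {message}"

rr = [812, 820, 815, 818, 822, 819]  # simulated RR intervals in ms
print(deliver_notification(hrv_features(rr), "New message in #team"))
```

In the deployed system, deferred messages would be queued and released once the classifier no longer predicts a flow state.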
4 Future research challenges
In this position paper we have introduced the concept of BAS and its major building blocks. Furthermore, we have described several BAS applications from our own research. In the following, we describe lessons learned from implementing BAS in terms of recording and annotating data, using these data to model human activities for both analysis and synthesis, and adapting and evaluating systems, taking into account the work of others. Subsequently, we articulate two major challenges for the successful delivery of BAS to business and society.
4.1 Implementation challenges
The implementation of BAS comes with numerous challenges. In the following we focus on four major areas that we were also confronted with: (1) data collection and annotation, (2) models of BAS-relevant human activities, (3) BAS design space for adaptation strategies, and (4) BAS evaluation.
First, even as sensors quickly improve, collecting high-quality sensor data in the field remains a major challenge. Typical challenges are (i) ethical considerations, to which we have dedicated a separate subsection (see 4.3 below); (ii) artifacts ranging from technical (e.g. sensors, connectors, network communication) and environmental (e.g. signal interference, ambient noise) to biological (e.g. sweat, eye-blinks) factors, which result in noisy data; (iii) the need for sensor calibration (e.g. eye trackers); and (iv) the need for baseline data (e.g. resting-state data in ECG, and data for normalization). Furthermore, annotations of recorded data are an integral part of machine learning and AI applications. Data annotation is one of the most time-consuming and labor-intensive parts; it requires talented and motivated annotators, clear annotation guidelines including semantic methodologies and ontologies, and, particularly for synchronous recordings of multimodal biosignals, suitable and reliable tools. If self-reported data, for example about cognitive user states, need to be collected, e.g. by using the experience sampling method, the corresponding BAS studies become very lengthy and exhausting for the participants.
Second, the development of BAS-relevant human activity models encompasses a range of intricate tasks, such as the extraction and selection of informative features, the choice of appropriate machine learning or deep learning approaches together with the definition of suitable error functions, strategies for parameter optimization, and proper evaluation metrics, just to name a few. The development of models also includes considerations of (i) robustness and generalizability with respect to data variability within and across users, tasks, and contexts; (ii) transferability and scalability of models, i.e. a model can handle unseen user states and traits even when only a few or zero data samples are available, and can cope with any amount of data in a cost-effective way; and (iii) accountability and bias-awareness.
Third, we were challenged by making informed decisions regarding the BAS design space for adaptation strategies. Existing literature has shown that the design space for adaptive systems is huge. Possible adaptations range from the modification of content and interaction to task scheduling and allocation. Furthermore, different types of triggers, e.g. spatio-temporal, environmental, task, human, or system states, should be considered. It is impossible to systematically “test” all possible design configurations.
Fourth, the evaluation of adaptive systems in general is known to be a non-trivial task. Therefore, existing literature has proposed a modular approach that includes technical performance as well as empirical evaluation building blocks. The specific characteristics of BAS as a particular class of adaptive systems make their evaluation even more challenging. For example, to implement the flow-adaptive notification management system, we first had to collect data, build a flow classifier, and evaluate its technical performance; the flow-adaptive notification system then used this classifier as a technical building block. From an evaluation point of view, we evaluated the entire system with real users in the field. It is challenging to clearly separate the perceived quality of the flow classifier from the users’ evaluation of the overall system. Furthermore, the effects of a BAS on its users heavily depend on the individuals and their context. For example, some users already proactively managed their notification settings by enabling and disabling notifications themselves, and therefore did not benefit from the BAS.
4.2 Advancing AI methods for BAS
AI methods, specifically supervised machine learning techniques, have greatly advanced. However, AI methods are not yet ready to fulfill all requirements for building BAS. Thus, in the future, AI methods and tools need to be advanced to continuously process and interpret biosignals, to iteratively train and update models, to dynamically adapt to changing tasks and environments, to learn which information to keep and what to forget, and to discover how to transfer acquired knowledge to unseen domains, unknown users, or new architectures and platforms. We believe that BAS, as a challenging area of interdisciplinary research at the intersection of AI/ML, sensors, and adaptive systems design, provide both the push and the pull to further develop the respective fields.
4.3 Ethical considerations of BAS
Major initiatives have been launched in recent years focusing on ethical considerations with regard to AI-based systems such as BAS. One example is the “Ethically Aligned Design” (EAD) initiative, in which several hundred professionals, including engineers, scientists, ethicists, sociologists, and economists from six continents, have formulated societal and policy guidelines in order for intelligent systems to remain human-centric, serving humanity’s values and ethical principles. Such systems should prioritize and have as their explicit goal honoring our inalienable fundamental rights and dignity as well as increasing human flourishing and environmental sustainability. This begins with conscious contemplation, where ethical considerations help us define how we wish to live. EAD defines eight general principles to be followed by AI system creators, namely human rights, well-being, data agency, effectiveness, transparency, accountability, awareness, and competence. It also provides clear guidelines, methods, and metrics on how to put these general principles into practice.
During the design, implementation, and use of the BAS described above, the authors and their teams adhere to these principles and guidelines. We strive to sensitize our students to the ethical considerations related to AI systems in general and BAS in particular by discussing them in teaching and training and by enforcing their reflection at an early stage of each research project. However, we believe future research is required to better understand how to break down the generic principles to the specific context of BAS. Furthermore, a deeper understanding of design trade-offs involving ethical principles is needed. For example, a BAS may positively impact the well-being of individuals (e.g., by increasing flow), but at the same time introduce new challenges with regard to data security and privacy.
5 Conclusion
In this position paper, we presented our perspective on BAS, an increasingly important class of AI systems that automatically adapt to user needs by continuously interpreting their biosignals. We described key concepts and building blocks of BAS and showcased selected BAS examples. To fully leverage the potential of BAS, future research is required; specifically, we highlight advancing AI methods for BAS, contextualizing ethical principles for BAS, and achieving a deeper understanding of the associated design trade-offs.
Availability of data and materials
SmartHelm (https://smart-helm.com/) is a collaborative project (Universities of Bremen and Oldenburg, city of Oldenburg, CitiPost, Rytle, TeamViewer, and Uvex) funded by the German Federal Ministry of Transport and Digital Infrastructure as part of the research initiative mFUND (19F2105F).
Kern - AI-based Competence Assistants (https://kern-kas.org/) is a collaborative research project (Karlsruhe Institute of Technology, SAP SE, Workwise GmbH, TÜV Rheinland, B. Braun Melsungen GmbH).
Abbreviations
Ethically aligned design (EAD)
Experience sampling method
Visual attention feedback
Schultz T, Amma C, Heger D, Putze F, Wand M (2013) Human-machine interfaces based on biosignals. Automatisierungstechnik 61(11):760–769. https://doi.org/10.1524/auto.2013.1061
Shu P-Y, Chien L-J, Chang S-F, Su C-L, Kuo Y-C, Liao T-L, Ho M-S, Lin T-H, Huang J-H (2005) Fever screening at airports and imported dengue. Emerg Infect Diseases 11(3):460–462. https://doi.org/10.3201/eid1103.040420
Zhu J, Liu S, Hu Z, Zhang X, Yi N, Tang K, Dexheimer MG, Lian X, Wang Q, Yang J, Gray J, Cheng H (2021) Laser-induced graphene non-enzymatic glucose sensors for on-body measurements. Biosensors Bioelectron 193:113606. https://doi.org/10.1016/j.bios.2021.113606
Brusilovsky P (2003) Developing adaptive educational hypermedia systems: From design models to authoring tools. In: Authoring Tools for Advanced Technology Learning Environments: Toward Cost-Effective Adaptive, Interactive and Intelligent Educational Software. https://doi.org/10.1007/978-94-017-0819-7_13
Leggetter CJ, Woodland PC (1995) Maximum likelihood linear regression for speaker adaptation of continuous density hidden Markov models. Comput Speech Lang 9:171–185. https://doi.org/10.1006/csla.1995.0010
Samarakoon L, Mak B, Lam AYS (2018) Domain Adaptation of End-to-end Speech Recognition in Low-Resource Settings. In: 2018 IEEE Spoken Language Technology Workshop (SLT), pp. 382–388. https://doi.org/10.1109/SLT.2018.8639506
Schultz T, Waibel A (2001) Language-independent and language-adaptive acoustic modeling for speech recognition. Speech Commun 35(1):31–51. https://doi.org/10.1016/S0167-6393(00)00094-7
Benyon D, Murray D (1993) Applying user modeling to human-computer interaction design. Artif Intell Rev 7:199–225. https://doi.org/10.1007/BF00849555
Avalos-Viveros H, Molero-Castillo G, Benitez-Guerrero E, Barcenas E (2018) Towards a method for biosignals analysis as support for the design of adaptive user-interfaces. Research in Computing Science 147:9–19. https://doi.org/10.13053/rcs-147-11-1
van der Zwaag M, van den Broek E, Janssen J (2010) Guidelines for biosignal driven HCI. In: Girouard, A., Mandryk, R., Nacke, L., Solovey, E.T., Tan, D., Jacob, R.J.K. (eds.) Proceedings of ACM CHI2010 Workshop - Brain, Body, and Bytes: Physiological User Interaction, pp. 77–80. Association for Computing Machinery (ACM), United States
Billard A, Calinon S, Dillmann R, Schaal S (2008) Robot programming by demonstration. In: Siciliano B, Khatib O (eds) Springer Handbook of Robotics. Springer, Cham, pp 1371–1394. https://doi.org/10.1007/978-3-540-30301-5_60
Coutinho E, Alshukri A, de Berardinis J, Dowrick C (2021) Polyhymnia mood - empowering people to cope with depression through music listening. In: ACM International Symposium on Wearable Computers. UbiComp ’21, pp. 188–193. ACM, New York, NY, USA. https://doi.org/10.1145/3460418.3479334
Kramer J, Magee J (2007) Self-managed systems: an architectural challenge. In: Future of Software Engineering (FOSE '07), pp. 259–268. https://doi.org/10.1109/FOSE.2007.19
Kephart JO, Chess DM (2003) The vision of autonomic computing. Computer 36(1):41–50. https://doi.org/10.1109/MC.2003.1160055
Sutton RT, Pincock D, Baumgart DC, Sadowski DC, Fedorak RN, Kroeker KI (2020) An overview of clinical decision support systems: benefits, risks, and strategies for success. NPJ Digit Med. https://doi.org/10.1038/s41746-020-0221-y
Huter K, Krick T, Domhoff D, Seibert K, Wolf-Ostermann K, Rothgang H (2020) Effectiveness of digital technologies to support nursing care: results of a scoping review. J Multidiscip Healthcare 13:1905–1926. https://doi.org/10.2147/JMDH.S286193
Graf B, Reiser U, Hägele M, Mauz K, Klein P (2009) Robotic home assistant Care-O-bot® 3-product vision and innovation platform. In: 2009 IEEE Workshop on Advanced Robotics and Its Social Impacts, pp. 139–144. https://doi.org/10.1109/ARSO.2009.5587059
Russell S, Norvig P (2021) Artificial Intelligence: A Modern Approach, 4th edn. Pearson, London, UK
Abowd GD, Dey AK, Brown PJ, Davies N, Smith M, Steggles P (1999) Towards a better understanding of context and context-awareness. In: Gellersen H-W (ed) Handheld and Ubiquitous Computing. Springer, Berlin, Heidelberg, pp 304–307. https://doi.org/10.1007/3-540-48157-5_29
Wiener N (1948) Cybernetics: Or Control and Communication in the Animal and the Machine. MIT Press, Cambridge, MA
Tomforde S, Prothmann H, Branke J, Hähner J, Mnif M, Müller-Schloer C, Richter U, Schmeck H (2011) Observation and control of organic systems. In: Müller-Schloer C, Schmeck H, Ungerer T (eds) Organic Computing — A Paradigm Shift for Complex Systems. Springer, Basel. https://doi.org/10.1007/978-3-0348-0130-0_21
Schultz T, Wand M, Hueber T, Herff C, Brumberg JS (2017) Biosignal-based spoken communication: a survey. IEEE/ACM Trans Audio Speech Lang Process 25(12):2257–2271. https://doi.org/10.1109/TASLP.2017.2752365
Angrick M, Ottenhoff MC, Diener L, Ivucic D, Ivucic G, Goulis S, Saal J, Colon AJ, Wagner L, Krusienski DJ, Kubben PL, Schultz T, Herff C (2021) Real-time synthesis of imagined speech processes from minimally invasive recordings of neural activity. Communications Biology 4(1). https://doi.org/10.1038/s42003-021-02578-0. Data collected at Epilepsy Center Kempenhaeghe, The Netherlands
Pope AT, Bogart EH, Bartolome DS (1995) Biocybernetic system evaluates indices of operator engagement in automated task. Biol Psychol 40(1–2):187–195. https://doi.org/10.1016/0301-0511(95)05116-3
Salous M, Küster D, Scheck K, Dikfidan A, Neumann T, Putze F, Schultz T (2022) Smarthelm: User studies from lab to field for attention modeling. In: 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pp. 1012–1019. https://doi.org/10.1109/SMC53654.2022.9945155
Küster D, Schering J, Janßen C, Putze F, Gómez JM, Schultz T (2020) Intelligente und aufmerksamkeitssensitive Systeme in der Fahrradmobilität. In: Appel A, Scheiner J, Wilde M (eds) Mobilität, Erreichbarkeit, Raum: (Selbst-)kritische Perspektiven aus Wissenschaft und Praxis. Springer, Wiesbaden, pp 143–158. https://doi.org/10.1007/978-3-658-31413-2_9
Schultz T, Putze F, Steinert L, Mikut R, Depner A, Kruse A, Franz I, Gaerte P, Dimitrov T, Gehrig T, Lohse J, Simon C (2021) I-CARE - An Interaction System for the Individual Activation of People with Dementia. Geriatrics 6(2). https://doi.org/10.3390/geriatrics6020051
Toreini P, Langner M, Maedche A, Morana S, Vogel T (2022) Designing attentive information dashboards. Journal of the Association for Information Systems 23, 521–552. https://doi.org/10.17705/1jais.00732
Langner M, Toreini P, Maedche A (2022) Eyemeet: A joint attention support system for remote meetings. In: CHI Conference on Human Factors in Computing Systems Extended Abstracts. CHI EA ’22. Association for Computing Machinery, New York, NY, USA. https://doi.org/10.1145/3491101.3519792
Csikszentmihalyi M (1991) Flow: The Psychology of Optimal Experience. Harper Perennial, New York, NY
Rissler R, Nadj M, Li MX, Loewe N, Knierim MT, Maedche A (2020) To be or not to be in flow at work: physiological classification of flow using machine learning. IEEE Trans Affect Comput. https://doi.org/10.1109/TAFFC.2020.3045269
Loewe N, Nadj M, Maedche A (2023) Do not disturb! designing a flow-adaptive system for notification management. In: CHI Conference on Human Factors in Computing Systems Extended Abstracts. Association for Computing Machinery, New York, NY, USA
Islam MK, Rastegarnia A, Sanei S (2021) Signal artifacts and techniques for artifacts and noise removal. In: Ahad MAR, Ahmed MU (eds) Signal Processing Techniques for Computational Health Informatics. Springer, Cham, pp 23–79. https://doi.org/10.1007/978-3-030-54932-9_2
Brugman H, Russel A (2004) Annotating multi-media/multi-modal resources with ELAN. In: Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC’04). European Language Resources Association (ELRA), Lisbon, Portugal. http://www.lrec-conf.org/proceedings/lrec2004/pdf/480.pdf
Larson R, Csikszentmihalyi M (2014) The Experience Sampling Method. In: Flow and the Foundations of Positive Psychology, pp. 21–34. https://doi.org/10.1007/978-94-017-9088-8
Barandas M, Folgado D, Fernandes L, Santos S, Abreu M, Bota P, Liu H, Schultz T, Gamboa H (2020) TSFEL: Time Series Feature Extraction Library. SoftwareX 11:100456. https://doi.org/10.1016/j.softx.2020.100456
Murphy KP (2013) Machine Learning : a Probabilistic Perspective. MIT Press, Cambridge, Mass. [u.a.]
Goodfellow I, Bengio Y, Courville A (2016) Deep Learning. MIT Press, Cambridge, Mass. http://www.deeplearningbook.org
D’Mello SK, Booth BM (2023) Affect detection from wearables in the “real” wild: fact, fantasy, or somewhere in between? IEEE Intell Syst 38(1):76–84. https://doi.org/10.1109/MIS.2022.3221854
Wu J, Osuntogun A, Choudhury T, Philipose M, Rehg JM (2007) A scalable approach to activity recognition based on object use. In: 2007 IEEE 11th International Conference on Computer Vision, pp. 1–8
Anderson D, Bjarnadóttir MV, Ross D (2021) There are no colorblind models in a colorful world: How to successfully apply a people analytics tool to build equitable workplaces
Feigh K, Dorneich M, Hayes C (2012) Toward a characterization of adaptive systems: A framework for researchers and system designers. Human factors 54:1008–24. https://doi.org/10.1177/0018720812443983
Paramythis A, Weibelzahl S, Masthoff J (2010) Layered evaluation of interactive adaptive systems: Framework and formative methods. User Model. User-Adapt. Interact. 20:383–453. https://doi.org/10.1007/s11257-010-9082-4
Chatila R, Havens JC (2019) The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. In: Aldinhas Ferreira MI, Silva Sequeira J, Singh Virk G, Tokhi MO, Kadar E (eds) Robotics and Well-Being. Springer, Cham, pp 11–16. https://doi.org/10.1007/978-3-030-12524-0_2
IEEE (2019) Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, 1st edn. The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, IEEE
Acknowledgements
The authors are very grateful to Ivo Benke, Gabriel Ivucic, and Rafael Reisenhofer for reading and commenting on earlier versions of this document.
Funding
Open Access funding enabled and organized by Projekt DEAL. This position paper is a result of a collaboration between the authors, initiated within the recently awarded DFG Research Training Group GRK 2739 "KD2School - Designing Adaptive Systems for Economic Decision-Making", which interconnects the three German universities KIT, University of Bremen, and University of Gießen.
Conflict of interest
The authors declare that they have no competing interests.
Research participants and other people depicted in the images have given consent for publication.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Schultz, T., Maedche, A. Biosignals meet Adaptive Systems. SN Appl. Sci. 5, 234 (2023). https://doi.org/10.1007/s42452-023-05412-w