Encyclopedia of Computer Graphics and Games

Living Edition
Editors: Newton Lee

Biosensing in Interactive Art: A User-Centered Taxonomy

  • Luís Aly
  • Rui Penha
  • Gilberto Bernardes
Living reference work entry
DOI: https://doi.org/10.1007/978-3-319-08234-9_210-1



In an interactive artistic context, biosensing studies the detection, measurement, variation, and translation of electrical potentials in the human nervous and motor system as a strategy for controlling parameters in the virtual domain of a digital interactive system.


The unprecedented technological advances in terms of computational power, software integration, and miniaturized sensor technologies have fostered new artistic content creation methods in the domains of interactive music, art installations, and digital game design, to name but a few. In this context, biosensing is becoming a pervasive modality of control in interactive systems. In other words, art-related practices have been adopting psychophysiological electrical potentials of human subjects, i.e., biosignals, as a strategy for controlling the creation of interactive content towards adaptive, symbiotic, and immersive experiences.

In this entry, the authors present a user-centered taxonomy of biosensing technology, which aims to guide interactive artists in selecting appropriate biosensors for a particular interactive application design. Particular emphasis is given to the mappings between biosignals’ level of control and temporal response and the nature of the system output. In pursuing such a user-centered perspective over biosensing technology, the authors seek to extend existing taxonomies beyond the technical specifications of the sensors, thus, promoting a fluid use of such technology by interactive artists.

The remainder of this entry is organized as follows. Section “Human–Computer Interaction in Interactive Art” defines concepts such as interactive art and human–computer interaction. Section “Interaction Modalities” presents art-related interaction modalities, including biosensing. In section “Towards a User-Centered Taxonomy of Biosensing,” the authors review a range of taxonomic perspectives on biosensing technology proposed in the related literature and further detail their user-centered taxonomy. Examples of interactive application scenarios are presented in section “Application Scenarios of a User-Centered Taxonomy,” which discusses typical mappings between biosensor technology and interactive content creation using the proposed taxonomy. Finally, in section “Conclusions,” the authors outline the conclusions of their study.

Human–Computer Interaction in Interactive Art

Digital art is increasingly interactive. Some of it is built on interactions that evolved from computer games and device usage. Much of the interaction is intended to engage the audience in some form of interactive experience that is a key element in the aesthetics of the art. The interactive artist is often concerned with how the artwork behaves, how the audience interacts with it and, ultimately, in participant experience and degree of engagement with the art object.

In interactive art, the art object has an internal mechanism that enables it to change or be modified by an environmental factor or by a human, who has an active role in influencing the degree of change (Edmonds et al. 2004). In today’s interactive art, where the artist and the audience play integral participant roles, the computer has immense potential in defining the degree of interaction and in managing the real-time result of that interaction. Issues relating to human–computer interaction could be considered as important to interactive art creation as colors are to painting (Candy and Ferguson 2016).

Figure 1 shows the information flow of human–computer interaction (Bongers 2000) as a two-way control and feedback process. When interacting with a computer, humans take action in response to a sensed environment. In turn, computers capture the transformed environment and act upon it using transducers – sensor devices that translate real-world signals into machine-world signals – and actuators, which translate machine-world signals into real-world signals that can be perceived by humans.
Fig. 1

Redrawing of Bongers’ (2000) interaction scheme, which includes two agents: a human and a computer. A feedback loop is created between the two agents, which sense the external world using natural and artificial sensors, respectively. Both human effectors and computer actuators disturb the environment and act upon it

An example of this interaction loop can be demonstrated with a simple user-control case. When a user presses a computer keyboard key, the system (i) senses the mechanical force applied to the key and assigns that action to a specific programmed instruction – a real-world signal is translated into a machine-world signal – and, in turn, the system (ii) maps that instruction to a specific symbol – a machine-world signal is translated into a real-world signal. The result is the visual feedback of the assigned character on the computer screen, which can guide the user in future actions.

Thus, to foster human–computer interaction, a system should be equipped with (i) the ability to sense the external environment, i.e., the capability to convert some kind of physical energy, e.g., kinetic energy or a biosignal, into electricity and then codify that input into digital data so that it can be recorded, analyzed, and manipulated, and (ii) the ability to actuate on the external environment, i.e., the capability to convert digital data into some form of energy that can be perceived by a human being, e.g., visual, sound, or mechanical cues. Sensing and actuating are the specifications that allow a system to be controlled, to report its current state, and to guide the user towards the next possible actions.
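The sense–process–actuate loop described above can be sketched in a few lines of code. The following is a minimal illustration of the keyboard example, not part of any system described in this entry; all function names are the sketch's own.

```python
# A minimal sketch of the sense -> process -> actuate interaction loop,
# using the keyboard example from the text. Illustrative only.

def sense(key_press: str) -> int:
    """Transducer: translate a real-world signal (a key press)
    into a machine-world signal (a key code)."""
    return ord(key_press)

def process(key_code: int) -> str:
    """Map the machine-world signal to a programmed instruction:
    here, look up the symbol to display."""
    return chr(key_code)

def actuate(symbol: str) -> str:
    """Actuator: translate the machine-world signal back into a
    real-world signal the user can perceive (on-screen feedback)."""
    return f"displayed: {symbol}"

# One pass through the feedback loop: the user presses "a", and the
# resulting visual feedback can guide the user's next action.
print(actuate(process(sense("a"))))  # displayed: a
```

In a real interactive system this loop runs continuously, with the actuator's output altering the environment that the sensor measures next.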

In Bongers (2000), both the human’s and the machine’s memory and cognition are essential components in building the interaction loop. In an interactive system, the “conversation” between the human and the system should ideally be mutually influenced by the intervention of both the human’s and the machine’s memory and cognition, permitting interaction with information, the changing of the environment, and thereby the altering of the subsequent information received back by the system.

Interaction Modalities

In Bongers (2002), human–computer interaction makes use of what the author refers to as interaction modalities – communication channels between a human and a system. These involve (i) input modalities, which rely on senses such as seeing, hearing, smelling, tasting, and touching, used to explore the surrounding environment, and (ii) output modalities, mainly involving the motor system, e.g., handwriting, speaking, or moving things around, used to act on the environment.

Biosensing, as a measurement of the human psychophysiological activity and its use as a strategy for controlling parameters in the domain of a digital interactive system, is an interaction modality of special interest for the present study.

In detail, human psychophysiological activity relates to the brain, skeletal and cardiac muscles, and also skin functions, all of which generate electrical potentials. These signals can be measured by electrodes and used to control a digital interactive system. Biosensing captures biosignals (Arslan et al. 2005; Ortiz-Perez et al. 2011; Ortiz et al. 2015) by detecting, measuring, and translating the electrical potentials of the human nervous and motor system functions, e.g., electromyographic signals, measured on the skin, which are related to muscle activity, and electroencephalographic signals, measured on the scalp, which are related to brain activity. Recent sensor technology has been developed to detect and measure these functions, notably to support medical care (Stern et al. 2001; Cacioppo et al. 2007; Webster and Eren 2017).

Beyond medical applications, biosensor technology has been attracting the attention of interactive artists who have been increasingly adopting this technology to control parameters of interactive digital systems.

Towards a User-Centered Taxonomy of Biosensing

Existing Taxonomic Perspectives

A wide range of taxonomic perspectives on biosensing technology, rooted in different disciplines and applications, has been proposed in the related literature. In Horowitz and Hill (1989), sensor technologies are organized technically according to their electronic circuit design, and in Sinclair (2000) according to the kind of physical energy measured by a sensor, i.e., mechanical, gravitational, electrical, thermal, or magnetic. In Bongers (2000), referring to the design of interactive musical systems, sensor technologies are categorized by the ways humans can change the state of the surrounding environment, pointing to output modalities mainly related to muscle actions, which result in mechanical movement, airflow, or sound production.

In game research, Kivikangas et al. (2011) present a taxonomic review of application scenarios of biosignals as a way to assess game experience arising from emotional reactions, mainly related to the valence and arousal dimensions; in Leite et al. (2000), Kleinsmith et al. (2003), and Bernhaupt et al. (2007), taxonomies are presented that take into account factors such as players’ affective responses to game playing.

Taking into account the use of biofeedback to control game mechanics, Nacke et al. (2011), Pedersen et al. (2010), Figueiredo and Paiva (2010), and Nogueira et al. (2016) formalize biofeedback game mechanics with respect to players’ emotional states, modeling player experience to drive interactive content creation.

Table 1 lists biosensor technologies commonly adopted in the interactive art domain and details the nature of the psychophysiological electrical potentials each measures. The sensors were selected based on their common application in interactive arts, availability, low cost, miniaturization, and quick integration with software applications.
Table 1

Commonly used sensor technologies in the field of interactive arts and their respective measurable phenomena expressed in hertz (Hz)

Sensor technology               Measurable phenomena                                    Frequency (Hz)

GAZE (gaze interaction)        Position, movement, and pupil dilation of gaze,         –
                               with a sensor located on the screen

EMG (electromyography)         Activation of facial or body muscle tissues             –

RESP (respiration)             Chest breathing rate and volume                         Measured in extension capacity

TEMP (temperature)             Thermal feedback                                        Up to 5000

ECG (electrocardiography)      Electrical activity of the heart                        –

HRV (heart rate variability)   Time difference between two sequential heartbeats       HF (0.15–0.40); LF (0.04–0.15)

EOG (electrooculography)       Eye motion analysis with a body-worn sensor             DC to 10

EDA (electrodermal activity)   Tonic level of electrical conductivity of the skin      –

EEG (electroencephalography)   Electrical changes on the scalp                         –


The measurement ranges of the ECG, EEG, EMG, TEMP, and EOG sensors are given in Aller et al. (2000), of EDA in da Silva et al. (2014), and of HRV in Bakhtiyari et al. (2017). The GAZE and RESP sensors have different response characterizations: the latter is measured as an extension-capacity range, e.g., from 35% to 65%, and the former’s accuracy depends on the average angular distance from the actual gaze point

A User-Centered Taxonomy of Biosensing

The authors propose a user-centered taxonomy of biosensor technology to assess the broader picture on the use of biosensing technologies as a strategy for controlling parameters in the virtual domain of a digital interactive system. In pursuing such a user-centered perspective over biosensing technology, the aim is to extend existing taxonomies beyond the technical specifications of the sensors, thus promoting a fluid and intuitive use of such technology by interactive artists.

In greater detail, the proposed user-centered taxonomy guides the process of selecting the most suitable sensor technology for a specific task based on two dimensions: (i) the degree of control over psychophysiological human functions, i.e., the ability subjects have to manipulate their own psychophysiological activity and consequently alter the sensor response, and (ii) the temporal variability, i.e., the rate of noticeable temporal change in the captured electrical potentials. For example, the authors’ taxonomy provides an answer to an artist who aims to use biosignals to control long-term digital game mechanics, such as the time of day or the weather conditions, by pointing to a sensor with a low temporal variability response and an indirect type of control.
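As an illustration, this selection step can be sketched as a lookup over the two dimensions. The sensor placements below follow the two clusters discussed later in the text; the table and function names are this sketch's own simplification, not a normative part of the taxonomy.

```python
# Hedged sketch: selecting a biosensor by the taxonomy's two dimensions
# (type of control, temporal variability). The placements below are an
# illustrative simplification of the clusters discussed in the text.

TAXONOMY = {
    # sensor: (type of control, temporal variability)
    "EEG":  ("indirect", "low"),
    "HRV":  ("indirect", "low"),
    "EDA":  ("indirect", "low"),
    "TEMP": ("indirect", "low"),
    "RESP": ("direct",   "high"),
    "EMG":  ("direct",   "high"),
    "GAZE": ("direct",   "high"),
}

def suggest_sensors(control: str, variability: str) -> list[str]:
    """Return the sensors matching the requested control type
    and temporal variability."""
    return [s for s, dims in TAXONOMY.items() if dims == (control, variability)]

# An artist mapping biosignals to long-term game mechanics (e.g., the
# weather) needs indirect control and low temporal variability:
print(suggest_sensors("indirect", "low"))
```

A real selection would also weigh the body-intrusion dimension discussed below, which this two-key lookup deliberately omits.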

Figure 2 shows the user-centered taxonomy of biosensing technology for interactive art contexts. It includes the biosensor technologies listed in Table 1. The two dimensions of the taxonomy, i.e., the temporal variability of the psychophysiological function and the type of control over that function, are assigned to the horizontal and vertical axes, respectively.
Fig. 2

Physiological measures distributed along two axes: on the horizontal axis, the temporal variability of the stimuli/sensor response, and on the vertical axis, the level of direct control over the electrical potentials generated. The authors denote two clusters, A and B, grouped according to the sensors’ type of control and response. The measurable phenomena captured by the different sensors are explained in Table 1

The horizontal axis reports the degree of temporal variability, from low to high. For example, a GAZE sensor has high variability, as it measures eye movement, which can naturally be very fast. On the other side of the spectrum, the EEG sensor has much lower variability, as the brainwaves it measures can have a slower rate of change. The temporal variability is related to the measurable phenomena, expressed in hertz, presented in Table 1.

The degree of control subjects have over their psychophysiological functions is denoted on the vertical axis and expressed on a scale from direct to indirect control. In greater detail, the scale reports the degree of control humans have over their psychophysiological functions and their ability to deliberately alter the response of the captured data. For example, humans have direct, explicit control over a muscle impulse, captured by an EMG sensor, and more indirect, implicit control over skin conductivity, measured by an EDA sensor.

In Fig. 2, a diagonal disposition of the biosensors can be identified, showing that the horizontal and vertical axes are intertwined: the faster the responses obtained from the sensors, the more direct the control users have over their measures and evolution. From this tendency, the authors highlight two overlapping clusters. The overlap is due to the fact that some sensor technologies have slow-changing responses that can nevertheless be altered by a sudden change in the environment they are measuring. An example is the TEMP sensor, which typically has a slow response, but into which the user can induce a more immediate response by blowing air.

One remaining dimension deserves consideration here: body intrusion. Despite its relevance to the choice of biosensors for a particular task, the authors believe that the miniaturization of sensor technology will eventually make it ubiquitous and pervasive in all artistic application scenarios. Even so, interactive artists must be aware of the pertinence of this dimension when building interactive content.

Application Scenarios of a User-Centered Taxonomy

Based on the two clusters identified in Fig. 2, the authors now discuss typical mappings between biosensor technology and interactive content in two main domains: game design and game audio. The authors believe the same principles can be applied to other forms of biosensing-driven interactive art, such as interactive music or performance art.

In game design, biosensors from cluster A are typically applied to control implicit, slow-adapting, or background aspects of a game, such as level-generation conditions. For example, the slow-changing HRV function can be mapped to evolve long-term game settings, such as the weather conditions, or to control artificial intelligence aspects of the game that are not so noticeable. Similarly, the slow pace of an EEG measure can be used to define artificial intelligence parameters such as nonplayer characters’ level of reaction to the player’s presence. For creating interactive game audio, the sensors from cluster A allow the control of higher-level musical features, such as tempo, i.e., a faster or slower musical tempo expressed in bpm (beats per minute), or the soundtrack’s general mood, i.e., a more tense or relaxed type of music can be mapped to an HRV sensor.
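A cluster-A mapping of this kind can be sketched as a simple rescaling of a slow-changing signal onto long-term parameters. The HRV feature, value ranges, and parameter names below are illustrative assumptions of this sketch, not values given in the entry.

```python
# Hedged sketch: mapping a slow-changing cluster-A signal (HRV) onto
# long-term game settings. The feature (RMSSD), its range, and the
# target parameters are illustrative assumptions.

def map_range(x, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale x from [in_lo, in_hi] to [out_lo, out_hi],
    clamping x to the input range first."""
    x = min(max(x, in_lo), in_hi)
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

def hrv_to_game_state(rmssd_ms: float) -> dict:
    """Map an HRV feature (RMSSD, in ms) to background game settings.
    A lower HRV (a more aroused player) yields heavier cloud cover
    and a faster soundtrack tempo."""
    cloud_cover = map_range(rmssd_ms, 20.0, 100.0, 1.0, 0.0)
    tempo_bpm = map_range(rmssd_ms, 20.0, 100.0, 140.0, 80.0)
    return {"cloud_cover": round(cloud_cover, 2), "tempo_bpm": round(tempo_bpm)}

print(hrv_to_game_state(60.0))  # {'cloud_cover': 0.5, 'tempo_bpm': 110}
```

Because the signal changes slowly, such a mapping would typically be applied to a smoothed value over a window of seconds to minutes rather than to raw beat-to-beat readings.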

Biosensors from cluster B are better adapted to control explicit interactions or foreground actions. For example, an EMG sensor, which measures fast-changing facial or body muscle activation, is well adapted to control rapid player actions, e.g., defining the impulse of a character’s jump. The fast-response and highly controllable RESP sensor can be used to define the number of enemies when the player is trying to accomplish an undercover mission. For interactive game audio, cluster B is better adapted to control low-level sound features, such as the audio level of a given sound effect; e.g., a GAZE sensor can be used to raise the audio level of an observed game object to focus the player’s attention, or to define the location of a sound event within the stereo image.
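Cluster-B mappings of this kind can be sketched as direct, per-frame transfer functions. The thresholds, normalized coordinates, and function names below are illustrative assumptions of this sketch.

```python
# Hedged sketch: mapping fast, directly controlled cluster-B signals
# to foreground actions. Thresholds and names are illustrative.

def emg_to_jump_impulse(emg_amplitude: float, max_amplitude: float = 1.0) -> float:
    """Scale a muscle-activation reading to a jump impulse in [0, 1]."""
    return min(emg_amplitude / max_amplitude, 1.0)

def gaze_to_gain(gaze_x: float, object_x: float, width: float = 0.2) -> float:
    """Raise the audio level of a game object as the gaze point
    (normalized screen coordinate) approaches it; the gain falls
    off linearly to zero beyond `width` of the object."""
    distance = abs(gaze_x - object_x)
    return max(0.0, 1.0 - distance / width) if distance < width else 0.0

print(emg_to_jump_impulse(0.8))           # 0.8
print(round(gaze_to_gain(0.55, 0.5), 2))  # 0.75
```

Unlike the cluster-A sketch, these functions would be evaluated on every frame or audio block, since the underlying signals change quickly and the player controls them deliberately.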


Conclusions

The authors presented a user-centered taxonomy of biosensing for interactive arts, whose aim is to provide artists with a framework to assess the broader picture on its use as a strategy for controlling parameters in a digital interactive system. In pursuing such a user-centered perspective over biosensing technology, the authors sought to extend existing taxonomies beyond the technical specifications of the sensors, thus promoting a fluid and intuitive use of such technology by interactive artists. By providing use-case examples that discuss typical mappings between biosensor technology and interactive content creation in domains such as game design and game audio, the authors intended to validate their user-centered taxonomy of biosensing in interactive art.



  1. Aller, M., Fox, S., Pennell, J.: Measurement, Instrumentation and Sensors Handbook. CRC Press LLC, New York (2000)
  2. Arslan, B., Brouse, A., Castet, J., Filatriau, J.-J., Léhembre, R., Noirhomme, Q., Simon, C.: From biological signals to music. In: 2nd International Conference on Enactive Interfaces, Genoa (2005)
  3. Bakhtiyari, K., Beckmann, N., Ziegler, J.: Contactless heart rate variability measurement by IR and 3D depth sensors with respiratory sinus arrhythmia. Procedia Comput. Sci. 109, 498–505 (2017)
  4. Bernhaupt, R., Boldt, A., Mirlacher, T., Wilfinger, D., Tscheligi, M.: Using emotion in games: emotional flowers. In: Proceedings of the International Conference on Advances in Computer Entertainment Technology, Salzburg, pp. 41–48. ACM (2007)
  5. Bongers, B.: Physical interfaces in the electronic arts. In: Trends in Gestural Control of Music, pp. 41–70. IRCAM-Centre Pompidou, Paris (2000)
  6. Bongers, B.: Interactivating spaces. In: Proceedings of the Symposium on Systems Research in the Arts, Informatics and Cybernetics, Barcelona (2002)
  7. Cacioppo, J.T., Tassinary, L.G., Berntson, G.: Handbook of Psychophysiology. Cambridge University Press, Cambridge (2007)
  8. Candy, L., Ferguson, S.: Interactive experience, art and evaluation. In: Candy, L., Ferguson, S. (eds.) Interactive Experience in the Digital Age. Springer Series on Cultural Computing. Springer, Cham (2014)
  9. da Silva, H.P., Guerreiro, J., Lourenço, A., Fred, A.L.N., Martins, R.: BITalino: a novel hardware framework for physiological computing. In: PhyCS, pp. 246–253 (2014)
  10. Edmonds, E., Turner, G., Candy, L.: Approaches to interactive art systems. In: Proceedings of the 2nd International Conference on Computer Graphics and Interactive Techniques in Australasia and South East Asia, pp. 113–117. ACM (2004)
  11. Figueiredo, R., Paiva, A.: “I want to slay that dragon!” – influencing choice in interactive storytelling. In: Joint International Conference on Interactive Digital Storytelling, pp. 26–37. Springer (2010)
  12. Horowitz, P., Hill, W.: The Art of Electronics. Cambridge University Press, Cambridge (1989)
  13. Kivikangas, J.M., Chanel, G., Cowley, B., Ekman, I., Salminen, M., Järvelä, S., Ravaja, N.: A review of the use of psychophysiological methods in game research. J. Gaming Virtual Worlds. 3(3), 181–199 (2011)
  14. Kleinsmith, A., Fushimi, T., Takenaka, H., Bianchi-Berthouze, N.: Towards bidirectional affective human-machine interaction. J. Three Dimens. Images. 17, 61–66 (2003)
  15. Leite, I., Pereira, A., Mascarenhas, S., Castellano, G., Martinho, C., Prada, R., Paiva, A.: Closing the loop: from affect recognition to empathic interaction. In: Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments, pp. 43–48. ACM (2000)
  16. Nacke, L.E., Kalyn, M., Lough, C., Mandryk, R.L.: Biofeedback game design: using direct and indirect physiological control to enhance game interaction. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 103–112. ACM (2011)
  17. Nogueira, P.A., Torres, V., Rodrigues, R., Oliveira, E., Nacke, L.E.: Vanishing scares: biofeedback modulation of affective player experiences in a procedural horror game. J. Multimodal User Interfaces. 10(1), 31–62 (2016)
  18. Ortiz, M., Grierson, M., Tanaka, A.: Brain musics: history, precedents, and commentary on Whalley, Mavros and Furniss. Empir. Musicol. Rev. 9(3–4), 277–281 (2015)
  19. Ortiz-Perez, M., Coghlan, N., Jaimovich, J., Knapp, R.B.: Biosignal-driven art: beyond biofeedback. Ideas Sonicas/Sonic Ideas. 3(2) (2011)
  20. Pedersen, C., Togelius, J., Yannakakis, G.N.: Modeling player experience for content creation. IEEE Trans. Comput. Intell. AI Games. 2(1), 54–67 (2010)
  21. Sinclair, I.: Sensors and Transducers. Elsevier (2000)
  22. Stern, R.M., Ray, W.J., Quigley, K.S.: Psychophysiological Recording. Oxford University Press, Oxford (2001)
  23. Webster, J.G., Eren, H.: Measurement, Instrumentation, and Sensors Handbook: Spatial, Mechanical, Thermal, and Radiation Measurement. CRC Press, Boca Raton (2017)

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. FEUP, University of Porto, Porto, Portugal
  2. INESC-TEC and FEUP, University of Porto, Porto, Portugal
  3. INESC-TEC, Porto, and University of Aveiro, Aveiro, Portugal