
1 Background

Affective interactive installations have become an exploratory field with the development of emotion recognition technology. With technologies such as EEG, facial affect detection and body gesture recognition, affective interactive installations are used in many fields. In media, detected emotions can be reflected in the media content in real time (Altieri et al. 2019). In healthcare, affective interactive installations can be used to treat children with ADHD (Adina et al. 2020). In search and rescue, they can be essential for appearance-constrained robots (Bethel and Murphy 2010). Affective interactive installations meet human emotional and spiritual needs and explore future possibilities of interaction (Bialoskorski et al. 2010).

The boom of affective interaction has in turn called for greater diversification of interfaces. For example, the installation “Mood Swings” uses luminous orbs to present changes in users’ moods (Bialoskorski et al. 2010). In another instance, moods are detected and reflected on a digital board (Altieri et al. 2019). Other forms of interface, such as an interactive landscape with oriented screens (Herruzo and Pashenkov 2020), a virtual electronic portrait of Kenji Miyazawa that interacts with users, and smart textiles combined with an affective interaction installation (Jiang et al. 2020), show the potential of diverse interfaces.

Affective interactive installations with unimodal interfaces have thus proactively contributed to exploring the potential of interactivity. However, a unimodal interface limits the composition of multi-layered interaction, which plays an important role in interactive installations. The research of L. Mignonneau and C. Sommerer indicates that multi-layered interaction can raise users’ engagement with installations, which develops through the interactive process (Mignonneau and Sommerer 2005).

Based on their research, we believe that layered interaction provides users with a smooth and immersive experience in affective interactive installations by establishing a structured interactive system, which in turn calls for diverse interfaces that provide richer hierarchical interaction.

One convincing model for structured emotional engagement between human and computer is the theory put forward by John McCarthy and Peter Wright (McCarthy and Wright 2004). In their theory, a good setting of the “four threads” helps users have a better experience. The four threads, named the “compositional thread”, “sensual thread”, “emotional thread” and “spatio-temporal thread”, relate respectively to the composition of experiential hierarchies, the user’s preference, the user’s impression of the installation, and the atmosphere of the installation. Their theory provides a potentially viable model for building a layered interactive system and diverse interfaces to improve users’ experience in an affective interactive installation.

2 Proposal

To provide an immersive experience for users in affective interactive installations, interfaces can be diversified to complete the “four threads”, which can be achieved through multi-dimensional input data and output media. At the same time, the various inputs and outputs should be organized as an organic system, composing a layered experiential sequence.

Among the “four threads”, the “compositional thread” can be constructed by advancing the depth of interaction. Constructing an experiential sequence leads to layered experience hierarchies, which encourages users to proactively reflect on the clues in the process and spontaneously interact with the installation. The “spatio-temporal thread” can be strengthened by mobilizing an engaging atmosphere and creating a mentally all-enveloping environment that redefines users’ experience of the world. The experience of the “sensual thread” and “emotional thread” can be enriched by creating an engaging atmosphere through the architecture of the physical space.

Both composition of experience hierarchies and atmosphere call for diverse interfaces of physical and computational setup. With diverse actions of interfaces, layered experiential hierarchies are possible, which creates the narrative of the experiential sequence. Moreover, multiplicity of physical interfaces leads to richer actions of the installation, which helps to create an engaging atmosphere in a physical-digital space, so that users are able to interact with the installation in a more immersive and more playful way.

The diversification of interfaces derives from two aspects: increased dimensions of input data, and diversification of output media. For input data, enriching the sources increases their diversity; besides human emotions, human behaviors can be a source of data, and multi-dimensional analysis that decomposes existing data is also viable. For output media, adding diverse actuators to the installation and increasing the range of their possible actions leads to dynamic behaviors of the installation, making the experience richer.

The various input data and output media should be organized hierarchically. To integrate the various elements into the reaction system, a mechanism is proposed, as shown in Fig. 1.

Fig. 1.
figure 1

An illustration of reaction mechanism for our installation

3 Reaction Mechanism

According to the proposal, we created an affective interactive installation that provides an immersive experiential sequence of emotional interaction by deploying diverse interfaces, for which a reaction mechanism is set (Fig. 1).

To create smooth and dynamic behaviors of the installation, which are supposed to take different forms in response to different mood states of users, an organic form is applied to the installation (Fig. 2).

Diverse input data and output media are deployed in the installation. Multi-dimensional data are collected by sensors, analyzed by an emotion recognition system, and directed to actuators as outputs, which perform a variety of actions in response to different human behaviors. The actions are integrated into a logically organized hierarchical experience.

Fig. 2.
figure 2

The form generation process of prototype

A typical interaction experience process includes four parts according to the depth of interaction.

  1.

    The installation is awakened. When a user approaches the sensor, the installation is activated, and the actuators responding to the user’s location come into operation.

  2.

    The user’s emotions are reflected. The sensor captures photos of the user’s facial expression, and the emotion recognition system converts each photo into emotional data, prompting the actuators to perform actions in response to the user’s mood state.

  3.

    The user’s emotions are continuously exchanged. When the user perceives the actions of the installation, their mood may change, which shows in their facial expression, and the installation responds in another way. In this way, emotions are transmitted circularly in a continuous interaction between the user and the installation.

  4.

    The user stops interacting. When the user leaves the installation, their mood is no longer affected by it.
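The four-phase process above can be summarized as a small state machine. A minimal sketch in Python follows; the phase names and the `step` function are illustrative and not part of the installation's actual control code.

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()        # no user nearby (phase 4 / before phase 1)
    AWAKENED = auto()    # user detected, installation activated (phase 1)
    EXCHANGING = auto()  # emotions read and reflected in a loop (phases 2-3)

def step(phase: Phase, user_present: bool, face_detected: bool) -> Phase:
    """Advance the interaction one step, following the four phases above."""
    if not user_present:
        return Phase.IDLE                # phase 4: the user leaves
    if phase is Phase.IDLE:
        return Phase.AWAKENED            # phase 1: the installation wakes
    if phase is Phase.AWAKENED and face_detected:
        return Phase.EXCHANGING          # phase 2: emotions are reflected
    return phase                         # phase 3: continuous exchange

# A user approaches, shows their face, keeps interacting, then walks away.
trace = []
phase = Phase.IDLE
for present, face in [(True, False), (True, True), (True, True), (False, False)]:
    phase = step(phase, present, face)
    trace.append(phase)
```

The sensor readings (`user_present`, `face_detected`) stand in for the infrared sensor and camera described in Sect. 4.3.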

4 Prototype

4.1 Integrated Physical-Digital Setup

According to the proposed mechanism, we aimed to create an affective installation that provides an immersive experiential sequence of emotional interaction. To provide a mentally all-enveloping environment and highlight the interactivity of the installation, the overall structure should create an immersive atmosphere while also becoming one of the users’ emotional inputs. Thus, a form in which emerging and dispersing spheres are distributed in space was created (Fig. 3).

The overall installation is composed of four parts: the support structure, the transparent glass bodies (M1), the frosted glass bodies (M2) and the mechanical flowers. Among the flowers, those placed directly on the structure (M3) are interactive. Their positions are shown in Fig. 3.

The support structure serves as a carrier for the other components. Instead of using a solid model, we chose layered polyethylene plates in different shapes to increase visual transparency, allowing users to perceive the dynamic behaviors of the whole installation (Fig. 4). M1 are hanging or supported transparent glass bodies, which contain the mechanical flowers used to express the other six emotions. M2 are suspended or supported frosted glass bodies, which contain mechanical flowers serving as light fixtures. M3 are interactive mechanical flowers placed directly on the structure, used to express the user’s dominant emotion. Through the arrangement of the glass bodies, the interactive mechanical flowers are distinguished from the others.

Fig. 3.
figure 3

Schematic diagram of prototype composition

Fig. 4.
figure 4

The scene of interaction

Seven mechanical flowers are installed as an interactive group: one presenting the dominant emotion and six presenting the other six emotions. Five such sets are placed to provide interactive experiences for multiple people simultaneously.

4.2 The Design of Mechanical Flower

In order to create an engaging atmosphere through the decoration of the physical space, the actuators of the installation are designed as mechanical flowers (Figs. 5 and 6).

There are two types of mechanical flowers: positively placed and inversely placed. Both types are composed of a control shaft, a circular ring, petals and a component box containing a servo and a stepper motor.

The petals consist of outer and inner petals. The outer petals, made of translucent silica gel, serve as the main visual form of the flower, while the inner petals, made of wound aluminum wire, are actuated by the circular ring, driving the outer petals to open or close. An RGB LED is placed in the center of the petals.

Fig. 5.
figure 5

Construction of positively placed flowers

Fig. 6.
figure 6

Construction of inverted flower

The control shaft contains a main shaft and a circular ring. Fixed on the component box, the main shaft supports the double-layer petals, a thread (or strip) and an RGB LED. The servo in the component box controls the circular ring, while the stepper motor drives the rotation of the main shaft, thereby controlling the overall rotation of the mechanical flower (Fig. 7). The component box is designed to be hidden in the support structure, leaving only the main shaft and petals visible.

Fig. 7.
figure 7

The mechanism of rotating and opening of positively placed and inverted flowers

When the flower is positively placed, the ring is outside the petals. The ring is driven down by the strip connected to the servo, and the petals open under gravity. When the flower is inverted, the ring is inside the petals; when the ring is pulled up by the wire, the inner petals are opened by the ring.

4.3 The Process of Emotional Interaction

We used a camera and infrared sensors to capture data, programs in the Arduino IDE and PyCharm as the control system, and servos and stepper motors connected to the mechanical flowers as actuators. In this way, we set up a series of actions for the mechanical flowers and integrated them into a layered experiential sequence (Fig. 8).
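The Python side of this setup can be sketched as a short pipeline: capture a frame, run the recognizer, and serialize the result for the Arduino. The recognizer below is a stub returning fixed percentages, and the comma-separated message format is an assumption for illustration, not the actual protocol used in the installation.

```python
# Fixed emotion order so the firmware can parse values by position.
EMOTIONS = ["angry", "disgusted", "scared", "happy", "sad", "surprised", "neutral"]

def recognize(frame) -> dict:
    """Stand-in for the neural-network recognizer: returns one percentage
    per emotion. A real implementation would run inference on the frame."""
    return {"angry": 0.05, "disgusted": 0.02, "scared": 0.03, "happy": 0.62,
            "sad": 0.08, "surprised": 0.15, "neutral": 0.05}

def to_message(percentages: dict) -> str:
    """Serialize the seven percentages into one newline-terminated line
    for the Arduino (hypothetical wire format)."""
    return ",".join(f"{percentages[e]:.2f}" for e in EMOTIONS) + "\n"

frame = None  # a camera frame would go here
msg = to_message(recognize(frame))
# On the real installation this would be sent over serial,
# e.g. serial_port.write(msg.encode()) with pySerial.
```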

Fig. 8.
figure 8

Illustration of the interaction process between mechanical flowers and users

  a.

    The installation is awakened. When a user approaches an interactive mechanical flower and the infrared sensors detect that the distance between the user and the flower is less than 1 m, the Arduino program enables the servos on the mechanical flowers that present the user’s emotions. A ring with a strip or thread is connected to the servo, so the servo can lift or lower the ring on the flower, controlling the opening or closing of the petals.

  b.

    The user’s emotions are reflected. When the petals open, the camera is activated and captures an image of the user’s face. The image is transmitted to an emotion recognition program in PyCharm, based on facial recognition and a trained neural network, which analyzes the characteristics of the image and outputs the percentages of seven representative emotions (angry, disgusted, scared, happy, sad, surprised, neutral). The percentages are sent to the Arduino, whose program controls the actions of the mechanical flowers. The emotion with the largest percentage is selected as the dominant emotion, which is presented by the interactive mechanical flowers placed on the structure, while the data of the other emotions are expressed by the mechanical flowers placed in the transparent glass bodies.

    The actions of the mechanical flowers in this phase include glowing, rotating and blinking. Glowing is applied to all seven mechanical flowers: corresponding to the seven emotions, we set seven colors for the RGB LEDs placed in the flowers, and when the Arduino receives the emotional data, each flower presents its corresponding color (Fig. 9). The correspondence between emotion and color is shown in Table 1.

Table 1. The relationship between different emotions and colors
Fig. 9.
figure 9

Colors of mechanical flower physical model (R, G, B were selected)

  • In addition, we set the brightness of the LEDs according to the percentage of each emotion: the higher the percentage, the greater the brightness.

  • Rotating and blinking are applied only to the mechanical flower that presents the dominant emotion. After the Arduino receives the emotional data, it commands the stepper motor to rotate. The stepper motor is fixed to the flower, so its rotation drives the flower to rotate at a speed controlled by the percentage of the dominant emotion. The blinking frequency is likewise controlled by this percentage. The relationship among the dominant emotion’s percentage, rotating speed and blinking frequency is shown in Table 2.

Table 2. Relationship between emotions’ percentages, rotating speed and blinking frequency
  c.

    The user’s emotions are continuously exchanged. When the user notices the actions of the flowers, their mood may change; for example, the blossoming or rotation of the flowers may bring a sense of surprise. The changed emotions are captured again and re-imported into the program, changing the actions of the mechanical flowers. This process can continue indefinitely, and the diverse actions of the mechanical flowers stimulate users’ moods in a multi-dimensional way, making the interactive process more playful and desirable.

  d.

    The user stops interacting. When the user leaves the installation and the infrared sensor detects no object for 15 s, the Arduino program turns off the LED and commands the servo to move the ring, closing the flower through the same mechanism by which it opens.
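The control logic described above can be sketched as a mapping from the recognized percentages to actuator commands. This is a minimal illustration only: the real emotion-color correspondence is given in Table 1 and the speed and blink-frequency relations in Table 2, so the constants below are placeholder assumptions.

```python
# Placeholder correspondences: the actual emotion-color mapping and the
# percentage-to-speed/frequency relations are defined in Tables 1 and 2;
# the numbers below are illustrative assumptions only.
COLORS = {"angry": (255, 0, 0), "happy": (255, 200, 0), "neutral": (255, 255, 255)}

def dominant(percentages: dict) -> str:
    # Step b: the emotion with the largest percentage drives the
    # interactive flower placed on the structure.
    return max(percentages, key=percentages.get)

def actuate(percentages: dict) -> dict:
    # Map the dominant emotion's percentage to LED brightness,
    # rotation speed and blink frequency (scaled linearly here).
    emo = dominant(percentages)
    p = percentages[emo]
    return {
        "emotion": emo,
        "color": COLORS.get(emo, (255, 255, 255)),
        "brightness": int(255 * p),  # higher percentage -> brighter LED
        "speed_rpm": 10 * p,         # rotation speed grows with percentage
        "blink_hz": 2 * p,           # blink frequency grows with percentage
    }

cmd = actuate({"angry": 0.1, "happy": 0.7, "neutral": 0.2})
```

On the installation, the resulting command would be applied to the flower's RGB LED, stepper motor and servo by the Arduino program.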

5 Experiment and Results

In order to verify whether diversified inputs and outputs indeed improve users’ experience, we conducted an evaluation experiment in VR with a focus group of 12 architecture students aged between 20 and 22, who experienced the installation under two different interaction settings. In the first setting, the prototype recognized emotion types only and turned them into different colors of the RGB LEDs. In the second setting, we additionally detected the user’s location, so that the petals would open or close, and detected both emotion types and percentages, so that the LEDs changed color and the mechanical flowers changed rotation speed.

After 5 min of interacting with the installation, subjects were interviewed about which experience was more immersive and why. Ten of the twelve agreed that the progressive experience was more attractive, that the rich feedback made them more immersed, and that the diverse expressions of the flowers made them more interested in changing their facial expressions to observe the changes. Through diverse interfaces, the installation became more impressive in the “sensual thread”, “emotional thread” and “spatio-temporal thread”, and the arrangement of the experiential process made the “compositional thread” richer.

In addition, we noticed that after a series of interactions, 75% of the subjects’ facial expressions gradually tended toward “happy”, achieving two-way feedback and influence between users and the installation.

6 Conclusion and Discussion

In this paper, we aimed to improve users’ experience in affective interactive installations through diverse interfaces. Based on our proposal and the theory of the “four threads”, we created an installation with diverse interfaces by diversifying the detected emotional data and directing different input data to diverse output media. The installation was built with an organic form, and mechanical flowers that could interact with users were integrated into it. Through the actions of the flowers, layered experience hierarchies were created and the atmosphere of the installation became engaging, giving users an immersive interactive experience of emotion.

To explore the feasibility of our proposal, an experiment was conducted. The positive feedback from the subjects on the rich experience shows that diversified inputs and outputs can indeed improve users’ experience. We hope that this conclusion can be applied to future installation design, inspiring designers to introduce diversified inputs and outputs into affective interactive installations, thereby providing more engaging and immersive experiences for users.

Additionally, two-way feedback and influence can be widely used to guide users’ emotions toward a better installation experience. We believe that this subtle guidance will also inspire the future design and application of affective interactive installations.