
1 Introduction

In 1950, Turing proposed a test for machine intelligence, arguing that if a machine can make humans believe it is human, then it is intelligent. Shortly after, artificial intelligence, industrial robots, and chatbots were developed, beginning a dialogue between designers and technology that has expanded exponentially. Today there are numerous technological tools that connect people with one another. Nevertheless, because of the COVID-19 pandemic and other factors, isolation and polarization are becoming more and more common in everyday life. The link between technological innovation and social segregation presents an ongoing issue that architects can address. Is there a way technology can help us strengthen verbal and non-verbal communication in the workplace to allow for better understanding between colleagues? Can it enhance the experience of designed spaces and generate stronger emotional connections?

This paper and the corresponding project explore the consequential role technology and materials play in making meaningful connections between people, architectural space, and the workplace. It argues that design can synergize with responsive technology and materials to open new possibilities for future workplace interaction design. We have created a spatial prototype paired with a series of simulations that act as a workplace critique. The project employs a responsive ceiling that combines a computational pattern with temperature-responsive bi-materials, which are thermochromically coated and electrically driven by micro-controllers. The ceiling is connected to computer code that interprets readings from wearable technology worn by the occupants. The methodology categorizes the simulation results into “Aroused States” and “Calm States”. As the computational patterns and colors change, we are made aware of the relationships between space, technology, and the human sensorium. This conversation brings insight into how we can design more effectively for workplace interactions.

The project and its simulations examine ongoing contextual awareness and demonstrate how a workplace can be designed more intelligently depending on the input and biological data of the user. They examine the role of the human brain in the interaction between the user, technology, and the environment. The project uses biosensors to capture the user’s bodily information and creates a synergistic relationship between this technology and an optically active surface material (thermochromic paint and a bi-material that rotates about the Z-axis in response to electrical input), connected to a piece of interactive, intelligent furniture that presents this information back to the users in real time. As users actively engage with the workspace and ceiling, their moods, emotions, and sense of well-being are brought to the forefront of the experience and of user awareness.

This information is displayed not virtually but physically, to stimulate non-verbal communication between people as a means to communicate more effectively, understand one another, and promote new experiences and meaningful encounters. Understanding is the key to empathy, which in turn leads to better communication between people in the workplace.

2 Forms of Communication: Verbal, Non-verbal, Sensorium

Neuroscientists are continuously exploring the connection between the brain and the sensorial channels through which we understand and perceive the world. For example, touching a textured material can change a person’s mood and influence the decisions a person makes.3 Touch also seems to be very important to a human’s well-being, and it has been found to convey compassion from one human to another. But perhaps the most interesting topic related to this theme is how the human sensorium can intensify emotional connections. To move further into an understanding of the sensorium, we can look at light quality and shadow. Technology has introduced both positive and negative effects to our lives. On one hand, it has had a positive impact in keeping us more informed and connected. On the other hand, it has created new disorders and conditions such as internet depression, FOMO (fear of missing out), and diminished comprehension and retention, affecting the morality of people, among other things4 (Fig. 1).

Fig. 1. Simulations of ceiling surface changes with multi-person non-verbal interaction using Houdini

Our human anatomy is becoming more and more in sync with technological devices. We have a multitude of options to help us through our daily lives in the workplace and beyond, options that offer convenience and keep us almost constantly connected to technology in one form or another. According to Andy Clark, a professor of philosophy and Chair in Logic and Metaphysics at the University of Edinburgh, “we are already cyborgs” or “human-machine hybrids”, meaning a merging of flesh and electronic circuitry without the need for wires, surgery, or bodily alterations. He argues that it is arbitrary to say that the mind is contained only within the boundaries of our brain, because it has always collaborated with external, nonbiological resources to solve the problems of human survival and reproduction. He states that “with the advent of texts, PCs, coevolving software agents, and user adaptive home and office devices, our mind is just less and less in the head”. In other words, the separation between the mind, the body, and the environment is seen as an unprincipled distinction.5 Following this line of thought, if we are already cyborgs and technology is increasingly contributing to that condition, we should enjoy the positive aspects and place boundaries around, reduce, or eliminate the negative ones. Moreover, if these opposites enable us to heighten experience and add value to a situation, it is worthwhile to design a product that amplifies this through technology, by filtering the received input data and balancing the output outcomes (Fig. 2).

Fig. 2. Operational diagrams of responsive material behaviour including process of mood application, wearable sensors, algorithm, human interaction, and subsequent physical material change.

3 Materials and Methods

A process that uses computational logic and new media as real-time predictive or interactive tools can be a sharp method for designers interested in workplace design, and this process of simulation also aids in creating a more robust prototype. Using digital tools such as micro-controllers, Rhino, Grasshopper, and Houdini, paired with physical testing, the project interrogates a workflow between computational design, production methods, and material logics. Through this process, it establishes a methodology that categorizes the test results into two separate morphological conditions, “Aroused States” and “Calm States”, as diagnosed by the wearable Upmood sensors. As users engage with the space, the computational patterns and colors change, and we become aware of the relationships between space, technology, and the sensorium.

The results here explore a range of scenarios, such as one-person interactions and group interactions, which produce insight into how the project could work at a larger scale in the built environment. As the simulations for the project began to expand, it became necessary to break them down into a series of categories that demonstrated how the results could impact the overall design and how the different simulations could affect our decision-making process for future projects. In doing this, we were interested in using the computer to assist in making decisions relative to the physical prototype. Simulating a person’s state of being (excited, calm, etc.) is difficult to do with a physical model, but much easier and more accessible with a series of digital models. To explore this further, we used a Houdini model to generate color-responsive patterns that could be simulated based on information gained from the Upmood algorithm. This was then paired with the Rhino model to explore the capacity for responsive geometry across the different material states.
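To give a concrete sense of the kind of logic these simulations encode, the following minimal Python sketch (an illustration, not the actual Houdini or Rhino setup) drives each ceiling unit’s rotation and color warmth from the state and proximity of nearby occupants; the intensities, falloff radius, and maximum rotation are assumed values.

```python
import math

# Illustrative stand-in for the Houdini/Rhino simulations: each ceiling
# unit's Z-axis rotation and colour warmth are driven by the state and
# distance of nearby occupants. Numbers and the falloff are assumptions.

STATE_INTENSITY = {"calm": 0.3, "aroused": 1.0}   # assumed normalized intensities

def unit_response(unit_xy, occupants, max_rotation_deg=90.0, radius=3.0):
    """Return (rotation_deg, warmth) for one ceiling unit.

    occupants: list of dicts like {"xy": (x, y), "state": "calm" | "aroused"}.
    Influence decays linearly to zero at `radius` metres from the occupant.
    """
    influence = 0.0
    for person in occupants:
        dx = unit_xy[0] - person["xy"][0]
        dy = unit_xy[1] - person["xy"][1]
        dist = math.hypot(dx, dy)
        if dist < radius:
            falloff = 1.0 - dist / radius
            influence = max(influence, STATE_INTENSITY[person["state"]] * falloff)
    rotation = max_rotation_deg * influence   # drives the bi-material leaf
    warmth = influence                        # drives the thermochromic colour ramp
    return rotation, warmth

# Example: a 2 x 2 patch of ceiling units and two occupants
occupants = [{"xy": (0.5, 0.5), "state": "aroused"},
             {"xy": (2.5, 0.5), "state": "calm"}]
for ix in range(2):
    for iy in range(2):
        print((ix, iy), unit_response((ix, iy), occupants))
```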

Developing this method was something we had not previously experienced, but it proved beneficial because the final version of the prototype was very expensive and time-consuming to construct and therefore required simulation to support decision-making. It required an intense amount of electronic manipulation and conversion of algorithmic data. The digital workflow allowed us not only to visualize before building but also to decide how we wanted this expensive prototype to be built in the end, which differed from a traditional furniture prototype. The interface, too, was difficult to test physically during its design. For this project, we felt that in some ways designing the “process” was more important than designing the finished product.

Along with the simulations, we started testing several different versions of bi-materials, including combinations of metals paired with natural materials, such as metal and paper together. A bi-material consists of layers of different materials, joined together, that vary in thickness and thermal properties; the classic example is a strip composed of two separate metals. Bi-materials are useful because they convert a temperature change into a recognizable form displacement and then, because of the differing material properties, revert to their initial position after the increase in temperature has subsided. The embodied energy of the materials becomes visible as energy is released and absorbed; this displacement becomes evident and is therefore very useful when working with temperature and electronic inputs.
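As a rough illustration of how a bi-material converts a temperature change into displacement, the sketch below applies the classical Timoshenko bimetal relation to estimate curvature and free-end deflection; the layer properties and dimensions are generic assumptions, not the values used in the project.

```python
# Rough sizing sketch (not the project's actual design calculation): the
# classical Timoshenko bimetal relation estimates how much a two-layer strip
# curves per degree of heating. Material values below are generic assumptions.

def bimetal_curvature(alpha1, alpha2, t1, t2, e1, e2, delta_t):
    """Curvature (1/m) of a two-layer strip heated by delta_t (Timoshenko, 1925).

    alpha: thermal expansion coefficients (1/K), t: layer thicknesses (m),
    e: Young's moduli (Pa). Layer 2 is the higher-expansion layer.
    """
    h = t1 + t2
    m = t1 / t2
    n = e1 / e2
    num = 6.0 * (alpha2 - alpha1) * delta_t * (1.0 + m) ** 2
    den = h * (3.0 * (1.0 + m) ** 2 + (1.0 + m * n) * (m ** 2 + 1.0 / (m * n)))
    return num / den

def tip_deflection(curvature, length):
    """Approximate free-end deflection of a cantilevered strip of given length."""
    return curvature * length ** 2 / 2.0

# Example: a 120 mm leaf, a thin steel layer on an aluminium layer, heated by 20 K.
k = bimetal_curvature(alpha1=12e-6, alpha2=23e-6,   # steel vs aluminium (assumed)
                      t1=0.1e-3, t2=0.2e-3,
                      e1=200e9, e2=70e9,
                      delta_t=20.0)
print(f"curvature: {k:.2f} 1/m, tip deflection: {tip_deflection(k, 0.12)*1000:.1f} mm")
```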

It was apparent that certain bi-metal materials were not going to work for this particular project because we were unsuccessful in laminating the materials together in a consistent way. The difference between the two coefficients of expansion of the bi-metals was also attractive at first because of the obvious visual response it produced, bending when heated and returning to the original state when cooled. In this way, the bi-metals behaved as expected; however, their response did not fall within the temperature range we were designing the heating interface around. In the end, the thickness of the various bi-metals we explored proved problematic because the correct thickness was not achievable at the scale of the ceiling installation in this particular setting.

It became clear that extremely thin metal combined with a paper-based product would offer the most performative value at this scale and was consistently the most successful combination. This was a significant breakthrough because the metal allowed for enough rigidity, and the paper-based material allowed for a more direct relationship between the thermochromic paint and the temperature differences. The resiliency of the material palette, both in terms of the thermochromic coatings and the responsiveness of the bi-material, was a critical factor. Our studies indicated that 500 W of power was enough to achieve our desired outcome, but the speed and rate of thermal expansion needed additional range to portray the full set of options the installation provided for human interaction. At 500 W, the bi-material starts to move, but at a slower rate. The higher the temperature, the more the metal deforms and the more responsive it is. The activation temperature for the thermochromic paint is 30 °C (Fig. 3).

Fig. 3. Material tests of individual units based on temperature and heat-responsive inks. The decision was made to utilize the morphologies that are marked and that bent most successfully with heat. Research indicated 40 s for the leaves to bend and 2 min for them to return to the initial position.

We utilized a coating of leuco water-based dye on the materials because this finish allows colors to change in the presence of temperature variation, a phenomenon known as thermochromism. The process is reversible up to 60 °C and irreversible from 60 °C onward. It allows for the mixing of shades between paints with the same turning point, though in this case no mixing was done. The color intensity depends on the designer’s needs, but the weight of the covering (paint coat) should be 2 to 3 times that of normal paint. The dyes are rarely applied to materials directly; they usually take the form of microcapsules with the mixture sealed inside. The dyes most commonly used are spirolactones, fluorans, spiropyrans, and fulgides. The acids include bisphenol A, parabens, 1,2,3-triazole derivatives, and 4-hydroxycoumarin; they act as proton donors, shifting the dye molecule between its leuco form and its protonated colored form, and stronger acids would make the change irreversible. We found that leuco dyes are available for temperature ranges between about −5 °C (23 °F) and 60 °C (140 °F).
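The sketch below captures the coating behaviour described above in a simplified form, assuming the dye shows its colour below the 30 °C activation point, clears above it, and is permanently altered once 60 °C is exceeded; the direction of the colour change is an assumption for illustration.

```python
# Simplified model of the coating behaviour described above, for discussion
# rather than as a chemistry reference: the dye is assumed to show its colour
# below the 30 degC activation point, clear above it, and latch into a
# permanent state once it has exceeded the 60 degC reversibility limit.

ACTIVATION_C = 30.0     # thermochromic activation temperature (from tests)
IRREVERSIBLE_C = 60.0   # above this the colour change no longer reverses

class ThermochromicCoat:
    def __init__(self):
        self.damaged = False            # latches once 60 degC is exceeded

    def state(self, temperature_c):
        if temperature_c > IRREVERSIBLE_C:
            self.damaged = True
        if self.damaged:
            return "permanently cleared"
        return "cleared" if temperature_c >= ACTIVATION_C else "coloured"

coat = ThermochromicCoat()
for t in (22, 35, 28, 65, 28):          # heating/cooling cycle in degC
    print(t, coat.state(t))
```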

4 Responsive Technologies

To monitor and evaluate how people perceive certain criteria, users wore emotion-sensing bracelets when they visited the project. In order to predict the potential overlap between the architectural and the sensorial, we worked with Upmood technologies to begin understanding how to measure and estimate the feelings people have in response to a given environment. The bracelets collect biodata from the user and distinguish 11 different emotional states: calm, pleasant, zen, sad, happy, excited, unpleasant, anxious, confused, challenged, and tense. This data was continuously fed into an App that revealed the different states back to the user. The evaluations and the use of these overlapping technologies acted as a way to gain insight into a more profound human experience. Through this process, the project addressed user insight relative to emotional patterns and their management (Fig. 4).

Fig. 4. Plan drawings of the surface incorporating color change based on temperature

Homeostasis, the tendency towards a relatively stable equilibrium between interdependent elements, especially as maintained by physiological processes, is a large part of human survival and is relevant to the theories of Antonio Damasio. In the paper The Nature of Feelings: Evolutionary and Neurobiological Origins, Damasio writes, “Survival depends on a homeostatic range”, and “feelings and experiences facilitate the learning of the conditions for homeostatic imbalances plus the anticipation of conditions. Feelings are mental experiences that accompany a change in the body state.” He goes on to write that external changes, “displayed in the exteroceptive maps of vision or hearing (sensorium)”, are perceived but largely not felt; they can trigger drives or emotions, causing a change in the body state that is subsequently felt. Human survival depends on homeostasis, the regulation of the body’s self-repair and defense. The body can regulate itself without the person having a feeling, or “conscious experience”. However, when the person does have a feeling, and is therefore aware of it, this facilitates the learning of a change in body state for a better prediction of future situations and thus increases behavioral flexibility. With these concepts in mind, this installation tries to increase felt experiences, using different stimuli to heighten the senses and also offering a record of what a user felt, to bring about potential self-awareness.

Wearing Upmood bracelets allowed us to monitor different emotional states in real time as we interacted with the project. What we found is that the technology can be sometimes accurate and sometimes surprising, indicating that our heart rates change in different ways and can be highly situational. It was an interesting part of our experiment because it allowed us to interact with a user in a way not normally accessed in architectural projects. The conclusions from these findings via the Upmood wearable technology indicate that stable peaks, highs, and lows in experience are crucial to accuracy, and that results can vary from person to person. There can also be discrepancies between what users “thought” they were feeling and what the technology actually indicated. Fluctuation in experience will also affect the results.

A relay module is used along with an LCD display to connect with a microcontroller and WiFi; this is in turn connected to a heating element and the surface hardware, along with other auxiliary components. To receive the information, the ESP32 microcontroller from Espressif must be connected to the internet through a WiFi network, which is used to communicate with the Upmood server and to receive firmware updates over the air (OTA). Once emotions are received from the server by the microcontroller, they are shown on the display, and the heating element is activated or deactivated by the relay module. For testing and demonstration purposes, the surface also has a manual override mode, in which the heating element can be activated or deactivated regardless of the information received from the server. The setup described here will be referred to as MoodSpace (Fig. 5).
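A MicroPython-style sketch of this control loop is shown below. It is an illustration only: the Upmood endpoint, JSON field names, credentials, and pin assignments are placeholders rather than the actual Upmood API or the firmware used in the project.

```python
# MicroPython-style sketch of the MoodSpace control loop described above.
# The endpoint URL, JSON fields, credentials, and pin numbers are placeholders
# for illustration; they are not the actual Upmood API or project firmware.
import network, urequests, machine, time

RELAY_PIN = 26                                            # assumed GPIO driving the relay
API_URL = "https://example-upmood-server/api/emotions"    # placeholder endpoint

# Aroused-state grouping (see the Calm/Aroused lists later in the paper)
AROUSED = {"happy", "excited", "unpleasant", "anxious", "confused",
           "challenged", "tense"}

def connect_wifi(ssid, password):
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect(ssid, password)
    while not wlan.isconnected():
        time.sleep(1)

def current_emotion():
    """Poll the server and return the latest emotion label (lower-case)."""
    resp = urequests.get(API_URL)
    data = resp.json()
    resp.close()
    return data.get("emotion", "calm").lower()            # assumed field name

def run(manual_override=None):
    relay = machine.Pin(RELAY_PIN, machine.Pin.OUT)
    while True:
        if manual_override is not None:                    # testing / demonstration mode
            heating_on = manual_override
        else:
            heating_on = current_emotion() in AROUSED
        relay.value(1 if heating_on else 0)                # drive the heating element
        time.sleep(10)                                     # poll interval (assumed)

# Usage: connect_wifi("<ssid>", "<password>"); run()
# or run(manual_override=True) to force the heating element on for demos.
```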

Fig. 5. Upmood bracelet extracted from https://techmash.co.uk/2018/08/29/upmood-wearable/#jp-carousel-99635. Testing setup, putting technology into play.

MoodSpace changes its behavior according to the emotions captured by the Upmood bracelet, reacting to two possible states: Calm and Aroused. In order for MoodSpace to react to emotions, the Upmood bracelet captures the heartbeat of the person wearing it; the Upmood App then receives this data and sends it to the Upmood Server, where it is analyzed by the algorithm and translated into emotions. In turn, MoodSpace communicates with the Upmood Server through a REST API (see Fig. 6), which returns the information of the linked people and their emotional states in JSON format. The entire process happens automatically and, as long as MoodSpace is connected to the internet and the App is receiving data from the bracelet and sending it to the server, the surface will react to the emotions of whoever is using it. The two states, “Calm” and “Aroused”, which are responsible for activating or deactivating the surface, are obtained by grouping the various emotional states captured by the bracelet as follows:

  • Calm: Calm; Pleasant; Zen; Sad.

  • Aroused: Happy; Excited; Unpleasant; Anxious; Confused; Challenged; Tense (Fig. 6).

    Fig. 6. Operational diagram illustrating the process of QR code, wearable, sensors, algorithm, server, and human interaction.

Therefore, when the emotion captured by the bracelet is identified as “Zen”, the project enters the “Calm” state, and so on. Although a person’s emotions change quickly depending on the situation, this two-state segmentation made it possible to produce a uniform and consistent response across all of the emotions within each state.
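A minimal sketch of this two-state segmentation, applied to a server response for several linked people, is shown below; the JSON structure and field names are assumptions for illustration, while the Calm/Aroused grouping follows the lists above.

```python
# Sketch of the two-state segmentation described above, applied to a response
# from the Upmood Server. The JSON structure and field names are assumptions;
# only the Calm/Aroused grouping comes from the project.
import json

CALM = {"calm", "pleasant", "zen", "sad"}
AROUSED = {"happy", "excited", "unpleasant", "anxious", "confused",
           "challenged", "tense"}

def to_state(emotion):
    e = emotion.lower()
    if e in AROUSED:
        return "aroused"
    return "calm"                       # Calm group, plus a default for unknown labels

def surface_should_activate(payload_json):
    """Activate the surface if any linked person is in an aroused state."""
    people = json.loads(payload_json)["people"]     # assumed field name
    return any(to_state(p["emotion"]) == "aroused" for p in people)

payload = '{"people": [{"name": "A", "emotion": "Zen"}, {"name": "B", "emotion": "Tense"}]}'
print(surface_should_activate(payload))             # -> True
```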

5 Conclusions

In this project, we explored the potential of workplace design relative to digital technologies and material interactions in the areas of bi-materials, thermal properties, and the human sensorium. We designed a small, spatial workspace project that cultivated a sensory experience between users, with the goal of facilitating better non-verbal communication between people in the workplace. Throughout the process, we asked questions about the capacity to design with technology and its subsequent impact on human beings. The work started by employing typical parametric and computational software, thinking about the potentials between the digital and the real, and then built on this potential by embracing the role of material manipulation and response to temperature as a way to interact with architectural space and people. The purpose was to engage people in sensory experiences that prompt us to re-perceive our physical world. The outcomes imply that, through research and design, stronger sensorial experiences can be used to increase awareness and perceptibility and to create new design conversations. The paper is meant to document and critique the process.

What we have learned offers a great deal of input relative to humanizing the design of everyday objects to allow a more heightened experience and relationship between human and object. Although there is much work to be done and this paper documents only one sample series of prototypes, the use of computer simulations allowed us to begin to predict and forecast how our design could respond to specific levels of human interaction. The use of bi-materials allowed us to create physical and visual responses that relate directly to human interaction, and the role of sensors and wearable technology allowed us to further dial in and program the project in a way that illustrates human connectivity and interaction in real time.

This project takes a sensory approach to emphasize the positive aspects of technology and reduce the negative ones. In doing so, it enhances the sensorial channels to provide users with new experiences and build emotional connections with one another. Furthermore, through non-verbal communication, this surface helps people to be open, honest, and integrated, stimulating real and meaningful rapport. In this way, people can understand each other better and communicate more effectively.