1 Introduction

A haptic device can stimulate a sense of virtual collision through electric, vibration, pneumatic, and vertical linear actuators, thus providing a user with a better experience in a Virtual Environment (VE). Various haptic devices have been developed to cover a wide area of the user’s body. An example is a haptic suit, a wearable device that provides haptic feedback to the upper body when a user collides with objects in a VE (Garcia-Valle et al. 2017; Delazio et al. 2018; Yang et al. 2021).

The majority of commercial haptic suits stimulate the torso and arms with vibration actuators spaced at regular distances and operated in a predetermined sequence. Users can receive haptic feedback such as the impact of a bullet or a fist. Prior research has applied such suits to military training simulation (Lindeman et al. 2006), serious games (Garcia-Valle et al. 2017), movies (Lemmens et al. 2009), and virtual games (Konishi et al. 2018; Al-Sada et al. 2020).

A vibro-haptic suit provides vibration cues to a user during virtual collisions. It has the advantages of being lightweight, affordable, and composed of only essential parts. An acousto-haptic suit is built around an audio signal and a vibration actuator. It is especially effective with stereo sound sources such as 3D games, music concerts, and military scenarios involving explosions and pressure damage. However, such suits produce unexpected haptic feedback when, for example, other players shoot without hitting the user. This inconsistency between the feedback and the virtual collision impairs the user’s immersion. In addition, the distributed pressure of a vibration actuator is usually weaker than the vertical pressure of a pneumatic actuator. Hence, pneumatic haptic suits have been developed to transmit a more powerful force to users in a short time, as these devices use compressed air at pressures above 200 kPa.

A haptic device stimulates many sensory receptors. Human cutaneous mechanoreceptors, namely the Meissner corpuscles, Ruffini endings, and Pacinian corpuscles in the epidermis and dermis, sense vibration and pressure. Specifically, the Pacinian corpuscle (a rapidly adapting type II mechanoreceptor) found at the edge of the dermis senses rapid changes in pressure and high-frequency vibration (200–300 Hz) and responds to sudden disturbances. Human proprioceptors, namely the muscle spindles and Golgi tendon organs in the muscles and tendons, sense stretch, speed of stretch, and muscle tension. Although a vibration actuator can stimulate these receptors, increasing the depth of vertical pressing is more difficult with a vibration actuator than with a pneumatic actuator, which makes the two actuator types feel different to the user. Although a pneumatic actuator can thus stimulate many proprioceptors, generating mixed sensory signals from mechanoreceptors and proprioceptors, it remains unknown which mixed sensory signals improve the user experience in specific VR simulations. To investigate this, researchers have developed haptic suits using various methods (Wang et al. 2020).

We reasoned that a multimodal haptic suit with acoustic and pneumatic actuators could improve the VR experience of users. We made small silicone air bladders (bellows) that inflate in the vertical direction and endure a high supply pressure (600–800 kPa), grouped them, and attached them to the front upper side of the PA suit to cover the entire chest (shown in Fig. 2). The small size of the actuator helps to create various haptic patterns that express sensations of virtual collision on the user’s body. Moreover, we used acoustic actuators to provide realistic vibrations from the audio signal of a VR Head Mounted Display (HMD). Therefore, the PA suit can express two impact types: strong impact without vibration and strong impact with realistic vibration. This is an advantage when the VR simulation presents a strong collision or generates a high-intensity audio signal that can be converted to vibration by an amplifier and vibration motor. Moreover, users expect strong pressing (40 N at an inflow time of 115 ms) in a virtual collision; a good example is a VR battle simulation in which the avatar is struck by an explosion and heavy shelling. The pneumatic hardware, which includes an air tank and valves, is too heavy to carry around in VR; hence, when we developed the haptic suit, it was difficult to choose a pneumatic actuator. To solve this problem, we incorporated a Weight Support System (WSS) that lifts the pneumatic hardware (backpack) and the PA suit, as shown in Fig. 1. VR developers can use the PA suit for training, task instruction, and improving the VR experience. Our user studies demonstrated the feasibility and usefulness of the PA suit in various research fields.

Fig. 1
figure 1

VR simulator with the proposed PA suit, pneumatic hardware, and the weight support system (WSS)

Fig. 2
figure 2

Configuration of the ABC and ABM

This paper presents our multimodal haptic suit, which creates various haptic patterns using pneumatic and acoustic actuators to provide powerful haptic feedback and improve the VR experience of users. First, in Sect. 2, we discuss prior haptic suits and methods for evaluating the user’s VR experience. Then, in Sect. 3, we present the PA suit, the weight support system, and the VR environment. In Sect. 4, we describe our pneumatic and acoustic haptic patterns. In Sect. 5, we describe the user studies that we conducted to evaluate the performance of the VR system and discuss their results. The user studies include recognizing haptic patterns, to establish the feasibility of instruction tasks, and guiding directions using multimodal actuators in VEs. Moreover, we focused on increasing user presence with the PA suit. In Sect. 6, we present our conclusions and directions for future work on VR simulation.

2 Related work

A haptic suit provides various haptic cues to a user and covers a wide area of the user’s body to create realistic haptic feedback in a VE. Some haptic suits reflect the location where a specific object collides with the user’s avatar in VR by operating an actuator at that location, so that the user knows which part of the body was hit (Lindeman et al. 2006; Günther et al. 2020b; Delazio et al. 2018). To achieve this function, researchers space multiple actuators at intervals across the wearable device.

Haptic devices are typically used for guiding direction (Smets et al. 2008), instruction tasks (Zhu et al. 2020), and improving the VR experience of users (Jain et al. 2016; Günther et al. 2020b; Delazio et al. 2018; Prada and Payandeh 2009). In direction guidance, users are stimulated by visual, auditory, and haptic cues. Haptic devices reduce the time needed to reach a target point because, unlike a haptic cue, which instantly provides direct information, visual information on a map takes time for a human to recognize (Raitor et al. 2017; Smets et al. 2008). Haptic feedback also has the benefit of providing instructions for specific tasks in extreme situations, such as terrorist attacks, natural disasters, and combat, in which it is difficult to convey information through visual or auditory devices. To date, commercial suits have focused on improving the VR experience of users by creating a realistic virtual world in VEs using advanced HMD technology.

The haptic transmission methods (Yang et al. 2021) used to develop commercial top-type haptic suits over the past 10 years have mainly been vibrotactile (TactSuit 2017), acousto-tactile (ARAIG 2013; KOR-FX 2014; Woojer 2017), and electrotactile (TESLASUIT 2018) devices for creating a virtual collision. These types of devices have advantages in price, weight, and ease of use. They are relatively small and light compared to haptic devices that adopt less common transmission methods, and they are easy to design around a flexible body structure. In the following subsections, we categorize related haptic devices by transmission method, describing the advantages and disadvantages of each device and the differences among prior haptic devices. We developed the PA suit based on an understanding of these differences, combining the advantages of different actuator types and reducing their constraints to enhance haptic feedback and user experience in VR.

2.1 Vibrotactile haptic device

Most haptic suits adopt vibrotactile transmission (McGregor et al. 2017; Schaack et al. 2019; Furukawa et al. 2019; Smets et al. 2008; Israr et al. 2014; Zhao et al. 2015; Konishi et al. 2018). Usually, these suits provide haptic feedback based on collision events. Researchers have adopted various vibration actuators, namely the Eccentric Rotating Mass (ERM) (Schaack et al. 2019), Shape Memory Alloy (SMA) (Foo et al. 2019), Linear Resonance Actuator (LRA) (Dementyev et al. 2020), piezo (Sauvet et al. 2017), and Electro Active Polymer (EAP) (Mun et al. 2018) actuators. Many researchers use inexpensive ERM actuators. However, most ERM actuators have a rigid body, which is disadvantageous for developing a wearable haptic suit with a flexible structure. Hence, researchers have considered other vibration actuators that have a flexible structure, are small, and are inexpensive. Moreover, vibration haptic feedback is familiar to users because such devices have been in use for a long time. However, the vibration strength of these actuators is limited by their size. When users collide with heavy objects in VR, they expect strong haptic feedback, but the feedback of vibration actuators is weaker than a real collision. Additionally, these haptic suits cannot present a collision with strong pressing and no vibration.

2.2 Electrotactile haptic device

An electrotactile device attaches conductive patches to the skin and transmits electric signals to human receptors (Shi et al. 2021; Franceschi et al. 2017; Germani et al. 2013). A compact haptic suit with electro-actuators can be developed easily, because the device is lighter and smaller than a pneumatic device, requiring minimal hardware: flexible patches, electric wires, a controller, a voltage converter, and a battery (heavy pneumo-devices are described in Sect. 2.4). The device provides electric stimuli at a low cost, and the stimuli can contract muscles to drive the behavior of a user. However, when developing an electro-haptic device, safety rules must be observed; for example, patches must not be attached near the head or heart, and a current-limiting function must be provided. Otherwise, electro-devices can shock the user’s heart, brain, and nerves, causing nerve pain and permanently damaging organs (Dalziel 1956). Owing to such high risks, the electrotactile haptic suit is not as widely used as other devices. Moreover, when these devices provide electro-haptic feedback to people who have previously experienced weak electric shocks, those users may feel discomfort owing to psychological fear (Seligman 1968).

2.3 Acoustotactile haptic device

The input of an acoustotactile device is the audio signal of the VE application (Ida 2014). These devices provide haptic feedback to users by converting the electrical waveform of an audio signal into physical vibrations. An audio transducer sends an amplified audio signal to a vibration actuator (Papetti et al. 2010) or an ultrasonic display device (Hirayama et al. 2019), which transforms the signal into physical forces through a low-frequency filter, similar to the vibration felt when you put your hand on a speaker. These devices are effective for presenting echoing sounds in simulations of situations such as a concert, a music hall, an earthquake, or a heavy explosion. Furthermore, the devices do not require a collision-type classification function; they only require connecting a stereo audio jack to the haptic device to provide feedback. However, because the input is an audio signal, the device provides feedback even when no object collides with the user. Consequently, asynchronous feedback between visual cues and haptic cues in the VE interrupts the user’s VR experience.

2.4 Pneumo-haptic device

A pneumo-haptic device can provide strong pressing and vibration stimuli to users using compressed air and air bladders. Pneumatic devices are deployed on the ground and connected to the air bladders of the suit via air tubes (Wu and Culbertson 2019; Günther et al. 2020b; Barreiros et al. 2018; Young et al. 2019). The sizes of the airbags vary, and the flexible air bladders cover the upper body of users. A Force Sensitive Resistor (FSR) sensor placed on the air bladders measures their pressure to control inflation and deflation. Pneumo-haptic feedback can present brief and strong collisions with low vibration frequencies, such as those of a ball, bullet, or fist. The devices can also transmit vibration through rapid inflation and deflation iterations (Delazio et al. 2018; Zhu et al. 2020). These suits have the potential to provide a better VR experience than other transmission methods in a VE. However, researchers have rarely adopted pneumatic devices for haptic suits because they consist of an air compressor, tank, pneumatic valves, tubes, and air bladders, which make them heavy, loud, and expensive. If such a system is supplied with high pressure to provide strong haptic feedback, it becomes difficult to control and requires pneumatic components that endure high pressure. This is a limitation of the suit, because people expect to experience VR with a haptic device on mobile platforms.

2.5 Presence evaluation

As stated by Witmer and Singer (1998), “presence is defined as the subjective experience of being in one place or environment, even when one is physically situated in another,” and “involvement and immersion are necessary for experiencing presence.” They suggested that researchers can evaluate the presence of users through related factors. The authors showed that presence was related to the Sensory Factor (SF) and the Realism Factor (RF) using the Presence Questionnaire (PQ). The SF includes the sub-concepts of multimodal presentation, environmental richness, and consistency of multimodal information. If a haptic device provides haptic feedback through multiple human senses (multimodal presentation), expresses a virtual collision on many mechanoreceptors of the body (environmental richness), and the various cues (visual, auditory, haptic, etc.) of the expressed virtual collision are consistent (consistency of multimodal information), then the device can increase presence. The RF includes the sub-concepts of scene realism, separation anxiety/disorientation, and meaningfulness of the experience. For example, if a soccer VR simulation provides excellent graphics to a user with an advanced device (high texture quality, resolution, field of view, etc.), isolates the user from external cues (noise, light, etc.), and the user likes soccer, the user will most likely give a high RF score.

In a subsequent paper (Witmer et al. 2005), the authors showed a high correlation among several factors, namely involvement, immersion, and sensory fidelity. When a haptic device helps to increase involvement, it may also increase immersion and sensory fidelity. As presence is related to these factors, they are important for advanced VR experiences. Researchers have tried to find the correlation between a haptic device and presence in order to prove the usefulness of the device. They have evaluated suits with different, modified PQ forms, because the best method for measuring presence in systems that incorporate various structures of haptic devices and VEs has not been established. However, the PQ forms include the involvement factor in many studies (Barreiros et al. 2018; Kang et al. 2019; Jain et al. 2016; Günther et al. 2020b, a). Therefore, the involvement factor is used in this study to evaluate the developed PA suit system.

3 Proposed PA suit system

The hardware of the PA suit combines pneumo- and acousto-haptic devices to provide the multimodal sense that enhances user experience in VEs, as described in Sect. 2.5. To combine the advantages of the two haptic devices, we used multiple silicone air bladders and vibration motors as haptic actuators, located on the chest area of the suit because projectiles usually strike the chest of the VR avatar.

The main devices of the PA suit include the air tank, solenoid valves, embedded devices, a battery, and an Ultra Mobile PC (UMPC). First, we aimed to develop a portable haptic suit; therefore, we chose small components and placed the devices in the backpack, as shown in Fig. 11. Although the backpack is heavy (8.7 kg), it is still portable. After achieving this development goal, we aimed to improve the VR experience of users without the weight penalty of the backpack. Even if the device is made lighter in the future, the primary purpose will not be served unless the PA suit enhances the VR experience of users. Hence, we used the WSS to lift the backpack in the VR user experience study.

The software of the suit can be divided into two parts. The first part, the pneumatic device monitoring program, runs on the UMPC; it controls the hardware of the suit and handles haptic patterns. The second part is the VR simulation, which displays the 3D environment of the user studies and an avatar of the user in real time. This program also connects to the UMPC of the PA suit, the WSS, the VR devices, and the motion tracking cameras. The detailed system description is given below.

3.1 Hardware of the PA suit

3.1.1 Air bladder design

Our goal was to make air bladders that simulate various virtual collisions on the human chest with high sensitivity, in order to provide realistic haptic stimuli to users. To achieve this goal, we made a small air bladder called the Air Bladder Cell (ABC) (Fig. 2a, b) based on the spatial discrimination threshold, that is, the ability to discern two nearby touch points on the skin. The ABC has a cubical bellows structure that transmits powerful forces to the skin directly and is easy to fold and inflate vertically. Its size is \(30 \times 30 \times 10~\hbox {mm}\), matching the spatial discrimination threshold of 30 mm on the chest of a human body (Schacher et al. 2011). We modularize the ABCs to render virtual collisions: the Air Bladder Module (ABM) (Fig. 2d–f) consists of five ABCs and one Bottom Frame (BF), which prevents the pneumatic pressure of the ABCs from diffusing.

To make the ABC, ABM, and BF, we used silicone (Dragon Skin 20, with a Shore hardness of 20A) and plastic molds, as shown in Fig. 3. We designed the ABC mold with a flank-side thickness of 2–3 mm and a top-side thickness of 1 mm. Before pouring silicone into the BF mold, we inserted Ø4 mm silicone air tubes into its slots to secure the air path of the ABC. After the silicone parts had cured, we punched holes in the top side of the BF for the air path and then combined the ABC and BF to make the ABM. Multiple ABCs can form various types of ABM, depending on the combination method, for each part of the human body. The ABM was designed with flexibility in mind so as to conform to the curved surface of a male chest, as shown in Fig. 2e. We grouped four ABMs consisting of twenty ABCs (Fig. 2c) and used two such groups to cover the left and right areas of the chest for high-resolution haptic feedback. We aligned all air tubes laterally to connect the solenoid valves without folding the tubes.

Fig. 3
figure 3

Configuration of the Air Bladder Cell (ABC) and Bottom Frame (BF) molds. The ABC mold has two types of parts, side and core; we combined two side parts and one core part to make the ABC. All dimensions are in millimeters

3.1.2 Pneumatic actuation

The pneumatic and embedded parts are housed in the suit backpack, as shown in Fig. 4. The small air tank receives high-pressure air from an external air compressor, which provides a pressure of 600–800 kPa through the regulator. The air tank is connected to the inflow valves by an Ø8 mm polyurethane (PU) air tube. A microcontroller (MCU, ATmega2560) (https://www.microchip.com/en-us/product/ATMEGA2560) controls the ON/OFF actuation of 40 pairs of inflow and outflow solenoid valves (SMC V100, 24 V, 0.1–0.4 W) (https://www.smcworld.com/newproducts/en-jp/21/v100/), which have a minimum response time of 5 ms to open. Pneumatic pressure sensors and ABCs are connected by 4-way pneumatic fittings between the inflow and outflow valves with Ø4 mm PU air tubes. The sensors are used only to measure the pneumatic pressure of the ABCs. The analog sensor data are converted to digital data using an analog-to-digital converter integrated circuit (MCP3008) (https://www.microchip.com/en-us/product/MCP3008); the microcontroller receives the data every 5 ms and sends them to a UMPC (LattePanda Alpha) via a LAN port for fast transmission. The embedded PCB also comprises shift-register integrated circuits (74HC595) (https://www.alldatasheet.com/datasheet-pdf/pdf/1388053/SS/74HC595.html) and high-voltage power Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFET, IRFZ44N) (https://www.infineon.com/cms/en/product/power/mosfet/n-channel/irfz44n/) to send multiple signals to the valves.
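To illustrate this signal path, the following Python sketch shows how 80 valve states could be packed into bytes for daisy-chained 74HC595 shift registers and how a raw 10-bit MCP3008 reading could be scaled to a pressure value. The actual firmware runs on the ATmega2560 in C; the function names, bit order, and the linear sensor scaling here are our own assumptions for illustration.

```python
def pack_valve_states(states):
    """Pack a list of 80 valve ON/OFF booleans into 10 bytes for
    daisy-chained 74HC595 shift registers (MSB first per register)."""
    if len(states) % 8 != 0:
        raise ValueError("state count must be a multiple of 8")
    out = bytearray()
    for i in range(0, len(states), 8):
        byte = 0
        for bit in states[i:i + 8]:
            byte = (byte << 1) | (1 if bit else 0)
        out.append(byte)
    return bytes(out)

def adc_to_kpa(raw, vref=5.0, sensitivity_kpa_per_v=200.0):
    """Convert a 10-bit MCP3008 reading (0-1023) to a pressure value,
    assuming a hypothetical linear sensor characteristic."""
    volts = raw * vref / 1023.0
    return volts * sensitivity_kpa_per_v
```

For example, opening only the first eight valves yields a first byte of 0xFF followed by nine zero bytes, which could then be clocked out to the shift-register chain.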

Fig. 4
figure 4

Schematic of the pneumatic hardware

We measured the inflation height and force of the ABC to characterize the pneumatic actuator, and we compared the force of the virtual collision with the force of a real ball collision. The measurements used digital vernier calipers (Fig. 5b) and a force gauge (DST-500N) (https://www.forcegauge.net/en/product/detail/dst010) with a measurement resolution of 0.1 N and a range of 0–500 N, in the setups of Figs. 5a and 6. The ABC was sealed and filled with air so that there was no gap between it and the force gauge. During the measurement, the pneumatic system had a supply pressure of 800 kPa, and the Ø4 mm PU air tube connecting the inflow valve and the ABC was 1275 mm long.

Fig. 5
figure 5

ABC test environment

Fig. 6
figure 6

Setup to measure a collision force of the real ball

When the supply pressure was 800 kPa, the pneumatic system showed an inflation height of 8–19 mm (Fig. 2g) and a force of 9.3–51.2 N depending on inflow times of 10–150 ms, as shown in Fig. 7. Additionally, we examined whether the ABC can reproduce the force of a real ball collision using the setup of Fig. 6. In the experiment, we used a standard size 5 soccer ball with a diameter of 220 mm and a weight of 0.4 kg, hung on a 1.6 m wire. When the drop height of the ball was 0.15 m, we measured a maximum force of 40.4 N and a collision time of 60 ms. The ABC provided the maximum force of the ball with an inflow time of 115 ms.
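The pendulum measurement can be sanity-checked with basic mechanics. The sketch below uses the impulse-momentum theorem; the restitution parameter is our own simplifying assumption, not a measured quantity.

```python
import math

def impact_speed(drop_height_m, g=9.81):
    """Speed of the pendulum ball at the bottom of its swing
    (energy conservation: m*g*h = 0.5*m*v**2)."""
    return math.sqrt(2 * g * drop_height_m)

def mean_collision_force(mass_kg, speed_ms, contact_time_s, restitution=0.0):
    """Mean force from the impulse-momentum theorem.
    restitution=0 assumes the ball stops dead; 1 assumes a full rebound."""
    impulse = mass_kg * speed_ms * (1 + restitution)
    return impulse / contact_time_s
```

With the paper's values (0.4 kg ball, 0.15 m drop, 60 ms contact), the ball arrives at about 1.72 m/s and the mean force is roughly 11 N if the ball stops dead, rising toward 23 N for a full rebound. The measured peak of 40.4 N is consistent in order of magnitude, since the peak of a short force pulse exceeds its mean.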

Fig. 7
figure 7

The measured force of the ABC versus inflow time. The force of 40 N at an inflow time of 115 ms was similar to the maximum collision force (40.4 N) of the real ball in the setup. However, the real ball’s collision time of 60 ms was shorter than 115 ms

We had hoped that the PA suit would recreate the collision of the ball. However, it proved difficult to deliver the force of the real ball within 60 ms with our system. Moreover, although we had considered the supply pressure in the early development stage, we found it difficult to develop a system that endures high pneumatic pressure within the given time. To show that people deem the artificial and real collisions similar even though their durations differ slightly, we conducted a comparison of real and virtual ball collisions in a user study with the equipment specified in Sect. 5.2.

3.1.3 Control algorithm of pneumatic actuation

The control of pneumatic actuation is divided into four states, inflow, hold, outflow, and wait, as shown in Fig. 8. The pneumatic device expands and deflates the ABCs by cycling through these states, which we describe in this section.

Fig. 8
figure 8

Control diagram of pneumatic actuation

  • Inflow state The inflow solenoid valves are opened, compressed air flows through the air supply lines, and the air bladders expand during the inflow time. The solenoid valves have a minimum operation time due to their mechanical characteristics and the processing time of the embedded devices; hence, we determined the inflow times of the haptic patterns considering these hardware limitations.

  • Hold state If the haptic pattern includes a hold time, the inflow and outflow solenoid valves are closed to maintain the pneumatic pressure of the air bladders until the outflow state is triggered.

  • Outflow state The inflow solenoid valves are closed and the outflow solenoid valves are opened. As the pressure of the inflated ABC is higher than atmospheric pressure, and the human body pushes against the ABC, the pressure of the ABC drops in this state.

  • Wait state The inflow and outflow valves are closed until the controller receives the next pattern command, or during the offset time of the haptic pattern.

If a pattern needs a hold state to maintain the pressure of the air bladder, the haptic pattern can have hold and offset times. If the pattern has a hold state, the control flow is wait, inflow, hold, and outflow; if not, it is wait, inflow, and outflow. In this paper, we use the hold state only to test the hardware performance of the ABC. We adopted an open-loop control method in the control algorithm of the PA suit. This method precisely controls airflow with a control cycle of 5 ms, and its reaction speed is faster than that of a closed-loop control method using Pulse Width Modulation (PWM) based on pressure-sensor data. To achieve natural and powerful force transmission, we decided to use the open-loop control method, without the sensor data, in the experiments.
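The four-state cycle can be sketched as an open-loop tick generator that mirrors the 5 ms control cycle. The class and method names below are illustrative, not taken from the actual firmware.

```python
import enum

class State(enum.Enum):
    WAIT = 0
    INFLOW = 1
    HOLD = 2
    OUTFLOW = 3

TICK_MS = 5  # open-loop control cycle used by the suit

class BladderController:
    """Open-loop valve sequencing for one ABC: wait -> inflow -> (hold) -> outflow."""

    def __init__(self, inflow_ms, outflow_ms, hold_ms=0, offset_ms=0):
        self.plan = [(State.WAIT, offset_ms), (State.INFLOW, inflow_ms)]
        if hold_ms:  # the hold state is optional, as in the paper
            self.plan.append((State.HOLD, hold_ms))
        self.plan.append((State.OUTFLOW, outflow_ms))

    def valve_schedule(self):
        """Yield (inflow_open, outflow_open) once per 5 ms control tick."""
        for state, duration in self.plan:
            for _ in range(duration // TICK_MS):
                yield (state is State.INFLOW, state is State.OUTFLOW)
```

A pattern with a 5 ms offset, 10 ms inflow, and 10 ms outflow thus produces one wait tick, two inflow ticks, and two outflow ticks, with both valves closed during wait and hold.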

3.1.4 Acoustic actuation

The PA suit provides not only pneumatic but also acoustic haptic feedback to users using the stereo audio signal of the HMD. Users can receive both types of haptic feedback at the same time to enhance their VR experience. The HMD communicates with the VR simulation via a wireless adapter and receives the audio signal in real time. After disassembling the electric line of the HMD, we connected its audio terminal to a stereo audio amplifier (SZH-EK373, \(50 \hbox {W}\times 2\) CH) (https://www.devicemart.co.kr/goods/view?no=1386028). The audio signal passes through the amplifier to generate sufficient power to actuate the vibration motors (SZH-GNP131, 12 V, 60 mA, \(45 \times 58 \times 30 \hbox { mm}\), 48 g) (https://www.devicemart.co.kr/goods/view?no=1329747) in the outer chest pockets. The eight motors are attached at distances of 32–56 mm in each pocket, as shown in Figs. 9 and 10. The motors transmit vibration to the user’s chest through the ABM, as the ABM and the motors are pressed together by inner banding straps (the holders). The vibration direction is vertical, from the outer pocket toward the chest.

Fig. 9
figure 9

Setup of the vibration motors in the PA suit. The holders fixed the position of the motors

Fig. 10
figure 10

Proposed PA suit

The system actuates the motors depending on the left and right audio channels, which can be visualized by the front LEDs of the suit, as shown in the demonstration video. Users can feel the vibration of stereo audio such as explosion and crash noise, which has an audible frequency range of 20 Hz–16 kHz and high-intensity vibration, as shown in Fig. 16.
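In the suit, the amplifier drives the motors directly in the analog domain, but the per-channel behavior can be approximated digitally. A hedged sketch, with an assumed RMS window and activation threshold of our own choosing:

```python
import math

def channel_rms(samples):
    """RMS amplitude of one audio channel window (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def motor_drive(left, right, threshold=0.05):
    """Map windowed stereo RMS to left/right motor activation (0.0-1.0).
    The threshold is a hypothetical noise gate; the real suit has no
    digital stage and drives the motors through an analog amplifier."""
    l, r = channel_rms(left), channel_rms(right)
    return (l if l > threshold else 0.0, r if r > threshold else 0.0)
```

The same per-channel amplitude could also drive the front LEDs, which indicate left/right activity in the demonstration video.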

3.1.5 Suit clothing and backpack design

We considered the convenience of the user while wearing the PA suit, because the pneumatic devices are bulky and need multiple air tubes. The suit has two inner ABM pockets and two outer motor pockets in the chest area, and the air tubes are aligned one behind the other. We also used flexible fabric (neoprene, 2T) for user convenience, as shown in Fig. 10. Rigid straps were sewn onto the sides of the suit to prevent it from losing elasticity under its own weight. The suit has two side straps for resizing the chest girth, because participants’ chest sizes differ (Fig. 10a). The suit was designed for a male chest girth of 90–100 cm; however, it can be enlarged to 100–116 cm using the attachments (Fig. 10b). A female user can wear the suit if her chest size is small, or the ABC structure can be redesigned with minor modifications.

To make the haptic device portable, we placed the pneumatic devices, lithium-ion polymer batteries, a portable PC, the embedded PCB, and an air tank in the backpack. We 3D-printed a separator from Poly Lactic Acid (PLA) filament and placed it between the devices in the backpack to arrange them, as shown in Fig. 11. The heaviest component, the air tank, is positioned at the bottom of the backpack to lower the center of gravity, which improves stability. The embedded PCB, sensor PCB, and valve box are placed at the upper level. To reduce the size of the valve box, we used small solenoid valves (\(28.5 \times 15.5 \times 30.8 \hbox{ mm}\)), as described in Sect. 3.1.2, and small air tubes with a diameter of 4 mm. After development, we reduced the size of the system so that the device fits in the backpack and users can carry the backpack without the WSS. The total weight of the suit system is 8.7 kg, including the air tank and 80 valves. The backpack is suitable for a VR simulation time of 10–15 min but is too heavy for long sessions (above 90 min); hence, we mounted the backpack on the WSS, as shown in Fig. 1.

Fig. 11
figure 11

Schematic of the haptic backpack

3.1.6 Weight support system

The WSS was developed by the Korea Institute of Industrial Technology (KITECH) (https://eng.kitech.re.kr/main/) to reduce the weight of the backpack worn by the user. The control method of the system is similar to previous research on gravity compensation (Kang et al. 2019; Dungan et al. 2015). The WSS covers our VR area of \(5 \hbox {m}\times 5 \hbox {m}\), allowing a user to experience our VR simulation with less load within the area. The WSS uses an overhead crane system and a cable-driven system, as shown in Fig. 12. The overhead crane system consists of three motors: Motors 3 and 4 move along the X-axis, and Motor 2 moves along the Y-axis, to follow the position of a user who can move at up to 1 m/s. The cable-driven system consists of a drum winch, two encoders, and a load cell. The steel rope is wound on the winch and passes through ring holes linked to the encoders, which measure the slope of the rope. The slope is used for closed-loop position control of the overhead crane every 2 ms. The load cell measures the tension of the rope over a range of 0–200 kgf to offset the weight of the backpack. The Programmable Logic Controller (PLC) of the WSS, with Motor 1 on the rope drum, can move the rope along the Z-axis at up to 2 m/s and lift a maximum weight of 120 kg. The system also has safety functions with limit switches to prevent the user from falling.
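The crane-following behavior can be illustrated with a simple proportional controller that drives the measured rope slope toward zero, keeping the rope vertical above the user. The gain and velocity limit here are illustrative assumptions; the paper specifies only the 2 ms control cycle and the 1 m/s user speed.

```python
V_MAX = 1.0  # m/s, matching the maximum user walking speed in the paper

def crane_velocity_command(slope_x, slope_y, kp=2.0, v_max=V_MAX):
    """P-control sketch: command trolley velocities proportional to the
    rope slope so the overhead crane stays above the user. kp is a
    hypothetical gain, not from the actual PLC."""
    def clamp(v):
        return max(-v_max, min(v_max, v))
    return clamp(kp * slope_x), clamp(kp * slope_y)
```

In the real system, this loop would run every 2 ms on the PLC, with the slope derived from the ring-hole encoders.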

Fig. 12
figure 12

Configuration of the WSS

3.2 Software of the PA suit

3.2.1 PA suit manager program

The UMPC runs the PA suit manager program based on the Windows 10 OS and Visual Basic .NET (Microsoft 2018). The program, shown in Fig. 13, has various functions: mapping the valves to ABCs, controlling each valve manually, editing haptic patterns, playing a pattern on a collision event, and displaying the pressure of the ABCs. Each ABC is matched with a pair of valves to control the inflow and outflow of air using the mapping function. After mapping, we edited the parameters of the haptic pattern, namely the offset, inflow, hold, and outflow times, the repeat number, and the vibration mode, to create the patterns. These parameters were edited in units of 5 ms. The program can combine the created patterns into a sequence; for example, it can play the explosion pattern and then automatically play the cross pattern using the haptic sequence function.
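The parameter set above can be illustrated with a small data structure. This is our own naming, not the actual manager code: the per-ABC step carries the offset, inflow, hold, and outflow times, a repeat count, and a vibration-mode flag, and patterns can be concatenated into a sequence.

```python
# Sketch (assumed names) of a haptic pattern representation with the
# parameters listed in the text, edited in 5 ms units.
from dataclasses import dataclass

UNIT_MS = 5  # the manager program edits every parameter in 5 ms steps

@dataclass
class AbcStep:
    abc_id: int        # which air bag cell (0-39)
    offset: int        # delay before inflow starts, in ms
    inflow: int        # inflow-valve open time, in ms
    hold: int          # time both valves stay closed, in ms
    outflow: int       # outflow-valve open time, in ms
    repeat: int = 1    # how many times the step repeats
    vibration: bool = False

def quantize(t_ms: int) -> int:
    """Snap a time to the 5 ms editing unit used by the manager program."""
    return round(t_ms / UNIT_MS) * UNIT_MS

def make_sequence(*patterns):
    """Concatenate patterns so one plays after another (haptic sequence)."""
    seq = []
    for pattern in patterns:
        seq.extend(pattern)
    return seq

# e.g., an explosion pattern followed automatically by a cross pattern:
explosion = [AbcStep(i, 0, 120, 50, 100) for i in range(40)]
cross = [AbcStep(i, 100 * k, 100, 0, 100) for k, i in enumerate((3, 12, 21, 30))]
combo = make_sequence(explosion, cross)
```

The ABC indices and hold/outflow values in the example patterns are placeholders; only the parameter names and the 5 ms unit come from the text.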

Fig. 13

The PA suit manager program

The manager program has two operation modes: manual mode and auto mode. In manual mode, we can switch the valves ON/OFF, and the program can play the created haptic patterns at different operation coordinates. In auto mode, the manager connects to the server of the VR simulation via wireless TCP/IP communication, receives the collision information from the VR simulation, and plays the haptic pattern depending on the collided position and the type of VR avatar.

To provide stronger pressure for users, we added an initialization function to the program. During initialization, the program opens the outflow valves, and we tighten the side straps of the suit to fold the ABCs. Because the side strap is connected to the fabric above the ABCs, the fabric presses the ABCs vertically. Therefore, tightening the side strap when the ABC is inflated improves the outflow of the ABC. Then, to reduce the gap between the user’s body and the ABMs, as the chest of a human is not flat, the program can slightly inflate the ABCs with an inflow time of 5 ms. After the user has completed the initialization, the ABCs can recover to the inflation height of the initial state with sufficient outflow time owing to the elastic force of the ABC and the suit, as shown in Fig. 8.

3.2.2 VR simulation

We made the VR simulation with the commercial 3D engine software Unity (https://unity.com/) to conduct the user studies, as shown in Fig. 14. We simulated an empty forest, a swinging ball using the motion capture function, an air raid, fire, the shock wave of an explosion, and a dust wind for the user studies discussed in Sects. 5.2, 5.3, and 5.4. To increase the playability of the simulation without the PA suit, the participants interacted with the virtual ball and buttons. Moreover, they controlled the VR avatar, which is rendered via motion tracking. The simulation produces a 3D stereo sound that expresses the crash noise of a ball and an explosion for auditory cues. Additionally, we added wind and bird sounds in the simulation to mask disturbances caused by external noises from machines or the researchers’ voices. If the participant is hit by a ball on the chest or an explosion occurs in the simulation, the system provides an auditory cue of the collision and haptic cues within 12–17 ms, as described in Sect. 3.2.3. The participants view the scene through an HMD for a realistic simulation test. We used the VIVE PRO HMD (https://www.vive.com/us/product/vive-pro-full-kit/), which has a display resolution of \(1440\times 1600\) pixels per eye (\(2880\times 3200\) pixels combined), a refresh rate of 90 Hz, and a field of view of 110\(^\circ\). The physical VR area size is \(5~\hbox {m} \times 5~\hbox {m}\). We placed multiple infrared cameras, consisting of four VIVE base stations 2.0 (https://www.vive.com/eu/accessory/base-station2/) and eight OptiTrack PRIME 13W cameras (https://optitrack.com/cameras/prime-13w/), to track the hand controllers and feet of a user for realistic visual cues.

Fig. 14

3D environment of our VR simulation

The main computer, which consists of an NVIDIA 1080Ti graphics card (https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080-ti./) and an i7-8700K CPU (https://www.intel.co.kr/content/www/kr/ko/products/sku/126684/intel-core-i78700k-processor-12m-cache-up-to-4-70-ghz/specifications.html), executes the VR simulation and communicates wirelessly with the UMPC of the PA suit through the TCP/IP protocol. The computer is also connected to the embedded PLC of the WSS.

3.2.3 Collision command transmission process

It takes time to provide haptic feedback after the avatar of a user collides with a virtual object. If the delay is too long, users feel that the haptic feedback is unnatural. Hence, we measured the total delay time to define the limitations of the system. In this section, we show the delay time of the transmission process, which influences a user’s presence because of inconsistencies between the visual and haptic cues of a virtual collision. When the avatar of the user collides with a virtual object in our VR simulation, a collision event occurs. The event sends information about the collided object type and collision position to the UMPC via 2.4 GHz WiFi. When the UMPC receives this information, it creates the flag of the pattern command and proceeds with the sequence of the control diagram in Fig. 8. The connection has a 2–5 ms delay in the network system. The PA suit manager program of the UMPC parses the information and issues control commands according to the control sequence, as described in Sect. 3.1.3. The process is executed in multiple threads using a multimedia timer every 5 ms.
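The text states only that the collision event carries the object type and position over TCP/IP; the wire format below (newline-delimited JSON) is our assumption, sketched to show how the simulation side and the UMPC side of such a link could be paired.

```python
# Hedged sketch of the collision event message; the JSON framing is an
# assumption, not the actual protocol of the PA suit manager.
import json
import socket

def send_collision(sock: socket.socket, obj_type: str, x: int, y: int) -> None:
    """VR side: send one collision event (object type + collided position)."""
    msg = json.dumps({"type": obj_type, "pos": [x, y]}).encode() + b"\n"
    sock.sendall(msg)

def parse_collision(line: bytes):
    """UMPC side: recover the object type and collided coordinate, which
    the manager uses to raise the pattern-command flag."""
    data = json.loads(line)
    return data["type"], tuple(data["pos"])
```

For example, `parse_collision(b'{"type": "ball", "pos": [2, 2]}')` yields `("ball", (2, 2))`.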

After this process, command packets are sent to the MCU (Arduino Leonardo, https://docs.arduino.cc/hardware/leonardo), which is integrated with the PCB of the UMPC, via a serial port (230400 bps) using a Universal Asynchronous Receiver/Transmitter (UART). The MCU sends signals to the shift register IC via its I/O port to switch the valves and executes them with a timer interrupt every 1–2 ms. The valve receives an electric signal via the IC, which moves the inner core of the valve using electromagnetic force to open or close the air path; this requires 5 ms to switch ON and 4 ms to switch OFF. From the tests, we estimated the total processing time to be 12–17 ms. Most participants in our user study reported that they could not feel the delay.
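One way to reconcile the quoted 12–17 ms total with the per-stage delays in the text (network 2–5 ms, 5 ms manager timer period, 1–2 ms MCU timer, 4–5 ms valve switching) is to sum the stage minima and maxima. This decomposition is our reading of the numbers, not a measurement protocol from the paper.

```python
# Delay budget reconstructed from the per-stage figures in the text.
STAGES = {              # (min_ms, max_ms)
    "wifi_tcp":  (2, 5),   # network delay
    "manager":   (5, 5),   # multimedia timer period of the manager
    "mcu_timer": (1, 2),   # MCU timer interrupt
    "valve":     (4, 5),   # OFF: 4 ms, ON: 5 ms
}

total_min = sum(lo for lo, hi in STAGES.values())  # 12 ms
total_max = sum(hi for lo, hi in STAGES.values())  # 17 ms
```

The sums land exactly on the reported 12–17 ms range, which supports this reading of the pipeline.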

4 Haptic pattern

A haptic pattern expresses artificial sensations related to virtual collisions when an avatar of a user collides with an object in VR. Well-made haptic patterns increase the involvement factor and support the recognition of patterns used for guiding direction and task instruction. We created patterns of two kinds: pneumatic haptic patterns and acoustic haptic patterns.

4.1 Pneumatic haptic patterns

These patterns are expressed by the pneumatic devices. Basically, the patterns provide a feeling of pressing and relayed motion. We created various haptic patterns, including ball, fist, bullet, wave, rotation, cross, explosion, and indefinable patterns, using the PA suit manager program. Most of these patterns were designed based on estimated results or supposition. Each ABC of a pattern has two variables: the offset and inflow times. The patterns have three or more offset and inflow times to create variety. For example, if a haptic pattern has three offset times, the times are written in the format (a, b, c). The details of the haptic patterns are described here and in Fig. 15.

Fig. 15

Visualization of the haptic patterns. a One ABC as a square; a deeper red color indicates a longer inflow time. The number on an ABC indicates the inflation order (e.g., ABCs labeled 1 are actuated first with a short offset time). b Shapes of the patterns; red arrows show the direction of inflation

  • Ball We estimated the collision area of the soccer ball to be 576 \(\hbox {mm}^{2}\) with a falling height of 0.15 m; therefore, the pattern’s size is similar to the collision area. Nine ABCs are actuated with several offset times (0, 5, and 10 ms) to present the spreading force. The pattern has four pressure levels with several inflow times (Level 1: 30, 20, 10 ms; Level 2: 60, 50, 40 ms; Level 3: 90, 80, 70 ms; Level 4: 120, 110, 100 ms). The different levels of the pattern were used to compare collisions between the real and virtual ball, as shown in Sect. 5.2. The value of 120 ms was decided based on the result of the ABC actuating test described in Sect. 3.1.2.

  • Fist According to Liu et al. (2014), the peak impact force of a weak fist is 0.7–0.9 kN and occurs within 30–40 ms. This force is too strong to represent with a virtual haptic pattern in our system. Instead, we reduced the force to reproduce something like a weak fist collision. We applied an ABM area of 540 \(\hbox {mm}^{2}\) with 6 ABCs and inflow times of 60, 80, and 100 ms. The fist pattern has offset times of 0, 5, and 10 ms, as each knuckle of a hand collides with a slight delay in the collision.

  • Bullet In mass media, the pain of gunshot wounds has been discussed. The feeling of a gunshot wound has been described as similar to an electric shock, a burning sensation, a heavy push with a hammer, or a stab with a rusty knife; these feelings vary with the position of the collision, the nervous system, the force of the collision, the state of the person, and the shape and size of the bullet. Some people do not feel any pain owing to an increase in the amount of adrenaline in the body. Therefore, in our experiment, we arbitrarily define the pain associated with a bullet wound as similar to that experienced when stabbed; hence, the bullet pattern has a small size of 45 \(\hbox {mm}^{2}\). To express the spreading force, the middle ABC of the pattern has a maximum inflow time of 120 ms and approximately 4 surrounding ABCs have a low inflow time of 20 ms.

  • Side The pattern actuates the 9 ABCs with offset times (0, 100, and 200 ms) and inflow times (30, 50, and 100 ms). Users feel a haptic stimulus of the pattern on the left side. If a user has low fat in the upper rib section of their body, the pressing feeling of the ABCs experienced by them will be stronger than that experienced by the other users.

  • Wave The wave pattern is a relay pattern that actuates the ABCs one by one every 100 ms. The wave-in pattern actuates from the outer ABC to the inner ABC and the reverse pattern (wave-out) actuates from the inner ABC to the outer ABC along the marked direction of Fig. 15. The inflow time of the ABCs is 100 ms and offset times are 0–1900 ms.

  • Cross The cross pattern is a relay pattern that can be used in virtual combat simulations. In such simulations, a user engages an enemy with a long sword that usually hits the upper body of the user. The pattern inflates the ABCs one by one with the same inflow time of 100 ms and relayed offset times of 0, 100, 200, and 300 ms.

  • Check The check pattern inflates the ABCs from top to bottom and back to the top, as shown in Fig. 15. The pattern has offset times of 0, 100, 200, and 300 ms and inflow times of 30, 50, and 100 ms. We created the pattern for task instructions and recognition of special events in VR.

  • Rect-dot The rect-dot pattern inflates the ABCs with the inflow times of 30, 50 and 100 ms and offset times of 0, 100 and 200 ms. We created the pattern for task instructions.

  • Explosion The shock wave of an explosion is very strong. Most of us never experience its impact and can only imagine that it harshly pushes the whole body. Hence, we designed the pattern to actuate all 40 ABCs at the same time with the maximum inflow time of 120 ms. This pattern was used in the explosion simulation user study of Sect. 5.4.
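The per-ABC (offset, inflow) pairs above ultimately become inflow-valve open/close times. The sketch below expands the Level-1 ball pattern this way; the offsets (0, 5, 10 ms) and inflow times (30, 20, 10 ms) come from the text, while the 3×3 cell layout and ABC indices are illustrative assumptions.

```python
# Expanding a pattern's (offset, inflow) pairs into a valve timeline.
# Which ABC gets which offset is an assumed layout, not the authors' map.
def valve_events(cells):
    """cells: list of (abc_id, offset_ms, inflow_ms).
    Returns (abc_id, open_ms, close_ms) tuples sorted by opening time."""
    events = [(abc, off, off + inflow) for abc, off, inflow in cells]
    return sorted(events, key=lambda e: e[1])

# Center cell first and longest; ring cells follow with shorter inflow,
# producing the spreading-force effect described for the ball pattern.
ball_level1 = [(12, 0, 30)] \
            + [(abc, 5, 20) for abc in (11, 13, 7, 17)] \
            + [(abc, 10, 10) for abc in (6, 8, 16, 18)]
timeline = valve_events(ball_level1)
```

With this encoding, all nine valves close by 30 ms, matching the short, spreading impulse the Level-1 ball pattern is meant to convey.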

4.2 Acoustic haptic patterns

The patterns are expressed by the acoustic devices under specific environmental conditions. We used the patterns to provide multimodal haptic feedback. We converted the crash noise of a ball and the sound of an explosion into acoustic haptic patterns for the explosion and ball-receiving simulations. A user hears the sounds through the speaker of an HMD, and the PA suit provides acoustic haptic feedback to the user with the patterns. The sounds converted into haptic patterns are louder than the other sounds of the VR simulation, which were not converted, because the intensity of the surrounding sounds is weak. The waveform of the crash noise of a ball is shorter (0.15 s) than the audio of an explosion (8.43 s). We measured the converted physical vibration using the accelerometer of an iPhone 8 Plus (https://support.apple.com/kb/SP768?viewlocale=en_US &locale=en_US) on the ABM. To measure the acceleration with the sounds, we set the accelerometer on the ABMs, as the motors transmit the vibration to the user through the ABMs. The accelerations measured for the crash noise of a ball were in the 9.67–9.81 \(\hbox {m/s}^{2}\) range (Fig. 16a) and those for the explosion noise were in the 8.96–10.76 \(\hbox {m/s}^{2}\) range (Fig. 16b).
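A common way to turn an audio waveform into a motor-drive signal is a short-window RMS envelope. The stdlib-only sketch below is our illustration of that idea on a synthetic decaying tone; it is not the authors' conversion pipeline, and the sample rate and window length are arbitrary.

```python
# Illustrative audio-to-vibration conversion via an RMS envelope.
import math

def rms_envelope(samples, window):
    """RMS amplitude per non-overlapping window of `window` samples."""
    env = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        env.append(math.sqrt(sum(s * s for s in chunk) / window))
    return env

# A 0.15 s ball-crash-like burst at a 1 kHz sample rate: a decaying
# 200 Hz tone (the Pacinian-sensitive band mentioned in the Introduction).
wave = [math.exp(-20 * n / 1000) * math.sin(2 * math.pi * 200 * n / 1000)
        for n in range(150)]
envelope = rms_envelope(wave, 30)  # one drive value per 30 ms window
```

The envelope decays with the burst, so a motor driven by it would reproduce the short, sharp character of the ball crash rather than its raw waveform.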

Fig. 16

Waveform, spectrogram, and acceleration of sounds. a crash noise of a ball; b explosion audio

5 User study

The main objective of the user study is to evaluate the feasibility and usefulness of the PA suit. The feasibility of the suit is evaluated by the recognition test and by a user study comparing a real and a virtual ball. The usefulness is evaluated by user studies on a ball-receiving simulation and an explosion simulation. Participants (\(N = 25\), all males) aged 19–50 (\(M=22.71\), \(\text {SD}=7.07\)) with a chest girth of 87–105 cm (\(M=94\), \(\text {SD}=5.04\)) were recruited for this user study and received 34 dollars as compensation. Five of the participants had never played in VR simulations. The user studies took approximately 105–140 minutes, as shown in Table 1, excluding the 40 minutes used to explain the subject information sheet and consent form to participants. All statistical analyses were performed using Thin-R v8.01.03.05 (R Core Team 2017). We administered the Simulation Sickness Questionnaire (SSQ) three times (before the experiments, during the break, and after all experiments). Among the 25 participants, five reported mild sickness (total score: 1–5 points, \(M=0.44\), \(\text {SD}=1.1\)), which included general discomfort, fatigue, eye strain, and difficulty focusing on the user study.

Table 1 Time table

5.1 Recognition test

In the recognition test, we aim to show that our PA suit provides various kinds of haptic feedback and that participants can feel a detailed expression of the collision position in the VE. Moreover, if participants can distinguish various patterns, we can use the patterns for guiding direction and tasks in serious games.

5.1.1 Haptic pattern recognition

Method and procedure We used the haptic patterns described in Sect. 4.1 for the recognition user study. During the memorization step, the participants had to remember the haptic feedback of the patterns, pairing them with the 10 figures of Fig. 15. In this step, when the participants clicked the VR buttons, they were presented graphically with the shape, order, and name of each haptic pattern, as shown in Fig. 17, and the PA suit provided the haptic feedback for the clicked button. They clicked the buttons of the 10 patterns three times within 15–20 min. Then, we proceeded with a pre-test to ensure that the participants correctly understood the meaning of the numbers and colors of the haptic patterns of Fig. 15. During the answer step, when the participants clicked the next button, the VR simulation software automatically stimulated the participants with one of the 10 patterns. Subsequently, we asked the participants the name of the pattern they had been presented with, and they responded by clicking the button with the name of the pattern. The participants answered 30 such recognition questions, with each of the 10 haptic patterns activated three times in a random order. They were allowed to replay the haptic pattern before answering if needed, as they could lose concentration over time and fail to answer the questions correctly.

Fig. 17

In the haptic pattern recognition, participants click the buttons to learn the patterns and answer the questions

Result and discussion

We obtained the correct answer rate of the pattern recognition test from the participants’ answers: P0:Side \((M=86.67\%)\), P1:Check \((M=100\%)\), P2:Rect-dot \((M=66.67\%)\), P3:Wave-in \((M=100\%)\), P4:Cross2 \((M=98.67\%)\), P5:Ball \((M=73.33\%)\), P6:Bullet \((M=90.67\%)\), P7:Punch \((M=78.67\%)\), P8:Wave-out \((M=100\%)\), and P9:Cross1 \((M=100\%)\), as shown in Fig. 18. P0, P1, P3, P4, P6, P8, and P9 had very high correct answer rates, above 90%, but P2, P5, and P7 had lower rates than the other patterns. As a result, we decided that we could use the patterns with high correct answer rates for guiding direction and task instruction. We are confident that we can find more haptic patterns with high correct answer rates through an additional recognition test. After the test, we asked the participants who recorded low correct answer rates, “Why did you have a hard time distinguishing the patterns?”. They reported that the feelings of the patterns were similar even though the number of actuated ABCs differed, because the middle part of a human chest has a valley and the lower part of the chest makes it difficult to bring the ABCs close to the body with the straps of the PA suit.

Fig. 18

The correct answer rate of questions in the haptic pattern recognition

5.1.2 Haptic feedback accuracy

Method and procedure

To analyze the accuracy of the haptic feedback, the participants underwent memorization and answer steps. In the memorization step, the participants were asked to memorize the relationships between the stimulated position and the identification number of the ABC (0–39) for 10 min; if the participants clicked one of the buttons (see Fig. 20), the PA suit actuated one of the ABCs for 100 ms, since the buttons were paired with the ABCs. For example, if they clicked the button labelled ‘9’, the VR simulation software automatically actuated the ABC at (X:2, Y:2). In the answer step, one of the 40 ABCs was actuated to stimulate the participants. Subsequently, we asked the question: “Where is the position of the stimulus provided by the ABC? Please choose the identification number of the actuated position.” Each of the 40 ABCs was actuated three times in a random order, so we asked the question 120 times (40 × 3). The participants were allowed to replay the stimulus for each question, since they gradually lost concentration. After the user study, we translated the identification numbers of the ABCs into coordinates (X:1–8, Y:1–5) from the responses, and calculated the average position and the distance between the answered position and the actually actuated position, as shown in Fig. 19.
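The average distance and average position described above reduce to simple vector arithmetic. The sketch below reproduces the worked example of Fig. 19: three answers at distances \(\sqrt{2}\), 2, and 1 from the actuated cell give an average distance of about 1.47. The answer coordinates are illustrative choices that realize those distances.

```python
# Average distance and average answered position on the 8x5 ABC grid.
import math

def average_distance(actual, answers):
    """Mean Euclidean distance from the actuated cell to each answer."""
    return sum(math.dist(actual, a) for a in answers) / len(answers)

def average_position(answers):
    """Mean answered position (vector average of the answer coordinates)."""
    n = len(answers)
    return (sum(x for x, _ in answers) / n, sum(y for _, y in answers) / n)

actual = (2, 2)                       # ABC 9 at (X:2, Y:2)
answers = [(3, 3), (2, 4), (2, 1)]    # distances sqrt(2), 2, and 1
d = average_distance(actual, answers) # (sqrt(2) + 2 + 1) / 3, about 1.47
```

This matches the value quoted in Fig. 19 and is the computation behind the grid-wide mean of 0.73 reported below.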

Fig. 19

This figure shows an example of the calculation of the average position and distance when the number 9 ABC was actuated. The distances between the actuated and answered positions are \(\sqrt{2}\), 2, and 1, so the average distance is (\(\sqrt{2}\)+2+1)/3 = 1.47. The average position is calculated by adding all the vectors

Fig. 20

In the user study for the haptic feedback accuracy, participants clicked the buttons (0–39) to memorize the positions of the actuated ABCs and to answer the questions

Result and discussion

We calculated the average distances of all ABCs (\(M=0.73\), SD = 0.19) from the responses and present the average position of each ABC in Fig. 21. The result shows that the participants could distinguish the position of the actuated ABC within a distance of 1 (as defined in Fig. 19). This means that the participants confused the actuated ABC with its neighbors. This confusion is relevant to providing detailed haptic feedback; e.g., suppose a VR developer wants to provide a feeling of pressing related to a hand scratch for an entertainment VR program. If a user could completely distinguish the positions of sequentially actuated ABCs, the feeling would not resemble a single touch; in other words, the user might feel them as separate touches. Although the PA suit did not fully support such feedback, the result showed that it is helpful for achieving detailed haptic feedback beyond human discrimination, including the feeling of pressing along a moving hand. Furthermore, we observed in the user study that the participants tended to confuse the positions of the ABCs over time because of the compact deployment of the ABCs, fatigue, and short-term memory loss. However, they distinguished the positions of each ABC at the memorization step. Participants said that short-term memory affected their ability to remember the relationships between the stimulated position and the identification number of the actuated ABC. Usually, a user of a haptic suit plays engaging VR programs with various cues, unlike this user study where participants just clicked buttons and were then stimulated by an ABC. For this reason, when a user plays other engaging VR programs with the PA suit, distinguishing positions could be more difficult than it was for our participants.
Moreover, when designing haptic patterns using the PA suit, if recognizing haptic patterns is an important goal for VR developers, they can consider the average distance to avoid confusion, unlike when the aim is to provide detailed haptic feedback. This is useful for creating haptic patterns with the high correct answer rates of Sect. 5.1.1.

Fig. 21

Average position of each ABC from the responses of user study in Sect. 5.1.2

5.2 Collision comparison of real and virtual ball

We designed the PA suit to transmit strong pressure and detailed haptic feedback. We determined that a ball collision calls for strong feedback. Moreover, we measured the forces of a real ball collision and of the ABC to show that the PA suit can provide such force, as shown in Sect. 3.1.2. Thus, we conjectured that users would feel the pressure of a virtual ball collision to be similar to that of a real ball collision. To verify whether the suit can provide enhanced haptic feedback for ball collisions, we performed a user study comparing collisions with a real ball and a virtual ball.

Method and procedure

We set up a pendulum, suspended a ball from the bar using a thin rope, and dropped the ball from a height of 0.15–0.2 m, as shown in Fig. 6. The VE shown in Fig. 22 indicates the position of the soccer ball using motion tracking devices, so participants can estimate the collision timing of the real ball. The swing motion of the virtual ball was calculated using the physics engine of Unity 3D to present a virtual collision with the PA suit. When the participants collided with the ball, the PA suit actuated the ABCs with the ball haptic pattern, which has four pressure levels, as described in Sect. 4. After participants experienced a collision with a real ball, they experienced collisions with the virtual ball and then chose the level whose pressure was closest to that of the real ball collision. We also asked, “How similar were the chosen virtual collision and the real ball collision?” and “Do you think the virtual haptic feedback can replace the real ball collision even though the feedback differed from the real feeling?”, answered on a 7-point Likert scale (Likert 1932). The questions relate to the similarity and replaceability scores of the ball pattern.
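As a back-of-the-envelope check of the pendulum setup, a ball released from drop height \(h\) reaches the chest at \(v = \sqrt{2gh}\) (energy conservation, neglecting air drag and rope losses). For the quoted 0.15–0.2 m heights this gives roughly 1.7–2.0 m/s at impact. This is our calculation, not a figure reported by the authors.

```python
# Impact speed of the pendulum ball from the stated drop heights.
import math

G = 9.81  # gravitational acceleration, m/s^2

def impact_speed(drop_height_m: float) -> float:
    """Speed at the lowest point of the swing: v = sqrt(2 g h)."""
    return math.sqrt(2 * G * drop_height_m)

v_low = impact_speed(0.15)   # about 1.72 m/s
v_high = impact_speed(0.20)  # about 1.98 m/s
```

Notably, this is well below the 4 m/s ball speed used in the ball-receiving simulation of Sect. 5.3, which is a purely virtual launch.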

Fig. 22

The VE of the collision comparison

Result and discussion

The participants rated the average pressure level (\(M=3.08\), \(\text {SD}=1.00\)), similarity score (\(M=4.72\), \(\text {SD}=1.1\)), and replaceability score (\(M=5.52\), \(\text {SD}=0.87\)) of the haptic pattern. The average pressure level was lower than level 4 of the ball haptic pattern, which provides the maximum pressure of the PA suit. If the participants had felt that the virtual collision was weaker than the real ball collision, they would have reported an average pressure level of 4, but they did not. This result shows that the PA suit provides approximately the pressure of a real ball collision. However, the similarity scores indicate that the PA suit cannot provide fully realistic haptic feedback in this experiment. When we asked the participants the reason for their answers, they said that the inflow time of the ABCs was too long: the collision time (50 ms) of the real ball was shorter than the inflation time (100 ms) of the haptic pattern in the virtual ball collision. Additionally, they reported that the material of the ABCs was too soft; they expected the feeling of a rigid soccer ball, but the PA suit just inflates the ABCs. When we asked participants about replaceability, they gave positive responses, stating that we can use the ball pattern for ball game applications even though they felt a difference between the real and virtual collisions. Consequently, we inferred that the haptic pattern has a chance to enhance the VR experience of participants in a ball-receiving simulation. To prove this, we designed a chest ball-receiving simulation.

5.3 Chest receiving ball simulation

We already established that the haptic pattern is not similar to the touch of a real ball; however, users reported that it is possible to use the pattern for game content in the user study of Sect. 5.2. To prove the performance of the PA suit, we proceeded with the ball-receiving simulation.

Method and procedure

At first, participants were asked to stand in the middle of a virtual space, and then they moved to receive the virtual ball with their chest for two minutes, as shown in Fig. 23.

Fig. 23

Receiving ball simulation

In the simulation, the ball’s speed is 4 m/s, the launch angle about the Y-axis is \(\pm {11.5}^\circ -{23}^\circ\), the distance between the ball launcher and the user is 4 m, and the participants moved within 1 m along the Y-axis. The participants received the ball on their chest. Upon receiving the ball, the PA suit provided synchronized haptic feedback representing the position and timing of the virtual collision. In the simulation, the haptic feedback using pneumatic actuators can be displayed at various positions; e.g., when the user is hit by the ball on the left chest, the user feels the haptic feedback at the same position. Additionally, users heard the sound of the ball collision, as described in Fig. 16. The audio signal actuated eight vibration motors.

To prove the performance of our multimodal haptic suit, each participant repeated the simulation four times under different conditions (None: without the PA suit, Vib: with acoustic actuators, Air: with pneumatic actuators, Both: with both actuators). We were concerned that the order of play would affect the VR experience of participants in the simulation and decrease the objectivity of the results because of the primacy effect (Holbert et al. 2007), which is the tendency to remember the first piece of information better than information presented later; e.g., people like to indulge in a new activity that they have not experienced before. When participants experience a new VR simulation, they may take it more seriously owing to their excitement, but this seriousness may diminish with time. Therefore, we conjectured that if a participant experiences our VR simulation without haptic feedback for the first time, they may give higher involvement scores when they later experience the simulation with haptic feedback than they gave the first time.

To confirm this conjecture, we split the participants into two groups (A: \(N=13\), B: \(N=12\)). The groups were presented with different orders of play of the conditions (order of play of A: \({\textit{None}}\rightarrow {\textit{Air}} \rightarrow {\textit{Vib}} \rightarrow {\textit{Both}}\), order of play of B: \({\textit{Both}}\rightarrow {\textit{Vib}} \rightarrow {\textit{Air}}\rightarrow {\textit{None}}\)). After experiencing the simulation, we asked two questions (Q1: How compelling was your sense of objects moving through space?, Q2: How involved were you in the VE experience?). The questions relate to the SF (Q1), RF (Q2), and involvement factor (Q1, Q2) of the PQ (Witmer and Singer 1998), as described in Sect. 2.5. The participants answered the questions on a 7-point Likert scale (1: Not at all, Not involved; 4: Moderately compelling, Mildly involved; 7: Very compelling, Completely engrossed). We conducted the user study under the same visual and auditory conditions, varying only the haptic cues.

Result and discussion

Participants reported the following involvement scores: None (Q1: \(M=4.16\), \(\text {SD}=1.43\), Q2: \(M=4.8\), \(\text {SD}=1.65\)), Vib (Q1: \(M=4.12\), \(\text {SD}=1.36\), Q2: \(M=4.48\), \(\text {SD}=1.26\)), Air (Q1: \(M=5.24\), \(\text {SD}=1.09\), Q2: \(M=5.44\), \(\text {SD}=0.96\)), and Both (Q1: \(M=5.16\), \(\text {SD}=1.34\), Q2: \(M=5.56\), \(\text {SD}=1.04\)), as shown in Fig. 25. The Both and Air conditions recorded higher scores than the None condition, but the Vib condition scored lower than the None condition despite the haptic feedback provided by the PA suit. We asked the participants, “Why do you feel a difference in VR experience between the \(\textit{Vib}\) and \(\textit{None}\) conditions?”. Some participants said that the vibration was unexpected haptic feedback and that short, strong feedback was more appropriate for a ball collision. The result shows that pneumatic haptic feedback increased involvement, but the increase with multimodal haptic feedback was lower than that with the single pneumatic haptic feedback in this simulation. We expected that for a collision type that needs strong pressure and vibration, like a heavy explosion, multimodal feedback would increase the evaluated scores. Therefore, we designed an explosion simulation, described in the next section.

Further, we analyzed the involvement scores by order-of-play group, as shown in Fig. 25. A Wilcoxon rank-sum test showed significant differences between the group scores in the None condition (Q1: \({p}=.01\)*, Q2: \({p}=.002\)**, *\({p}<.05\), **\({p}<.01\)) but not in the Vib condition (Q1: p = .75, Q2: p = .31), the Air condition (Q1: p = .17, Q2: p = .29), or the Both condition (Q1: p = .13, Q2: p = .79). The result shows that the primacy effect affected the responses to Q1 and Q2 in the None condition. The participants of group B first experienced the fun VR content in the Both, Air, and Vib conditions with haptic feedback; at the end of the user study, they experienced the VR content in the None condition without haptic feedback. At this point, group B participants experienced more boredom than group A participants in the None condition because they already knew the content of the fun VR. In the other conditions, the involvement scores were not affected by the order of play. These results support the objectivity of the evaluated scores.
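The Wilcoxon rank-sum (Mann-Whitney) comparison used above can be sketched with a stdlib-only normal approximation. The Likert responses below are made-up example data, as the per-participant scores are not published in the text; the function itself is a standard textbook formulation (mid-ranks for ties, no tie correction).

```python
# Two-sided Wilcoxon rank-sum test via the normal approximation.
import math

def rank_sum_p(a, b):
    """Approximate two-sided p-value for the rank-sum of group `a`
    against group `b` (mid-ranks for ties; fine as a sketch)."""
    pooled = sorted(a + b)

    def midrank(v):
        # 1-based mid-rank of value v in the pooled sample
        lo = pooled.index(v)
        hi = len(pooled) - pooled[::-1].index(v) - 1
        return (lo + hi) / 2 + 1

    n1, n2 = len(a), len(b)
    w = sum(midrank(v) for v in a)                 # rank sum of group a
    mu = n1 * (n1 + n2 + 1) / 2                    # its mean under H0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    # two-sided p from the standard normal CDF (via erf)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

group_a = [2, 3, 3, 4, 4, 5]   # hypothetical 7-point Likert scores
group_b = [5, 5, 6, 6, 7, 7]
p = rank_sum_p(group_a, group_b)   # small p -> the groups differ
```

For small Likert samples like those here, the exact test (as implemented in standard statistics packages) is preferable; the approximation only illustrates the mechanics.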

5.4 Explosion simulation

To increase the presence of multimodal haptic feedback in the PA suit, we searched for suitable collisions that involve both vibration and strong pressure. We assumed that an explosion would be more suitable than a ball collision for describing a real situation using the PA suit.

Method and procedure

To make the haptic pattern, we considered the sensory modality and consistency of multimodal information (Witmer and Singer 1998) in the user study. Recall that the pneumatic actuators provide proprioceptive cues and the acoustic actuators provide tactile cues; when well mixed, both cues can satisfy the modality and consistency requirements. We determined that the explosion transmits strong pressure and vibration for 8.43 seconds in our heavy shelling combat situation, since a heavy explosion creates a shock wave of air pressure and vibrates the ground. The vibration of the acoustic actuators expresses the environmental sound well when the explosion vibrates the ground. We used the explosion haptic pattern described in Sect. 4. To depict a realistic explosion, we added visual effects such as fire, dust, stone particles, and flashes, as shown in Fig. 24. In this simulation, participants simply stood and watched various objects such as airplanes and the light from missiles, and then heard an air raid alert for a few seconds. After that, the simulation played the explosion three times. The participants completed the same evaluation as in the ball-receiving simulation, and they reported that they did not suffer from Post-Traumatic Stress Disorder (PTSD) as a result.

Fig. 24
figure 24

Explosion simulation (Heavy explosion occurs three times with the fire, shock wave, dust, and stone fragments at night)

Result and discussion

Participants reported the involvement scores as follows: None (Q1: \(M=4.12\), \(\text {SD}=1.36\), Q2: \(M=4.4\), \(\text {SD}=1.6\)), Vib (Q1: \(M=5.36\), \(\text {SD}=0.95\), Q2: \(M=5.16\), \(\text {SD}=0.94\)), Air (Q1: \(M=5.56\), \(\text {SD}=1.0\), Q2: \(M=5.68\), \(\text {SD}=0.98\)), and Both (Q1: \(M=6.08\), \(\text {SD}=0.7\), Q2: \(M=5.96\), \(\text {SD}=0.84\)), as shown in Fig. 25. The results show that the multimodal feedback of the PA suit is very effective in the explosion situation. The involvement score of the Both condition, which is 1.56–1.96 points higher than that of the None condition, indicates that the multimodal haptic feedback of the PA suit engrossed the participants in the VR simulation. The score of the Both condition is also higher than those of the Vib and Air conditions, showing that participants feel more presence when both types of actuator are used instead of one.

Fig. 25
figure 25

Responses to the presence questionnaire on a 7-point Likert scale in the simulations: a, b receiving a ball; c, d explosion. Panels a and c present the results by user-study condition (None, Vib, Air, and Both) and question (Q1 and Q2). Panels b and d show the additional results by order-of-play group (A and B). Linked red dots indicate conditions in which the difference between groups A and B is large

Furthermore, we analyzed the involvement scores based on the order of play group, as shown in Fig. 25. A Wilcoxon rank-sum test compared the scores: the None condition (Q1: \({p}=.16\), Q2: \({p}=.008^{**}\)), the Vib condition (Q1: \({p}=.57\), Q2: \({p}=.43\)), the Air condition (Q1: \({p}=.93\), Q2: \({p}=.97\)), and the Both condition (Q1: \({p}=.97\), Q2: \({p}=.93\)). Only the Q2 response in the None condition showed a significant difference, indicating that the primacy effect affected that response. Except for Q2 of the None condition, the results are therefore objective with respect to the order of play.

5.5 User presence analysis based on immersive tendency

Method

We suspected that the involvement scores reported in our user studies were affected by the immersive tendency (IT) of the participants, because previous studies reported that IT can affect one's presence (Witmer and Singer 1998). We were concerned that the high scores were due to the IT of the participants rather than to the PA suit. To increase the objectivity of the results, before the experiments we gave the participants the Immersive Tendency Questionnaire (ITQ), which consists of focus, immersion, emotion, and game items. After the experiments, we asked about the presence of the entire system using the PQ, which consists of realism, possibility to act, quality of the interface, self-evaluation of performance, and sound items, with 22 questions in total. The ITQ and PQ were proposed by Witmer & Singer (revised by the UQO Cyberpsychology Lab, 2004) and use a 7-point Likert scale (Witmer et al. 2005). We divided the participants into three groups (A: \(N=9\), \(M=94.11\); B: \(N=10\), \(M=83\); C: \(N=7\), \(M=64.86\)) based on their ITQ scores and performed a statistical analysis using a Kruskal–Wallis test.
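The grouping-and-test step above can be sketched as follows. This is a minimal illustration using SciPy, assuming hypothetical presence scores; the group sizes mirror the study (A: 9, B: 10, C: 7 participants), but the individual values are invented.

```python
# Minimal sketch of the Kruskal-Wallis test across IT groups,
# assuming HYPOTHETICAL per-participant presence scores (7-point scale).
from scipy.stats import kruskal

group_a = [5.8, 5.5, 6.0, 5.2, 5.9, 5.6, 5.4, 5.7, 5.5]        # high ITQ
group_b = [5.6, 5.3, 5.9, 5.5, 5.8, 5.4, 5.6, 5.7, 5.2, 5.5]   # middle ITQ
group_c = [5.5, 5.7, 5.3, 5.6, 5.8, 5.4, 5.6]                  # low ITQ

h, p = kruskal(group_a, group_b, group_c)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p:.3f}")
# p > .05 would indicate no evidence that presence differs across IT groups,
# i.e., that the scores were not biased by immersive tendency.
```

The Kruskal–Wallis test is the nonparametric analogue of a one-way ANOVA, appropriate here because ordinal Likert responses need not be normally distributed.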

Table 2 p-values of Kruskal–Wallis test based on the IT group

Result and discussion

In the analysis, the presence scores of the receiving-ball and explosion simulations were not affected by the ITQ score (\({p}>.05\)), as shown in Table 2. The presence score of the entire system (\(M=5.6\), \(\text {SD}=0.57\)) shows that the participants had a fairly positive experience with the system, as shown in Fig. 27. To analyze the correlation between IT and the presence of the entire system, we drew the distribution chart shown in Fig. 26. The chart suggests no relationship between the two variables. Additionally, a Kruskal–Wallis test showed no significant differences between the presence scores of the groups (\({p}=.94\)). From these results, we confirmed that the presence scores were not biased by the IT of the participants in our user studies.

Fig. 26
figure 26

Presence score versus ITQ score. The presence scores were evaluated for the entire system

Fig. 27
figure 27

Presence questionnaire responses for our entire system. The figure shows the mean score of each item for each ITQ group (A: high ITQ score, B: middle ITQ score, C: low ITQ score). The total presence score was obtained by summing the scores of the other items

6 Conclusions and future work

To satisfy the consistency of multimodal information, we developed the PA suit with 40 small silicone air bladders and 8 vibration motors driven by an audio signal. Our multimodal haptic suit using pneumatic and acoustic actuators greatly increased the involvement of participants in the VR simulations. In particular, the explosion situation showed that proper multimodal haptic feedback from the PA suit provides high involvement to users. Additionally, the ball collision comparison showed that although the haptic feedback of the ball pattern does not feel like the touch of a real ball, the pressure of the pattern is similar to that of a real ball. Consequently, the participants suggested that developers can use the ball haptic pattern for immersive VR content. The result of the ball simulation shows that the pattern helps to increase involvement in our VE, but the increase in involvement score is lower than that of the explosion simulation. Participants attributed this to the soft material of the air bladder and the long collision time of the pattern. According to the participants in the additional user study, adding a 3D-printed ball frame and increasing the pressure made the suit provide superior haptic feedback. This suggests that if the surface shape and stiffness of the ABCs are controlled, superior user presence can be obtained even though the collision time differs between real and artificial collisions.

We reasoned that if participants have a high ITQ score, they will give high involvement scores in the simulations, which in turn would decrease the reliability of the simulation results. In our simulations, the involvement scores were not affected by the ITQ score; however, the order of play did affect them. The correlation between IT and the involvement scores of the participants in our user studies remains uncertain, and 25 participants are probably too few to establish it. We will recruit more participants and conduct further experiments to improve the reliability of our results.

As discussed in Sect. 5.1.1, we found a low correct answer rate for the patterns. We assume that the reason was that the ABCs of the suit did not sit close to the bodies of some participants because of their chest size and body shape. Moreover, the distribution of air pressure changed the pattern shapes. In future work, we intend to improve the correct answer rate by improving the structures and materials of the PA suit and by considering a wider range of users, human factors, and ergonomics.

Despite its limitations, VR developers can use the PA suit for training, task instruction, and VR experience enhancement. Our user studies demonstrated the potential and usefulness of the PA suit in various research fields. When a VR simulation needs multimodal information, the PA suit is effective because it stimulates users with distinguishable and consistent cues, delivering considerable pressure and realistic vibration over the entire chest. In future work, we will design and carry out user studies with new high-quality simulations, such as guiding direction, instructing tasks, and escaping from an explosion, using high-resolution multimodal haptic feedback to exploit the strengths of the PA suit based on the results of the blind and simulation user studies.