We first evaluated the influence of our device on immersion in a user study with four scenarios that require PropellerHand to simulate force and torque in varying amounts and directions.
We randomly recruited six people from our university’s campus to participate in the study (5 m, 1 f; 24–34 years). Five rated their VR experience as beginner and one as advanced. None of them had previous experience with kinesthetic haptic feedback.
The participants used an HTC Vive Pro head-mounted display (HMD) and its controller and wore earplugs during the study. For increased hygiene, we provided each participant with their own single-use rubber glove and sanitized the HMD after each use. We attached our device to the participant’s dominant hand (all were right-handed). The participants were standing and free to move in an area of about 4\(\times\)4 m. To reduce noise, and especially annoying frequencies, we limited the power through PWM to 28% of the maximum, which limited thrust to 5.1 N and torque to 0.87 Nm and resulted in a noise level of 90 dB.
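The power cap described above can be sketched as a simple clamp on the commanded feedback. This is a minimal illustration, not the paper’s implementation; only the 28% duty-cycle limit and the 5.1 N / 0.87 Nm figures come from the text, and the function name is hypothetical.

```python
MAX_DUTY = 0.28        # PWM duty-cycle cap used to reduce noise
MAX_THRUST_N = 5.1     # thrust available at the capped duty cycle
MAX_TORQUE_NM = 0.87   # torque available at the capped duty cycle

def clamp_command(thrust_n: float, torque_nm: float) -> tuple[float, float]:
    """Clamp a requested thrust/torque pair to what the capped duty cycle allows."""
    t = max(-MAX_THRUST_N, min(MAX_THRUST_N, thrust_n))
    q = max(-MAX_TORQUE_NM, min(MAX_TORQUE_NM, torque_nm))
    return t, q
```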
We proceeded as follows: After the participant signed a consent form, we gave a brief introduction to the safety features and study procedure. Each participant experienced four scenarios, each of them first without haptic feedback and then with PropellerHand. In each of the scenarios, the participants had to complete a different task, as described in detail in the following subsection. The participants were allowed to familiarize themselves with the virtual environment for as long as they wished, but for at least 30 s. After each scenario, the participants answered a questionnaire that included 7-point Likert scale questions (1: very strongly disagree–7: very strongly agree). The questions asked to what extent PropellerHand increased the immersion in VR. In this paper, we use the definition of immersion from Witmer et al. (2005), who define immersion as “a psychological state characterized by perceiving oneself to be enveloped by, included in, and interacting with an environment that provides a continuous stream of stimuli and experiences”. We explained this definition to our participants and subsequently asked them to rate their perceived level of immersion. After all four scenarios, a concluding questionnaire inquired about general feedback for PropellerHand, possible improvements, and which scenario the participants preferred and why.
We created four scenarios that allowed us to evaluate our device with different types of user-object interactions. To provide a more comfortable environment, we situated these scenarios in a virtual room and a grove, which we retrieved from the Unity Asset Store (Unity Asset Store 2020) (Fig. 5). The scenarios require PropellerHand to simulate force and torque separately in the first three scenarios and to switch between them within the last one. This differentiates our scenarios from those used in related work, which only tested either force or torque (Je et al. 2018, 2019; Heo et al. 2018).
S1: moving objects with different weights In order to investigate the simulation of physical weight, we asked participants to sort five identical-looking pieces of cheese by weight into boxes (Fig. 5a). We assigned each piece a different weight that PropellerHand then simulated by varying the produced thrust. Once a user grabbed and lifted a piece, PropellerHand oriented its propeller cages such that the produced thrust was directed downwards (airflow upwards), using the HTC controller’s pitch value.
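The weight simulation above can be sketched as follows. This is an illustrative approximation under assumed names and conventions (the paper only states that the cage orientation uses the controller’s pitch and that thrust varies with the object’s weight); the lever of the actual control loop is not described.

```python
def cage_angle_for_downward_thrust(hand_pitch_deg: float) -> float:
    """Counter-rotate the cages by the hand's pitch so the thrust
    vector stays pointing down in world space (simplified, pitch only)."""
    return -hand_pitch_deg

def thrust_for_weight(mass_kg: float, g: float = 9.81,
                      max_thrust_n: float = 5.1) -> float:
    """Thrust (N) simulating an object's weight, clamped to the device limit."""
    return min(mass_kg * g, max_thrust_n)
```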
S2: daggers producing different torques In this scenario, participants grabbed identical-looking daggers at their grip and held them horizontally, with the blade pointing to their left (Fig. 5b). Each blade had a different weight and therefore produced a different torque on the participant’s hand. Here, PropellerHand oriented one propeller’s thrust downwards and the other one’s upwards, creating torque around the arm. As before, the participants sorted the daggers by weight and placed them in empty boxes.
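With one propeller pushing up and the other pushing down, each contributes \(F \cdot d\) of torque about the arm, so the pair produces \(2Fd\). The sketch below inverts this relation to obtain the per-propeller thrust for a target torque; the lever arm value is an assumption for illustration, not a measurement from the paper.

```python
def per_propeller_thrust(target_torque_nm: float, lever_arm_m: float) -> float:
    """Thrust (N) each propeller must produce so that the opposed pair
    generates the target torque: tau = 2 * F * d  =>  F = tau / (2 * d)."""
    return target_torque_nm / (2.0 * lever_arm_m)
```

For example, with an assumed lever arm of 0.1 m, the maximum torque of 0.87 Nm requires about 4.35 N per propeller, which is within the 5.1 N thrust limit.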
S3: catching falling items To also test torque in another direction, this scenario employs items that fall from the sky and have to be caught by the participant (Fig. 5c). For this task, we provided a virtual catching device that resembles two pans glued together at their handles. Depending on the item’s weight, the torque differed slightly, allowing the user to perceive collisions with different object weights. We implemented three falling objects with different weights: a block of cheese, a piece of meat, and an onion. The catching device itself was not assigned a weight, to reduce strain and allow us to measure the effect more clearly.
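One plausible way to render such an impact (the paper does not describe its exact model) is to convert the falling object’s momentum into a short thrust pulse. The function name, pulse duration, and the impulse model itself are assumptions for illustration only; in practice the result would still be clamped to the device’s thrust limit.

```python
def impact_thrust(mass_kg: float, fall_height_m: float,
                  pulse_s: float = 0.1, g: float = 9.81) -> float:
    """Approximate an impact as a thrust pulse: the object's impulse m*v
    (v from free fall) is spread over a short pulse of length pulse_s."""
    v = (2.0 * g * fall_height_m) ** 0.5   # speed at impact
    return mass_kg * v / pulse_s           # average force over the pulse
```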
S4: multiple desk interactions In this scenario, we tasked participants with unfastening three screws and placing them in three drawers (Fig. 5d). This motion required PropellerHand to quickly switch between producing force and torque. When the participants grabbed and twisted a screw, the propellers were rotated such that they produced a torque simulating the screw’s friction. After removing the screw, PropellerHand simulated its weight by orienting the thrust downwards. Next, the participants had to place each screw in one of the drawers. We simulated the drawers’ resistance during opening and closing by orienting the thrust against the direction of movement. When a drawer was completely opened, increased thrust conveyed that it was stopped from moving further.
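The mode switching in this scenario can be summarized as a mapping from interaction state to a feedback mode. The state names and the dispatch table are hypothetical; only the pairing of interactions with force vs. torque feedback comes from the description above.

```python
from enum import Enum

class Mode(Enum):
    TORQUE = "torque"   # propellers opposed: one thrusts up, one down
    FORCE = "force"     # both cages aligned, thrust in one direction

def feedback_for(state: str) -> tuple[Mode, str]:
    """Map a (hypothetical) interaction state to the feedback it triggers."""
    table = {
        "twisting_screw":  (Mode.TORQUE, "simulate screw friction"),
        "carrying_screw":  (Mode.FORCE,  "thrust downwards (weight)"),
        "moving_drawer":   (Mode.FORCE,  "thrust against movement"),
        "drawer_at_limit": (Mode.FORCE,  "increased thrust (hard stop)"),
    }
    return table[state]
```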
Overall, participants enjoyed using our device. They strongly agreed that PropellerHand increased the immersion in VR (mean (M) of 7-point Likert scale: 5.9).
S1: moving objects with different weights We asked the participants whether they felt the weight of the objects and whether they perceived these weights to be different. All of them described the weight perception as very distinct (M: 5.8, SD: 0.8) and easy to distinguish (M: 6, SD: 0.9), see Fig. 6. One participant reported that he noticed louder noise for higher weight values and that the “lag between picking up things and the fans turning on feels odd”. All agreed that PropellerHand increased the immersion of the virtual environment (M: 6, SD: 0.9). Everyone sorted the pieces correctly.
S2: daggers producing different torques In this scenario, we asked the participants about their perception of torque. All participants stated that they could feel the torque when grabbing the daggers (M: 6.5, SD: 0.5) and a torque difference between the different blades (M: 6.5, SD: 0.5), see Fig. 6. One participant mentioned that some torques did not fit the daggers because they were visually identical, but felt different. However, this was necessary to avoid visual biases from influencing the results. As in the first scenario, all participants reported feeling more immersed when using PropellerHand (M: 6.5, SD: 0.8), with one participant mentioning that the “torque was captured well”. Again, all objects were sorted correctly.
S3: catching falling items After they finished the third scenario, we asked participants whether they were able to feel the impact of falling objects as well as a difference depending on the object’s type. All participants reported feeling a clear difference between the strengths of the impacts (M: 6, SD: 1.3), see Fig. 6. All participants clearly perceived the impact of falling objects (M: 5.7, SD: 0.8), but three mentioned a noticeable delay between seeing and feeling it. Due to this delay, participants reported a smaller increase in immersion than in S1 and S2 (M: 5.5, SD: 1.4). Furthermore, it seems that fast hand movements make it harder to perceive impacts.
S4: multiple desk interactions In this scenario, the participants were asked whether they perceived forces with different strengths and directions depending on the object they interacted with. They strongly agreed that they felt different strengths (M: 6.2, SD: 0.8) and directions (M: 6, SD: 1.3), see Fig. 6. While interacting with the drawers, one participant said “oh that is cool” when he felt the drawer’s collision with the stopper. Others described the torques as easier to feel than the forces and stated that there is “only a slight delay in the haptic feedback when pulling out a drawer to the limit”. All participants told us they enjoyed this scenario and agreed that PropellerHand made them feel more immersed in the virtual environment (M: 5.7, SD: 1.5).
Concluding questionnaire and summary After participants finished all the scenarios, we asked them to answer a final questionnaire with general questions about PropellerHand. The questions included which use case they preferred, whether the noise or airflow was disturbing, and whether PropellerHand was too heavy. We also asked them for general feedback about PropellerHand. The participants agreed that the noise disturbed the immersion (M: 5.7, SD: 0.8), although one said that “the noise is not that bad since you are distracted”. When asked whether the airflow negatively impacted the immersion, the average answer was between neutral and disagree (M: 3.5, SD: 1.6).
The answers about PropellerHand’s weight were mixed; participants neither agreed nor disagreed that our device is too heavy (M: 4.3, SD: 2.2): “Keeping your arm in the same position for long periods of time is tiring.”, “The device definitely increases immersion. At the same time it is also tiring for your arms, first due to its own weight and second because most forces are generated in the same direction as gravity.” We did not observe any gender difference in the answers about PropellerHand’s weight, although our sample was too small to analyze such a correlation. However, we believe that the user’s body size and strength might influence the perception of PropellerHand’s weight.
The general opinions on PropellerHand were positive: “Totally awesome to have these haptics, this changes a lot”. Participants described their experience as “very interesting and immersive” and said that the “haptic device emphasizes the feeling of actually doing something in reality”. Interestingly, “torque was more clearly tangible than directional force”. There were also some suggestions and criticisms, for example that “it would be more realistic if momentum was simulated [as well]” and that “picking up things seems more realistic, but the lag feels off”.
Regarding the feedback, participants told us that “it was extremely helpful to feel the drawer’s stopping” and that they “like that it gives you more information about the virtual environment”.
When asked for their favorite scenario, participants preferred those including torques: “I preferred the use case with torques, in that case I found the reaction of forces on the human body (arm) the most appropriate.” “The falling object use case was my favorite, because you could feel the impact, even if you did not look at it.”
One participant suggested keeping the propellers running at all times, such that the device carries itself and the noise is more constant. Further proposals included adding another DOF for rotation, increasing the forces, and providing feedback to both hands.