
Touching data with PropellerHand


Immersive analytics often takes place in virtual environments that promise users immersion. To fulfill this promise, sensory feedback such as haptics is an important component, yet one that is not well supported so far. Existing haptic devices are often expensive, stationary, or occupy the user’s hand, preventing them from grasping objects or using a controller. We propose PropellerHand, an ungrounded hand-mounted haptic device with two rotatable propellers that allows exerting forces on the hand without obstructing hand use. PropellerHand can simulate feedback such as weight and torque by generating thrust of up to 11 N in 2-DOF and torque of up to 1.87 Nm in 2-DOF. Its design builds on our experience from quantitative and qualitative experiments with different form factors and parts. We evaluated our prototype through a qualitative user study in various VR scenarios that required participants to manipulate virtual objects in different ways while switching between torques and directional forces. Results show that PropellerHand improves users’ immersion in virtual reality. Additionally, we conducted a second user study in the field of immersive visualization to investigate PropellerHand’s potential benefits there.

Graphical abstract


Virtual and augmented reality (VR/AR) enable users to inspect and interact directly with three-dimensional data, allowing for various applications in education, training, entertainment, immersive analytics, and visualization (Yoo et al. 2020; Winther et al. 2020; Paneels and Roberts 2009). Immersive analytics (IA) supports users in decision-making and data understanding through the use of immersive technologies (Dwyer et al. 2018; Fonnet and Prie 2019). However, IA still lacks haptic feedback, which undermines the promise of providing immersive environments.

Fig. 1

Left: PropellerHand consists of two propellers in rotatable cages attached to a glove. Right: A user wearing the device. The control unit and batteries are placed in a backpack to reduce the weight of the hand-mounted hardware. This figure shows a close-up of PropellerHand and a picture showing a user wearing it, a backpack with additional hardware, as well as a head-mounted display. A bundled cable connects PropellerHand to the backpack

Current VR/AR headsets primarily rely on visual and auditory output, sometimes enriched with haptic feedback through vibrating controllers. Other types of haptic feedback, such as force feedback, are nevertheless of high interest in VR, as they allow users to “touch” virtual objects or data. Such devices can produce different force strengths and directions and have shown promising results in classical VR applications (Tsai et al. 2019, 2020), telepresence (Kuling et al. 2020), and data visualization (Durbeck et al. 1998; Fritz and Barner 1999).

There are several dimensions in the design space of haptic devices, such as the type of feedback and attachment. The sensory feedback they generate can be either tactile, for example using vibration motors (Pezent et al. 2020), or kinesthetic, when they employ motion (Heo et al. 2018). Haptic devices can also be categorized into grounded and ungrounded devices, based on attachment. Grounded devices (Borro et al. 2004) are fixed to the user’s environment (see Seifi et al. 2019 for more examples). They are mostly large and expensive, but can generate strong forces with high accuracy. Ungrounded devices, on the other hand, can be handheld (Heo et al. 2018; Je et al. 2019), attached to the user’s body (Je et al. 2018), or even move on their own (Knierim et al. 2017; Abtahi et al. 2019). We focus on ungrounded kinesthetic devices, which are cheaper and lighter than grounded ones and can be used in mobile scenarios.

We extend existing work on ungrounded haptics (Heo et al. 2018; Je et al. 2018, 2019; Knierim et al. 2017; Abtahi et al. 2019; Sasaki et al. 2018) by creating a novel device that, for the first time, generates both force and torque directly on the hand via propellers. These two types of feedback can resist motion and rotation, resulting in a more immersive perception of a virtual environment, for example in immersive analytics.

Example use cases for haptic devices include games, industrial design, virtual collaboration, or data visualization (Xia 2016; Van der Meijden and Schijven 2009). Here, we can increase the immersion of interactions such as moving or rotating objects or touching them. For instance, when opening a door, haptics can simulate the torque of turning the handle and the force of pushing the door open.

We propose PropellerHand, a hand-mounted haptic device that leverages propellers to create kinesthetic feedback through forces and torques in two degrees of freedom (2-DOF) each, directly on the hand. PropellerHand is able to generate a thrust of up to 11 Newton (N) and a torque of up to 1.87 Newton-meters (Nm) with a weight of 480 g. As our device is mounted to the back of the user’s hand, they are still able to interact with or hold objects, such as VR controllers or physical tools. This allows for more flexible interaction. We also enable mobile use by integrating Bluetooth communication and batteries into our design. When users hold a VR controller, there is no need for an additional tracking setup, as the controller is already tracked and moves together with the hand. As we demonstrate in our user study, game mechanics such as opening a drawer or lifting objects become more immersive when using PropellerHand. Additionally, we conducted a second user study in the field of immersive visualization. We implemented two charts where the user can touch data in order to gain knowledge about it via haptic stimuli. Furthermore, we implemented another chart where the user can interact with and perceive force feedback. We envision potential benefits of PropellerHand in immersive visualization.

To summarize, we contribute a novel hand-mounted propeller-based force feedback device, as well as details of our design process. This process includes a formative user study evaluating different form factors, and measurements investigating the relationship of noise versus thrust of different propellers, both of which we did not find in related work. We perform a qualitative user study to evaluate our final device regarding its usability and increased immersion and provide a second user study illustrating its potential benefits in immersive visualization. Our supplementary material includes 3D models and further details on our design and process. This paper is an extension of the PropellerHand paper published at the IEEE Conference on Visual Information Communication and Interaction (VINCI) 2021 (Achberger et al. 2021).

Related work

In this section, we first show the use of haptics in mid-air interaction. Afterwards, we discuss how haptics has previously been used in visualization, what kind of haptic feedback devices exist, and how they differ from ours.

Mid-air haptics

There are multiple interactive methods for visualizations in a 3D environment, such as in VR or AR applications (Millais et al. 2018; Olshannikova et al. 2015). However, the feedback users currently receive in mid-air interaction is technically limited (Koutsabasis and Vogiatzidakis 2019). Haptic feedback has shown potential in mid-air interaction, such as decreasing the interaction time or helping blind people interact with data (Rakkolainen et al. 2020).

Few haptic feedback devices support mid-air interaction. UltraHaptics (Carter et al. 2013) is a surface device that uses ultrasound to provide multi-point haptic feedback to feel objects in mid-air. HaptoMime (Monnai et al. 2014) also uses ultrasound, enabling users to touch floating images with hands-free tactile feedback. Additionally, they show different applications of HaptoMime, such as a floating touch panel or drawing graphics.

Kim et al. (2016) presented a novel mobile haptic device held in the pointing hand that uses an external damping mechanism applying vibration stimuli in a specific direction. They evaluated the device as an assistive technology to facilitate blind users in searching targets on large wall-mounted displays.

Köpsel et al. (2016) conducted two exploratory experiments to investigate the effects of auditory, haptic, and visual feedback on hand gestures. They found that the feedback modality can be given a lower priority and should be chosen by user preference.

In contrast to our work, these devices solely focus on tactile feedback and cannot apply any force to the user. Therefore, the devices are limited in steering the user’s hand in a specific direction. Additionally, force feedback can give the users more options to interact with virtual objects.

Haptics for visualization

In the field of data visualization, haptics have become important for helping users understand their data more accurately and quickly (Durbeck et al. 1998). Data physicalization has shown multiple benefits and opportunities of making data touchable (Jansen et al. 2015). Paneels and Roberts (2009) summarized various haptic designs to provide haptic feedback for different data visualizations such as charts, maps, signs, networks, diagrams, images, and tables.

Their results show that most of the research focused on chart visualization, where they also describe the challenges and potentials of haptics. In the field of scientific visualization, Avila and Sobierajski (1996) present a haptic interaction method that is suitable for volume visualizations. Another area where haptics have a high potential in data visualization is enabling blind persons to observe and understand visualizations. Fritz and Barner (1999) present various methods to explore data with haptics without the need for visual components, such as using texture or forces. Here, we describe the force feedback device PropellerHand. We study PropellerHand with classical VR scenes and tasks, as these build the foundations for more complex interactive scenarios as used in data visualization. Afterwards, we report the potential benefits of PropellerHand in immersive visualization we collected in our second user study.

Haptic devices

We review related work on ungrounded haptic devices in the following categories: air-based, drone-based, propeller-based, and other methods.

Air-based approaches
First, we discuss approaches that use air, but not propellers, to create force feedback. The AirGlove device (Gurocak et al. 2003) forces compressed air through six nozzles to create thrust in any direction. It is able to create a realistic sensation of weight, with forces of about 7 N. A major drawback is the required compressor, which limits the range of the user’s movement due to its weight and power connection. We follow similar goals, but avoid fixed or heavy equipment to create a more portable solution. The AirWand (Romano and Kuchenbecker 2009) is a pen-shaped device with one nozzle at each end that can produce a force of about 3 N. Contrary to our approach, it only produces force feedback in one dimension. Suzuki and Kobayashi (2005) propose an AR system comprising a projection-based stereo display and force feedback via air pressure. Nozzles inside a table blow air upwards, where users receive it with a cup-shaped object in their hand. Since the compressor and nozzles are fixed to the table, users are not able to move in a larger area or to receive feedback from directions other than upwards. Contrary to PropellerHand, the three approaches above cannot create torque.

Drag:on (Zenner and Krüger 2019) does not generate airflow by itself, but instead uses two flamenco fans that cause drag when moving the handheld device. By controlling the extent of the surface of each fan separately, the device can create different amounts of drag and torque. Due to its passive nature, this device cannot generate feedback when the user’s hand is not moving. Furthermore, drag can only be felt perpendicular to the fans’ surfaces, which makes the effect dependent on the device’s orientation.

Drone-based approaches
Some related work uses drones to provide haptic feedback for objects that can move around the user. Yamaguchi et al. (2016) propose an approach where a piece of paper, fixed to a drone and stabilized by its airflow, serves as an interaction surface. BitDrones (Gomes et al. 2016) uses drones that are equipped with either an RGB LED, a shape formed by an acrylic mesh and a frame, or a small touch screen. These allow for augmented reality scenarios without the need for head-mounted displays. HapticDrone (Abdullah et al. 2017) uses a quadcopter to create force feedback either up- or downwards with about 1.5 and 3 N. TactileDrones (Knierim et al. 2017) provides tactile feedback through small drones. By hitting the user with differently shaped tips, this approach can convey the impact of arrows or the sting of a bumblebee. The drones of VRHapticDrones (Hoppe et al. 2018) are fitted with mesh surfaces or objects, to provide either a surface the user can touch or an actuator that touches the user. Alternatively, the user can also grab and move the drone to move the virtual object. Abtahi et al. (2019) employ a drone to simulate physical touch events with realistic texture, by attaching different materials, such as cloth, to each side of the drone.

These drone-based approaches allow for physical representations of virtual objects, but the use cases for drones as haptic feedback devices are limited. None of the above approaches can provide much force and they cannot be used for torque feedback.

Propeller-based handheld or user-attached devices

Closest to our work are the following propeller-based approaches: Thor’s Hammer (Heo et al. 2018) is a handheld force feedback device resembling a hammer with a large cubic head. The hammer’s head contains six propellers, one on each face, allowing it to propel the hammer in any direction with up to 4 N. Aero-plane (Je et al. 2019) uses two propellers that are fixed to a stick and can generate different downward forces to create the sensation of varying centers of mass. The device is able to convey scenarios such as a ball rolling on a plane. Leviopole (Sasaki et al. 2018) consists of two quadcopters mounted to a pole. Depending on the thrust of each of the eight propellers, it can create linear force and torque in one degree of freedom.

A major drawback of these handheld devices is that they occupy the user’s hand, which therefore cannot be used to interact with virtual or physical objects or controllers. Most current VR applications require controller interaction or direct hand interaction. If users have to hold a feedback device such as Thor’s Hammer, their interaction possibilities with data or virtual objects are very limited. We aim to transfer their propeller-based concept to a free-handed, less obstructing device.

Wind-Blaster (Je et al. 2018) is most closely related to our contribution. It uses two rotatable ducted propellers that are fixed to the user’s wrist. Participants of the conducted user study mentioned a weaker than expected force, which is something we want to improve on by using more powerful hardware. If the maximum force is too low, the user will not be able to perceive a sufficient number of different levels of force strength, which would substantially limit the interaction possibilities. The evaluation also did not test rotating the thrust direction of the propellers during usage, as their user study only provided feedback in one direction. Instead of the wrist, we attach our device directly to the user’s hand by using a glove. This position allows for more direct and stable feedback and was preferred by the participants of our pre-study on form factors.

Other approaches

This last part of related work consists of ungrounded haptics that use neither air nor propellers.

The handheld device by Yano et al. (2003) as well as the iTorqU (Winfree et al. 2009) employ a rotatable flywheel to create directional torques through the gyroscopic effect. Since we also want to generate force and not only torque, these approaches do not fit our goals. Lopes et al. (2017) propose an approach that delivers feedback via eight electric muscle stimulation pads. The electric current causes the user’s muscles to contract and thereby resist the opposing muscles, creating the illusion of a physical surface. This technique requires all pads to be connected to a power source through numerous cables, constricting freedom of movement. HapticSerpent (Al-Sada et al. 2018) is a snake-like robotic arm that is attached to the user’s waist and is able to hold objects or perform actions such as poking. While it enables interesting use cases, it cannot create haptic feedback on the user’s hand without obstructing hand usage. Wireality (Fang et al. 2020) uses strings to hold back the user’s hand and fingers in order to prevent them from moving inside virtual objects. When an object’s surface is touched, the system locks the spools of the corresponding strings, which are fixed to the user’s shoulder, and the hand or finger cannot move further. This device can only generate feedback directed towards the user and is highly complex.

Design of PropellerHand

We designed a force feedback device that consists of two propeller cages attached to a hand-mounted bridge (Fig. 1). Those cages can be rotated via servo motors to be able to generate thrust and torque in two degrees of freedom (2-DOF) each (4-DOF in total). We used mostly 3D printed parts and commonly used electronic components to facilitate rapid prototyping and reproducibility.

Propeller cage design study

As moving propellers can be dangerous, we encapsulate them in cylindrical cages covered with aluminum meshes to prevent contact with fingers or with objects in the room that might be pulled in by the airflow. We conducted a small user study to find a cage size that does not obstruct movement of the arm and hand or cause collisions with other body parts. Six people (5 m, 1 f; 24–27 years) participated in this study.

The participants played one level of a VR game (Tumble VR for PlayStation). In this game, players have to stack objects onto a plate, requiring them to move and rotate their hand while holding a controller. We repeated this procedure six times with differently sized and weighted paper mockups of our approximate device design (Fig. 2). The mockups’ shapes reflect the dimensions of possible propeller choices and we tested cylinders with a height of 60, 65, and 70 mm and diameters of 76, 102, and 127 mm. We added different amounts of weight to the mockups (132, 198, and 290 g) to simulate the small, medium, and large versions of the device. The participants were sitting on a chair, to allow investigating collisions with the legs, and the mockups were fixed to either wrist or hand. To avoid biases, we made sure that the participants did not know which mockup they wore, by attaching those in a random order and only after putting on the head-mounted display. On average, each participant spent about 17 minutes in total playing Tumble VR.

Fig. 2

The paper mockup used in our form factor pre-study. Left/right: attached to the wrist/hand. This figure consists of two images showing the paper mockup used in our form factor pre-study. In the first image, the paper mockup is attached to the user’s wrist and in the second to the hand, using two different fitness gloves

We registered a total of 11 collisions: 5 with the head and HMD and 6 with the legs. We asked the participants if they noticed any difference between the sizes and weights of the mockups, if they were restricted in their movement, and if they consciously moved differently. Four participants did not notice any differences in size and three noticed no difference in weight. Only two participants felt slightly restricted in their movements, and only with wrist-mounting. Just one participant reported that he consciously moved differently because of the mockup. Overall, the participants’ answers to our questionnaires show that even the largest and heaviest model did not obstruct usage. They also preferred mounting the device to their hand instead of their wrist, since this is more stable when moving. The hand-mounted version also seems to be more comfortable, since two participants reported sweating less with this configuration.

Based on these results, we chose the biggest of our three candidate shapes, as we assume that this allows for the most thrust. We also decided to mount our device to the hand instead of the wrist.


We chose an Arduino UNO micro-controller board due to its ease of use and wide support. The brushless motors we use to drive the propellers are of the type T-Motor F40 PROIII (T-Motor n. d.) and are similar to the ones used in Thor’s Hammer (Heo et al. 2018), but slightly stronger. To select the servo motors that rotate the propeller cages, we compared about 40 different products to find the best compromise between torque and weight (see supplemental material). The servo motors differ in size, weight, and torque. We were looking for the motor with the highest torque that fits our design regarding size and weight. We chose the Hitec HS-81 Micro Servo. As all servos we found provide either 180 degrees of rotation or continuous rotation, we opted to use gears to obtain a larger range (330 degrees). We recommend using metal gear servos, as one of our plastic ones broke during the user study. For batteries, we used the Conrad Energy Lipo with 2400 mAh and 14.8 V, and for the electronic speed controllers (ESCs), we used Pulsar A-50 with 50 A. We chose these components because they match the motors’ power requirements. For the communication between a PC and the Arduino we use an HC-05 Bluetooth module.
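The servo-plus-gear range extension can be sketched as follows; only the two range values come from our design, while the helper names and the idea of computing the servo command from the target cage angle are an illustration:

```cpp
// Sketch of the servo/gear arrangement described above: a 180-degree
// servo drives the cage through a gear stage so that the cage covers
// 330 degrees. Function names are ours, for illustration only.
const double SERVO_RANGE_DEG = 180.0;
const double CAGE_RANGE_DEG  = 330.0;

// cage rotation per unit of servo rotation
double gearRatio() { return CAGE_RANGE_DEG / SERVO_RANGE_DEG; }

// servo angle that places the cage at a given target angle
double servoAngleFor(double cageAngleDeg) {
    return cageAngleDeg / gearRatio();
}
```

A full 330-degree cage sweep thus maps back onto the servo's full 180-degree travel.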

We compared four propellers with different diameter, shape, and blade count that fit the cage size we found in our pre-study. For this comparison, we performed experiments to measure thrust and noise at different power levels using a custom-built thrust stand. Based on the results of these experiments, we chose a four-bladed propeller with a diameter of 127 mm.


Our software consists of several parts. For controlling motors and measuring thrust with the load cell, we wrote small C/C++ programs for the Arduino. The VR scenarios for our user study and the code for sending commands to the Arduino are written in C# using Unity. We implemented two safety measures: propellers turn off when they get too close to the user’s head or when the controller’s trigger is double-clicked.
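As an illustration of the PC-to-Arduino link, the command serialization could look like the following sketch (written in plain C++ rather than our Unity C# code; the packet layout, field names, and checksum are hypothetical, not the actual protocol):

```cpp
#include <array>
#include <cstdint>

// Hypothetical command frame for the Bluetooth link: one byte each for
// a start marker, per-propeller thrust (0-255 mapped to PWM duty) and
// cage angle (0-255 mapped to 0-330 degrees), plus a checksum so the
// Arduino could discard corrupted frames. This layout is illustrative.
struct PropellerCommand {
    uint8_t thrustLeft;
    uint8_t thrustRight;
    uint8_t angleLeft;
    uint8_t angleRight;
};

std::array<uint8_t, 6> encode(const PropellerCommand& c) {
    std::array<uint8_t, 6> pkt{};
    pkt[0] = 0xAA;  // start-of-frame marker
    pkt[1] = c.thrustLeft;
    pkt[2] = c.thrustRight;
    pkt[3] = c.angleLeft;
    pkt[4] = c.angleRight;
    pkt[5] = uint8_t(pkt[1] + pkt[2] + pkt[3] + pkt[4]);  // additive checksum
    return pkt;
}
```

A fixed-size binary frame like this keeps per-command latency low on a serial Bluetooth link compared to a text protocol.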

Resulting device

PropellerHand (including the glove) measures about 470\(\times\)135\(\times\)50 mm and weighs about 480 g (Fig. 1). We are confident that we can further reduce the weight with a more sophisticated 3D design, as motors, propellers, and cables weigh less than 130 g. The total cost of components amounts to about 225 Euros (about 270 USD), excluding 3D printed parts. Controller, batteries, and Bluetooth module (840 g in total) are placed in a small backpack to minimize the mass attached to the user’s hand. With our current battery capacity of 2\(\times\)2400 mAh, PropellerHand can provide feedback for at least two hours. The actual operating time will vary depending on the duration and strength of feedback, but batteries are quick to change and can be replaced by ones with higher capacities when needed. Our device does not require any calibration or tracking setup prior to use, as we assume that the hand will be tracked anyway in a usual VR use case, either via controller or VR glove, to which PropellerHand could be directly attached.

Technical evaluation

To avoid unnecessary noise, and since the generated thrust already seemed strong enough, we did not use the maximum amount of power but only 55.6% of it (as regulated by pulse-width modulation (PWM)). For our force measurements, we used a load cell from which we suspended one of the propeller cages such that it created a downward thrust. With two propellers combined, we measured a maximum force of 11 N, a minimum force of 0.17 N, and a maximum torque of 1.87 Nm, see Fig. 3.

Fig. 3

The thrust and noise measurement of one propeller. We increased the power every 5 s; therefore, we can see steps in the thrust measurements

This means that PropellerHand can provide more thrust than Wind-Blaster (Je et al. 2018) (1.5 N) and Thor’s Hammer (Heo et al. 2018) (4 N) and less than Aero-Plane (Je et al. 2019) (14 N).

Another important metric is the consistency of thrust, since users might notice fluctuations and therefore feel less immersed. When running the propeller for 10 s, all measured values were inside a range of 0.03 N (SD: \(5.3\times 10^{-5}\) N) (Fig. 4). We did not perceive these fluctuations ourselves, and they are even smaller when less thrust is generated. This low variation means that PropellerHand can produce the same amount of force over a long time, which is important for many use cases.

Fig. 4

The thrust consistency measured with a single propeller with an intensity of 27%

As our device can produce varying amounts of thrust as well as rotate the propeller cages, we measured two kinds of latency: the reaction time from a control signal to full target thrust and the time needed to rotate the cage to a target angle. We measured a latency of 683 ms for a still-standing propeller to reach full thrust. Compared to non-propeller-based force feedback devices, this is a high value; therefore, we recommend predicting collisions in advance to compensate for this latency. For a 330 degree rotation of the cage, our device requires 429 ms with standing propellers and 833 ms when running at full thrust. It depends on the interactive use case whether this rotation time is sufficient. For simulating weight, the rotation time was acceptable; for use cases where many full rotations are required in a short period of time, the latency can be perceived as too high.
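The collision-prediction idea can be sketched as below; the spin-up constant comes from our latency measurement, while the constant-speed approach assumption and the function names are an illustration:

```cpp
#include <limits>

// Sketch: start spinning the propellers early when the hand is
// predicted to reach a virtual object within the measured spin-up
// latency (683 ms from standstill to full thrust). We assume, for
// illustration, that the hand approaches at roughly constant speed.
const double SPIN_UP_SECONDS = 0.683;

// Estimated time until contact; infinity if the hand is not approaching.
double timeToContact(double distanceMeters, double approachSpeedMps) {
    if (approachSpeedMps <= 0.0)
        return std::numeric_limits<double>::infinity();
    return distanceMeters / approachSpeedMps;
}

bool shouldPreSpin(double distanceMeters, double approachSpeedMps) {
    return timeToContact(distanceMeters, approachSpeedMps) <= SPIN_UP_SECONDS;
}
```

For example, a hand 0.3 m from an object and approaching at 1 m/s would trigger an early spin-up, since contact is expected well within the 683 ms latency.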

We measured the noise level of PropellerHand with a decibel meter placed 1 m away, obtaining a maximum sound pressure of 102 dB and a minimum of 65 dB.

User study 1

We first evaluated the influence of our device on immersion in a user study with four scenarios that require PropellerHand to simulate force and torque in varying amounts and directions.

Study design

We randomly recruited six people from our university’s campus to participate in the study (5 m, 1 f; 24–34 years). Five assessed their VR experience as beginner and one as advanced. None of them had previous experience with kinesthetic haptic feedback.

The participants used an HTC Vive Pro head-mounted display (HMD) and its controller and wore earplugs during the study. For increased hygiene, we provided each participant with their own single-use rubber glove and sanitized the HMD after each use. We attached our device to the participant’s dominant hand (all were right-handed). The participants were standing and free to move in an area of about 4\(\times\)4 m. To reduce noise and especially annoying frequencies, we limited the power through PWM to 28% of the maximum, limiting thrust to 5.1 N and torque to 0.87 Nm, resulting in a noise level of 90 dB.

We proceeded as follows: After the participant signed a consent form, we gave a brief introduction to the safety features and study procedure. Each participant experienced four scenarios, each of them first without haptic feedback and then with PropellerHand. In each of the scenarios, the participants had to complete a different task, as described in detail in the following subsection. The participants were allowed to familiarize themselves with the virtual environment for as long as they wished, but for at least 30 s. After each scenario, the participants answered a questionnaire that included 7-point Likert scale questions (1: very strongly disagree–7: very strongly agree). The questions asked to what extent PropellerHand increased the immersion in VR. In this paper, we use the definition of immersion from Witmer et al. (2005), who define immersion as “a psychological state characterized by perceiving oneself to be enveloped by, included in, and interacting with an environment that provides a continuous stream of stimuli and experiences”. We explained this definition to our participants and subsequently asked them to rate their perceived level of immersion. After all four scenarios, a concluding questionnaire inquired about general feedback for PropellerHand, possible improvements, and which scenario the participants preferred and why.


We created four scenarios that allowed us to evaluate our device with different types of user-object interactions. To provide a more comfortable environment, we situated these scenarios in a virtual room and a grove which we retrieved from the Unity Asset Store (Unity Asset Store 2020) (Fig. 5). The scenarios require PropellerHand to simulate both force and torque separately for the first three scenarios and to switch between them within the last scenario. This differentiates our scenarios from those used in related work, which only tested either force or torque (Je et al. 2018, 2019; Heo et al. 2018).

Fig. 5

The four scenarios of our user study. a, b Blocks of cheese and daggers that participants had to sort by weight. c A catching device with which items falling from the sky were to be caught. d Participants had to unfasten screws and place them in drawers. This figure consists of four screenshots of our user study’s four VR scenarios. The first screenshot shows a desk with five blocks of cheese placed on it. One block is grabbed by the user’s hand, shown in blue. The second screenshot shows empty chests, two which are annotated with medium and heavy (the light one is outside of the image). In addition, the user’s hand is shown grasping a dagger to put it in a chest. The third screenshot shows two pans attached together by their handles. With this construction, the user tries to catch a falling piece of meat. The fourth and last picture shows the hand opening a drawer below a desk. A screw is placed on the desk for the user to pick up and place inside the opened drawer

S1: moving objects with different weights In order to investigate the simulation of physical weight, we told participants to sort five identical-looking pieces of cheese by weight into boxes (Fig. 5a). We assigned each piece a different weight that PropellerHand then simulated by varying the produced thrust. Once a user grabbed and lifted a piece, PropellerHand oriented its propeller cages such that the produced thrust was directed downwards (airflow upwards), using the HTC controller’s pitch value.
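The cage orientation for this weight feedback can be sketched as follows; the 165-degree neutral position (the middle of the 330-degree range, taken here as thrust-down for a level hand) is an assumed zero point for illustration:

```cpp
// Sketch of S1's weight feedback: to keep the thrust pointing straight
// down in world space while the hand pitches, the cage counter-rotates
// by the controller's pitch angle. The 165-degree neutral position is
// an assumption made for this illustration, not a measured calibration.
double cageAngleForDownwardThrust(double handPitchDeg) {
    double angle = 165.0 - handPitchDeg;  // counter-rotate against pitch
    if (angle < 0.0) angle = 0.0;         // clamp to the 0-330 degree
    if (angle > 330.0) angle = 330.0;     // mechanical range
    return angle;
}
```

Updating this angle every frame from the tracked controller keeps the simulated weight pulling downwards regardless of how the hand is tilted.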

S2: daggers producing different torques In this scenario, participants grabbed identical-looking daggers at their grips and held them horizontally, with the blade pointing to their left (Fig. 5b). Each blade had a different weight and therefore produced a different torque on the participant’s hand. Here, PropellerHand oriented one propeller’s thrust downwards and the other one’s upwards, creating torque around the arm. As before, the participants sorted the daggers by weight and placed them in empty boxes.
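The resulting torque can be worked out as a force couple; note that the cage separation used below is our back-of-the-envelope estimate derived from the reported maxima (5.5 N per cage yielding 1.87 Nm), not a measured dimension:

```cpp
// Sketch of S2's torque: thrusting one cage up and the other down with
// equal per-cage force F at separation d yields a pure couple of
// tau = F * d. The separation value is estimated from the reported
// maximum thrust and torque, for illustration only.
const double CAGE_SEPARATION_M = 0.34;

double coupleTorqueNm(double perCageThrustN) {
    // equal and opposite forces a distance d apart form a pure couple
    return perCageThrustN * CAGE_SEPARATION_M;
}
```

Scaling the per-cage thrust then directly scales the perceived torque, which is how the different blade weights were conveyed.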

S3: catching falling items To also test torque in another direction, this scenario employs items that fall from the sky and have to be caught by the participant (Fig. 5c). For this task, we provided a virtual catching device that resembles two pans glued together at their handles. Depending on the items’ weight, the torque will be slightly different, allowing the user to perceive collisions with different object weights. We implemented three different falling objects with different weights: a block of cheese, a piece of meat, and an onion. The catching device itself was not assigned a weight to reduce strain and allow us to measure the effect more clearly.

S4: multiple desk interactions In this scenario, we tasked participants with unfastening three screws and placing them in three drawers (Fig. 5d). This motion required PropellerHand to quickly switch between producing force and torque. When the participants grabbed and twisted the screw, the propellers were rotated such that they produced a torque simulating the screw’s friction. After removing the screw, PropellerHand simulated its weight by orienting the thrust downwards. Next, the participants had to place each screw in one of the drawers. We simulated the drawers’ resistance during opening and closing by orienting the thrust against the direction of movement. When a drawer was completely opened, increased thrust conveyed that it was stopped from moving further.
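The drawer feedback can be sketched as a simple state rule; the function name and the friction magnitude are our illustration, with the stop force set to the 5.1 N thrust cap used in the study:

```cpp
// Sketch of S4's drawer feedback: thrust opposes the drawer's motion to
// simulate friction, and a stronger burst conveys the end stop. The
// friction magnitude is illustrative; the stop force reuses the 5.1 N
// thrust cap from the study setup.
const double FRICTION_N = 2.0;
const double STOP_N     = 5.1;

// velocity > 0: opening, velocity < 0: closing. Returns the signed
// feedback force along the drawer axis (negative opposes opening).
double drawerFeedbackN(double velocity, bool fullyOpen) {
    if (fullyOpen && velocity > 0.0) return -STOP_N;  // block further opening
    if (velocity > 0.0)  return -FRICTION_N;          // resist opening
    if (velocity < 0.0)  return  FRICTION_N;          // resist closing
    return 0.0;                                       // no motion, no drag
}
```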


Overall, participants enjoyed using our device. They strongly agreed that PropellerHand increased the immersion in VR (mean (M) of 7-point Likert scale: 5.9).

Fig. 6
figure 6

The results of the 7-point Likert Scale of the 6 participants (7 means very strongly agree and 1 very strongly disagree). The boxes indicate the first and third quartile, the lines the minimum and maximum values, and the X-symbols the mean value

S1: moving objects with different weights We asked the participants whether they felt the weight of the objects and whether they perceived these weights to be different. All of them described the weight perception as very distinct (M: 5.8, SD: 0.8) and easy to distinguish (M: 6, SD: 0.9), see Fig. 6. One participant reported that the noise was louder for higher weights and that the “lag between picking up things and the fans turning on feels odd”. All agreed that PropellerHand increased the immersion of the virtual environment (M: 6, SD: 0.9). Everyone sorted the pieces correctly.

S2: daggers producing different torques In this scenario, we asked the participants about their perception of torque. All participants stated that they could feel the torque when grabbing the daggers (M: 6.5, SD: 0.5) and a torque difference between the different blades (M: 6.5, SD: 0.5), see Fig. 6. One participant mentioned that some torques did not fit the daggers because they were visually identical, but felt different. However, this was necessary to avoid visual biases from influencing the results. As in the first scenario, all participants reported feeling more immersed when using PropellerHand (M: 6.5, SD: 0.8), with one participant mentioning that the “torque was captured well”. Again, all objects were sorted correctly.

S3: catching falling items After they finished the third scenario, we asked participants whether they were able to feel the impact of falling objects as well as a difference depending on the object’s type. Every participant reported feeling a clear difference between the strength of impacts (M: 6, SD: 1.3), see Fig. 6. All participants clearly perceived the impact of falling objects (M: 5.7, SD: 0.8), but three mentioned a noticeable delay between seeing and feeling it. Due to this delay, participants reported a smaller increase in immersion than in S1 and S2 (M: 5.5, SD: 1.4). Furthermore, it seems that fast hand movements make it harder to perceive impacts.

S4: multiple desk interactions In this scenario, the participants were asked whether they perceived forces with different strengths and directions depending on the object they interacted with. They strongly agreed that they felt different strengths (M: 6.2, SD: 0.8) and directions (M: 6, SD: 1.3), see Fig. 6. While interacting with the drawers, one participant said “oh that is cool” when he felt the drawer’s collision with the stopper. Others described the torques as easier to feel than the forces and stated that there is “only a slight delay in the haptic feedback when pulling out a drawer to the limit”. All participants told us they enjoyed this scenario and agreed that PropellerHand made them feel more immersed in the virtual environment (M: 5.7, SD: 1.5).

Concluding questionnaire and summary After participants finished all the scenarios, we asked them to answer a final questionnaire with general questions about PropellerHand. The questions included which scenario they preferred, whether the noise or airflow was disturbing, and whether PropellerHand was too heavy. We also asked them for general feedback about PropellerHand. The participants agreed that the noise disturbed the immersion (M: 5.7, SD: 0.8), although one said that “the noise is not that bad since you are distracted”. When asked whether the airflow negatively impacted the immersion, the average answer was between neutral and disagree (M: 3.5, SD: 1.6).

The answers about PropellerHand’s weight were mixed; participants neither agreed nor disagreed that our device is too heavy (M: 4.3, SD: 2.2): “Keeping your arm in the same position for long periods of time is tiring.”, “The device definitely increases immersion. At the same time it is also tiring for your arms, first due to its own weight and second because most forces are generated in the same direction as gravity.” We did not observe gender differences in the answers about PropellerHand’s weight, although we did not have enough participants to analyze such correlations. However, we believe that the user’s body size and strength might influence the perception of PropellerHand’s weight.

The general opinions on PropellerHand were positive: “Totally awesome to have these haptics, this changes a lot”. Participants described their experience as “very interesting and immersive” and said that the “haptic device emphasizes the feeling of actually doing something in reality”. Interestingly, “torque was more clearly tangible than directional force”. There were also some suggestions and criticisms, for example that “it would be more realistic if momentum was simulated [as well]” and that “picking up things seems more realistic, but the lag feels off”.

Regarding the feedback, participants told us that “it was extremely helpful to feel the drawer’s stopping” and that they “like that it gives you more information about the virtual environment”.

When asked for their favorite scenario, participants preferred those including torques: “I preferred the use case with torques, in that case I found the reaction of forces on the human body (arm) the most appropriate.” “The falling object use case was my favorite, because you could feel the impact, even if you did not look at it.”

One participant suggested keeping the propellers running at all times, such that the device carries itself and the noise is more constant. Further proposals included adding another DOF for rotation, stronger forces, and providing feedback on both hands.

User study 2

In a second small-scale exploratory user study, we investigated the potential benefits of PropellerHand in immersive visualization scenarios. To that end, we implemented different visualizations whose data can be inspected or interacted with using PropellerHand. Touching data via haptic feedback devices is a new and largely unexplored research area. We thus focused on a qualitative and exploratory study design instead of seeking confirmatory statistical results. In doing so, we follow recommended practices for early design stages, which caution against prematurely focusing on statistical, comparative evidence so as not to suppress or even eliminate novel and creative ideas (Greenberg and Buxton 2008). The same argument has been made for studies in the area of data physicalization (Jansen et al. 2015), which is conceptually related to our endeavor of “touching data”.

Fig. 7
figure 7

The three implemented charts of the case study. a The line chart, where PropellerHand generates torque on the user’s hand depending on the chart’s slope. b The bar chart, where PropellerHand simulates the height of the bars. c The interactive bar chart, where the user can move spheres to update the data of the chart

Study design

We decided to follow an exploratory study setup similar to the one used to evaluate Thor’s Hammer (Heo et al. 2018). To that end, we randomly recruited five people from our university’s campus to participate in the study (4 m, 1 f; 24–34 years). Two assessed their VR experience as beginner, one as experienced, and one as expert.

We used the same setup as in our previous user study (Sect. 5.1) and proceeded as follows: After the participant signed a consent form, we gave a brief introduction to the safety features and study procedure. Each participant experienced three different immersive visualization scenarios with PropellerHand, as detailed below. In each scenario, the participants had one minute to explore the data, touch the graph, and experience the haptic feedback. Afterwards, the participants took off the HMD and we asked them questions about the data, such as where the highest slope was or what the highest value was. At the beginning of the study, we had explained to the participants that we would ask such questions, to motivate them to inspect the data instead of just playing around. After the three scenarios, we asked our main qualitative questions about the haptic experience. Specifically, we asked (i) whether they could imagine PropellerHand having a benefit in analyzing data graphs and which benefit they would expect, (ii) whether they felt that PropellerHand supported them in answering the questions, (iii) whether they saw potential value in using PropellerHand for analyzing data graphs, and (iv) what worked well or not well in the scenarios.


For our user study, we implemented three scenarios with two different chart types: a line chart and a bar chart, both built in Unity, where PropellerHand provides forces while investigating the data. In the third scenario, the user can additionally interact with the bar chart.

Line chart investigation In this scenario, we placed a line chart in front of a wall, see Fig. 7a. The line is modeled as a 3D cylinder with a diameter of 5 cm. The whole chart is 2.7 m long with a height of 1 m. When the user holds their virtual hand inside the line, PropellerHand produces forces on the user’s physical hand that depend on the data point being touched. PropellerHand produces a torque on the roll axis of the user’s hand, see Fig. 7a. Depending on the slope of the touched line, PropellerHand changes the torque direction and strength: the higher the slope, the stronger the torque. If the slope is positive, the torque is applied counterclockwise on the user’s hand, and vice versa.
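The slope-to-torque mapping could look like the following sketch. The gain and the clamping to the device's maximum torque are hypothetical; the paper only specifies that steeper slopes yield stronger torques and that the sign sets the direction:

```python
def slope_to_torque(slope: float, gain: float = 0.5,
                    max_torque_nm: float = 1.87) -> float:
    """Map the slope at the touched point to a roll torque.
    Positive slope -> positive (counterclockwise) torque, and vice versa;
    the result is clamped to the device's reported 1.87 Nm maximum."""
    torque = gain * slope
    return max(-max_torque_nm, min(max_torque_nm, torque))
```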

Bar chart investigation This scenario contains a bar chart with six bars in front of a wall, see Fig. 7b. The bars’ width and depth are 20 cm. When the user touches a bar, PropellerHand produces a force downwards or upwards depending on the bar’s data value. The higher the value of the bar, the stronger the force. For positive values, PropellerHand produces a force upwards, and vice versa.
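Analogously, the bar value-to-force mapping might be sketched as follows (the normalization against a maximum value is our assumption):

```python
def bar_value_to_force(value: float, max_value: float,
                       max_force_n: float = 11.0) -> float:
    """Map a bar's data value to a vertical force: upwards (positive) for
    positive values, downwards for negative, scaled by the value's magnitude
    relative to max_value and clamped to the device's thrust limit."""
    if max_value <= 0:
        raise ValueError("max_value must be positive")
    return max(-1.0, min(1.0, value / max_value)) * max_force_n
```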

Bar chart interaction In this scenario, we implemented a bar chart that can be interactively updated. The type and size of the bar chart are the same as in the Bar Chart Investigation scenario. We added an interactive slider in front of each bar, see Fig. 7c. The slider can be moved forward and backward. When the user moves the slider, the data updates and the height of the bars changes accordingly. The slider has an end position in each direction. If the user reaches this position, PropellerHand produces a force in the opposite direction in order to signal that they reached the limit.


In the following, we report our qualitative results collected from our participants for each scenario, as well as general feedback.

Line chart investigation Three participants perceived the scenario with PropellerHand as engaging because of its novelty and playful experience. However, three participants criticized that PropellerHand distracted them because they were not yet used to it. Participants reported multiple benefits of PropellerHand in the line chart scenario. Two participants told us that PropellerHand could convey an additional dimension, such as the derivative of the function. Another participant explained that PropellerHand can present data more dramatically and emphasize its impact: “you perceive the slope more explicitly via haptic than visually”. In addition, one person noticed that PropellerHand helped them to better orient themselves along the line of the chart. One participant was happy that PropellerHand triggered their explorer instinct and made them want to compare the data.

However, there were also negative comments about the experience with PropellerHand in the line chart example. Some participants were distracted by PropellerHand’s noise and airflow. One participant with long hair was scared that their hair would be caught in PropellerHand’s rotors. Another participant was surprised by sudden jumps in force, as we used discrete values in our line chart. He said “sometimes, the jumps in the force increase was a bit surprising, I think continuous data would feel better”.

Bar chart investigation In the bar chart investigation scenario, four participants described the investigation of the data with PropellerHand as engaging. Three participants reported a benefit in using PropellerHand when comparing data. One of them could imagine that feeling small data differences is easier than only seeing them. The participants mentioned that the forces and force directions matched the data well. One person told us that he thinks “the data peaks are represented more dramatically with haptic feedback and have an emotional impact. Therefore, I have the feeling they are easier to remember.” Compared to the line chart scenario, three participants told us that they preferred the bar chart because the bars were easier to touch than the line.

However, two participants criticized that PropellerHand did not help much because they were more focused on the visual feedback. Another participant said “I was distracted by the noise and the air flow of the device.” One participant mentioned that the downward force when touching negative bars felt more dramatic than the upward forces.

Bar chart interaction In this scenario, four participants could clearly perceive the maximum value of the slider. Haptic feedback helped two persons in manipulating the data without looking at the slider and staying within the designated space: “You can manipulate the data without looking at the slider because you feel when you reach the maximum. This is helpful.” Another participant told us they could imagine haptics being beneficial when manipulating complex data, because the eyes can focus on the data instead of the manipulation via the UI.

However, two persons criticized that the latency was too large to simulate a realistic impact. One participant said “when you reach the end position of the slider, there is a delay of the haptic feedback. Therefore, it does not really feel like an impact”. One participant expected the force to increase when getting closer to the slider’s endpoints. Another person perceived the force at the slider’s endpoints as too strong. Additionally, two participants told us that PropellerHand was distracting because interacting with the chart and feeling the haptic feedback was too much fun.

General feedback The participants told us they see a general benefit in using PropellerHand for immersive visualization because it adds an additional sense. They felt that this might help them better remember the data and perceive data peaks as “more dramatic”. In addition, they told us that PropellerHand can convey an additional dimension that cannot be represented visually.

The participants also named other chart types where they think PropellerHand could provide benefits. They mentioned 3D plots and multidimensional graphs, where haptics could render one dimension. They also suggested graphs with nodes and edges, where the force could be directed towards the node with the strongest edge. Another participant mentioned pie charts, to present large values more dramatically. One person explained that he could imagine using PropellerHand with flowcharts, where PropellerHand could apply forces in the flow’s direction.

We also asked the participants for which groups of people they could imagine PropellerHand providing a benefit in immersive visualizations. Two participants proposed blind or visually impaired people, who could investigate data by touching it. Another participant thought that teachers could benefit from PropellerHand by teaching their students mathematics in a more exciting and engaging way. Two persons explained that people who present data to others could benefit from PropellerHand: they could use the haptic feedback to present data peaks more dramatically and emotionally, so that the audience remembers the data better and perceives it more clearly.


In the following section, we discuss findings from our two user studies.

User study 1

There are many use cases where PropellerHand can simulate force and torque convincingly. For example, the participants in our user study enjoyed being able to perceive the stopping of drawers and the impact of caught objects, feedback that does not require them to keep their eyes on objects while interacting with them. However, due to its latency and maximum force, PropellerHand has limitations in simulating realistic collisions. We see more potential in the simulation of soft resistances than impact forces, such as simulating weight, pressing against soft objects, or simulating currents. The noise level increases with the strength of the force, so we recommend reducing the power of the motors if communication is more important than high forces during the task. To decrease the perceived weight of PropellerHand, one participant suggested keeping the propellers running at all times, such that the device carries itself. However, PropellerHand cannot compensate its own weight and simulate torque at the same time. We therefore decided against this approach: whenever PropellerHand has to simulate torque, the user would feel PropellerHand’s weight again, which could result in confusion and unrealistic behavior.

We believe PropellerHand can also be used in general mid-air haptics to enable interaction, but we see a limitation when PropellerHand obscures important information due to its size.

In addition, the study results showed that PropellerHand was capable of providing perceivable torque for different abstract use cases.

User study 2

The benefit and potential of immersive visualization is still relatively unexplored. Some research either does not fully exploit the possibilities of immersive visualization or overestimates its power. Kraus et al. (2021) mention four areas where they see potential for immersive visualization: situated visualizations, spatial data analysis with spatial tasks, collaboration, and presentation. We also see potential in these areas, especially in presentation, which also came up in our immersive visualization user study. The finding that PropellerHand makes certain data points more dramatic is interesting and helpful for presentation. For example, consider a pie chart showing current costs, where we want to convince a person that one part of the costs is too high. We could use PropellerHand to render this area of the pie chart with an additional strong force so that the data is perceived more dramatically. We can imagine that haptics can help the person better understand that these costs are too high and should be reduced as soon as possible.

PropellerHand helped the participants follow the line in the line chart. However, in our scenario, we visualized only a single line. With multiple lines, distinguishing them would become a problem. Yu et al. (2000) used friction to distinguish lines in a chart. We could add a further modality to distinguish lines, such as sound, which Ramloll et al. (2000) used to find values along lines.

One idea we had during the immersive visualization user study was to visualize the chart around the user instead of on a flat wall. To investigate our charts, users had to take several steps to observe all the data, and sometimes did not know in which direction to move. If we visualized the chart around the user with a radius of the user’s arm length, they could potentially investigate the whole dataset without needing to walk, at the cost of having to turn.

Comparing the two studies we conducted, we noticed only a few differing design characteristics. We observed that the delay of the force feedback was more problematic in the first user study than in the second one. We believe this is because participants in the visualization scenarios did not have a clear expectation of the haptic feedback, in contrast to more realistic use cases such as opening a drawer. Additionally, we noticed that the force strength can be lower in immersive visualization applications than in the scenarios of the first study. In the second user study, participants reported that the force strength fit well, even at data peaks, although we used smaller forces than in our first study, where some participants mentioned that the force was too weak in some situations.


The current version of PropellerHand still has several limitations. Due to its propeller-based design, noise will be an issue independent of the application, even when wearing earplugs or headphones. In addition, wearing earplugs can reduce the benefit of multi-sensory interaction. For now, we hope that further research will result in quieter propellers; meanwhile, we recommend wearing active noise-canceling headphones to reduce the noise level without suppressing important audio feedback.

Currently, the device cannot provide thrust in the left-right direction or rotate and move the user’s hand in different directions at the same time. This limitation affects both immersive visualization and game applications. However, in immersive visualization, interactive charts can be designed in a constrained way to mitigate the missing DoF. This also means that users have to hold their hand in certain ways depending on the intended direction of thrust.

Some participants mentioned that PropellerHand is too heavy, an issue that we plan to address with an improved design. Both force and torque are produced with a certain delay that decreases immersion in some use cases, such as interacting with charts with continuous data. This limitation arose more in the first study than in the second. The delay between visual and haptic feedback could be reduced by letting the propellers always run at low RPM.

Furthermore, software-side methods such as collision prediction could be used. We also note that our studies are purely exploratory in nature with small participant numbers. We opted for these qualitative and exploratory study designs and did not seek confirmatory statistical results. In doing so, we follow recommended practices for early design stages, which caution against prematurely focusing on statistical, comparative evidence so as not to suppress or even eliminate novel and creative ideas (Greenberg and Buxton 2008). With maturing hardware for haptic feedback, however, gathering statistical evidence through larger studies will be an interesting line of future work.
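A minimal form of such collision prediction could estimate the time to contact from the hand's current velocity and start spinning the propellers up in advance. The sketch below assumes constant hand velocity over the spin-up horizon; all names and the 150 ms spin-up time are hypothetical:

```python
def time_to_contact(hand_pos, hand_vel, target_pos, spinup_s=0.15):
    """Estimate the time until the hand reaches a target along its current
    velocity, and report whether the propellers should start spinning up now
    to hide the actuation delay (spinup_s is a hypothetical spin-up time)."""
    dx = [t - p for t, p in zip(target_pos, hand_pos)]   # offset to target
    speed2 = sum(v * v for v in hand_vel)                # squared hand speed
    if speed2 == 0:
        return float("inf"), False                       # hand not moving
    ttc = sum(d * v for d, v in zip(dx, hand_vel)) / speed2
    return ttc, 0 <= ttc <= spinup_s
```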


We propose a new ungrounded force feedback device that is worn on the user’s hand. Compared to prior work, we include more powerful motors and propellers. Additionally, PropellerHand can produce thrust and torque and can quickly switch between them. We also contribute our design process, including a user study on form factors and quantitative experiments on propellers. The user study we conducted to evaluate our final design shows promising results regarding the improved immersion. It also revealed current limitations and provided us with ideas and suggestions for further improvements. Additionally, we conducted an exploratory user study with a focus on immersive visualization. Results show potential benefits in using PropellerHand in immersive visualization.

In the future, we plan to combine our device with haptic-feedback gloves such as the Manus Prime II Haptic. We also want to further improve our design and evaluate upcoming versions with more extensive user studies. We also want to conduct another user study on data visualization with blind or visually impaired participants and to further investigate the impact of perceiving data more emotionally and dramatically via PropellerHand. We also plan to implement scientific visualizations such as flow visualization, where PropellerHand produces forces depending on the data. Regarding design improvements, we want to add two more propellers or another rotation axis for additional degrees of freedom.


  • Unity Technologies (2020) Unity Asset Store - VR beginner: the escape room.

  • Abdullah M, Kim M, Hassan W, Kuroda Y, Jeon S (2017) HapticDrone: an encountered-type kinesthetic haptic interface with controllable force feedback: initial example for 1D haptic feedback. In: UIST ’17. ACM, pp 115–117.

  • Abtahi P, Landry B, Yang J, Pavone M, Follmer S, Landay JA (2019) Beyond the force: using quadcopters to appropriate objects and the environment for haptics in virtual reality. In: CHI ’19. ACM, pp 1–13.

  • Achberger A, Heyen F, Vidakovic K, Sedlmair M (2021) PropellerHand: a hand-mounted, propeller-based force feedback device. In: IEEE conference on visual information communication and interaction (VINCI), pp 1–8

  • Al-Sada M, Jiang K, Ranade S, Piao X, Höglund T, Nakajima T (2018) HapticSerpent: a wearable haptic feedback robot for VR. In: CHI EA ’18. ACM, pp 1–6.

  • Avila RS, Sobierajski LM (1996) A haptic interaction method for volume visualization. In: Proceedings of seventh annual IEEE visualization’96. IEEE, pp 197–204

  • Borro D, Savall J, Amundarain A, Gil JJ, Garcia-Alonso A, Matey L (2004) A large haptic device for aircraft engine maintainability. Comput Graph Appl 24(6):70–74.


  • Carter T, Seah SA, Long B, Drinkwater B, Subramanian S (2013) UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In: Proceedings of the 26th annual ACM symposium on user interface software and technology, pp 505–514

  • Durbeck LJK, Macias NJ, Weinstein DM, Johnson CR, Hollerbach JM (1998) SCIRun haptic display for scientific visualization. In: Phantom users group meetings

  • Dwyer T, Marriott K, Isenberg T, Klein K, Riche N, Schreiber F, Stuerzlinger W, Thomas BH (2018) Immersive analytics: an introduction. In: Immersive analytics. Springer, pp 1–23

  • Fang C, Zhang Y, Dworman M, Harrison C (2020) Wireality: enabling complex tangible geometries in virtual reality with worn multi-string haptics. In: CHI ’20. ACM, pp 1–10.

  • Fonnet A, Prie Y (2019) Survey of immersive analytics. IEEE Trans Vis Comput Graph 27(3):2101–2122


  • Fritz JP, Barner KE (1999) Design of a haptic data visualization system for people with visual impairments. IEEE Trans Rehabilit Eng 7(3):372–384


  • Gomes A, Rubens C, Braley S, Vertegaal R (2016) BitDrones: towards using 3D nanocopter displays as interactive self-levitating programmable matter. In: CHI ’16. ACM, pp 770–780.

  • Greenberg S, Buxton B (2008) Usability evaluation considered harmful (some of the time). In: ACM Proceedings of the human factors in computing systems (SIGCHI), pp 111–120

  • Gurocak H, Jayaram S, Parrish B, Jayaram U (2003) Weight sensation in virtual environments using a haptic device with air jets. JCISE 3(2):130–135.


  • Heo S, Chung C, Lee G, Wigdor D (2018) Thor’s hammer: an ungrounded force feedback device utilizing propeller-induced propulsive force. In: CHI ’18. ACM, pp 1–11.

  • Hoppe M, Knierim P, Kosch T, Funk M, Futami L, Schneegass S, Henze N, Schmidt A, Machulla T (2018) VRHapticDrones: providing haptics in virtual reality through quadcopters. In: MUM 2018. ACM, pp 7–18.

  • Jansen Y, Dragicevic P, Isenberg P, Alexander J, Karnik A, Kildal J, Subramanian S, Hornbæk K (2015) Opportunities and challenges for data physicalization. In: ACM Proceedings of the human factors in computing systems (SIGCHI), pp 3227–3236

  • Je S, Lee H, Kim MJ, Bianchi A (2018) Wind-blaster: a wearable propeller-based prototype that provides ungrounded force-feedback. In: SIGGRAPH ’18. ACM, Article 23.

  • Je S, Kim MJ, Lee W, Lee B, Yang X-D, Lopes P, Bianchi A (2019) Aero-plane: a handheld force-feedback device that renders weight motion illusion on a virtual 2D plane. In: UIST ’19. ACM, pp 763–775.

  • Kim K, Ren X, Choi S, Tan HZ (2016) Assisting people with visual impairments in aiming at a target on a large wall-mounted display. 86:109–120


  • Knierim P, Kosch T, Schwind V, Funk M, Kiss F, Schneegass S, Henze N (2017) Tactile drones—providing immersive tactile feedback in virtual reality through quadcopters. In: CHI EA ’17. ACM, pp 433–436.

  • Köpsel A, Majaranta Päivi IP, Huckauf A (2016) Effects of auditory, haptic and visual feedback on performing gestures by gaze or by hand. Behav Inf Technol 35(12):1044–1062


  • Koutsabasis P, Vogiatzidakis P (2019) Empirical research in mid-air interaction: a systematic review. Int J Hum Comput Interact 35(18):1747–1768


  • Kraus M, Klein K, Fuchs J, Keim DA, Schreiber F, Sedlmair M (2021) The value of immersive visualization. IEEE Comput Graph Appl 41(4):125–132


  • Kuling IA, Gijsbertse K, Krom BN, van Teeffelen KJ, van Erp JBF (2020) Haptic feedback in a teleoperated box & blocks task. In: International conference on human haptic sensing and touch enabled computer applications. Springer, pp 96–104

  • Lopes P, You S, Cheng L-P, Marwecki S, Baudisch P (2017) Providing haptics to walls & heavy objects in virtual reality by means of electrical muscle stimulation. In: CHI ’17. ACM, pp 1471–1482.

  • Millais P, Jones SL, Kelly R (2018) Exploring data in virtual reality: comparisons with 2D data visualizations. In: Extended abstracts of the 2018 CHI conference on human factors in computing systems, pp 1–6

  • Monnai Y, Hasegawa K, Fujiwara M, Yoshino K, Inoue S, Shinoda H (2014) HaptoMime: mid-air haptic interaction with a floating virtual screen. In: Proceedings of the 27th annual ACM symposium on User interface software and technology, pp 663–667

  • Olshannikova E, Ometov A, Koucheryavy Y, Olsson T (2015) Visualizing big data with augmented and virtual reality: challenges and research agenda. J Big Data 2(1):1–27


  • Paneels S, Roberts JC (2009) Review of designs for haptic data visualization. IEEE Trans Haptics 3(2):119–137


  • Pezent E, O’Malley MK, Israr A, Samad M, Robinson S, Agarwal P, Benko H, Colonnese N (2020) Explorations of wrist haptic feedback for AR/VR interactions with Tasbi. In: CHI EA ’20. ACM, pp 1–4.

  • Rakkolainen I, Freeman E, Sand A, Raisamo R, Brewster S (2020) A survey of mid-air ultrasound haptics and its applications. IEEE Trans Haptics 14(1):2–19


  • Ramloll R, Yu W, Brewster S, Riedel B, Burton M, Dimigen G (2000) Constructing sonified haptic line graphs for the blind student: first steps. In: Proceedings of the fourth international ACM conference on Assistive technologies, pp 17–25

  • Romano JM, Kuchenbecker KJ (2009) The AirWand: design and characterization of a large-workspace haptic device. In: ICRA ’09. IEEE, pp 1461–1466.

  • Sasaki T, Hartanto RS, Liu K-H, Tsuchiya K, Hiyama A, Inami M (2018) Leviopole: mid-air haptic interactions using multirotor. In: SIGGRAPH ’18. ACM, Article 12.

  • Seifi H, Fazlollahi F, Oppermann M, Sastrillo JA, Ip J, Agrawal A, Park G, Kuchenbecker KJ, MacLean KE (2019) Haptipedia: accelerating haptic device discovery to support interaction & engineering design. In: CHI ’19. ACM, pp 1–12.

  • Suzuki Y, Kobayashi M (2005) Air jet driven force feedback in virtual reality. Comput Graph Appl 25(1):44–47.


  • T-Motor. (n. d.) F40 PROIII.

  • Tsai H-R, Rekimoto J, Chen B-Y (2019) Elasticvr: providing multilevel continuously-changing resistive force and instant impact using elasticity for VR. In: Proceedings of the CHI conference on human factors in computing systems, pp 1–10

  • Tsai H-R, Hung C-W, Wu T-C, Chen B-Y (2020) ElastOscillation: 3D Multilevel force feedback for damped oscillation on VR controllers. In: Proceedings of the 2020 CHI conference on human factors in computing systems, pp 1–12

  • Van der Meijden OAJ, Schijven MP (2009) The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review. Surg Endosc 23(6):1180–1190.

  • Winfree KN, Gewirtz J, Mather T, Fiene J, Kuchenbecker KJ (2009) A high fidelity ungrounded torque feedback device: the iTorqU 2.0. In: World haptics ’09. IEEE, pp 261–266.

  • Winther F, Ravindran L, Svendsen KP, Feuchtner T (2020) Design and evaluation of a VR training simulation for pump maintenance. In: CHI EA ’20. ACM, pp 1–8.

  • Witmer BG, Jerome CJ, Singer MJ (2005) The factor structure of the presence questionnaire. Presence Teleoper Virtual Environ 14(3):298–312

  • Xia P (2016) Haptics for product design and manufacturing simulation. IEEE Trans Haptics 9(3):358–375.

  • Yamaguchi K, Kato G, Kuroda Y, Kiyokawa K, Takemura H (2016) A non-grounded and encountered-type haptic display using a drone. In: SUI ’16. ACM, pp 43–46.

  • Yano H, Yoshie M, Iwata H (2003) Development of a non-grounded haptic interface using the gyro effect. In: HAPTICS ’03. IEEE, pp 32–39.

  • Yoo S, Kim S, Lee Y (2020) Learning by doing: evaluation of an educational VR application for the care of schizophrenic patients. In: CHI EA ’20. ACM, pp 1–6.

  • Yu W, Ramloll R, Brewster S (2000) Haptic graphs for blind computer users. In: International workshop on haptic human–computer interaction. Springer, pp 41–51

  • Zenner A, Krüger A (2019) Drag:On: a virtual reality controller providing haptic feedback based on drag and weight shift. In: CHI ’19. ACM, pp 1–12.


Partially supported by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy – EXC 2120/1 – 390831618.


Open Access funding enabled and organized by Projekt DEAL.

Author information

Authors and Affiliations


Corresponding author

Correspondence to Alexander Achberger.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit

Cite this article

Achberger, A., Heyen, F., Vidackovic, K. et al. Touching data with PropellerHand. J Vis (2022).


Keywords

  • Immersive analytics
  • Haptics
  • Ungrounded force feedback
  • Propeller-based haptics
  • Virtual reality