1 Introduction

Technology devices today trend toward multisensory engagement, including visual, auditory, and haptic modalities. To design for an integrated experience, current computing platforms are equipped with audio-visual displays, frameworks for seamless flow of analog-digital data, and plugins for authoring, recording, and playing back sensory media. To further enhance the sensory experience, user interactions are monitored via embedded sensors and trigger event handlers in application programs. There is currently an abundance of application programming interfaces (APIs) and editing software for designing and working with audio-visual content. Such richness in hardware technologies and software tools is, however, missing for designing coherent haptic feedback, and integrating haptics with multisensory media is therefore challenging and non-trivial. In this paper, we repurpose the audio-based framework for haptic experience design and utilize audio-based tools to add, save, personalize, and broadcast dynamic haptic media. These tools are well established, mainstream, and familiar to a large population in the entertainment, design, academic, and DIY communities; the toolkit introduced in this paper is therefore easy to use and simple to adopt.

We introduce the Stereohaptics toolkit, which allows designers, engineers, and researchers to use existing hardware and software tools for creating haptic experiences. The toolkit is composed of three main components. The first is a database of actuation and sensing technologies that are easily accessible to consumers and operate with an alternating driving signal, making them compatible with the audio frameworks in our computers, mobile devices, toys, furniture, and so on. The second is a suite of software tools used for authoring and routing audio signals, adapted here to author and design haptics. Third, at the core of the toolkit resides the Stereohaptics engine, which renders dynamic haptic feedback based on user input. The engine utilizes psychophysical models of tactile illusions to create a phantom tactile sensation and modulate its motion between two or more actuators arranged as a grid on and around the user, and it couples user inputs to parametric values of the haptic output.

The paper is organized as follows: we first present related background on haptic hardware technologies, common tactile illusions, and available audio and haptic toolkits. We then present the Stereohaptics framework and the design of a sample toolkit that uses off-the-shelf components and software, and we demonstrate the workflow with the toolkit on a user's computer. Next, we present psychophysical evidence of static and moving tactile illusions on, around, and across the user's body using two vibrating actuators (hence "stereo") and approximate the control space of moving tactile illusions in the four configurations shown in Fig. 1. We then report experiences with the sample toolkit in a series of workshops and show that the toolkit is useful for familiarizing and educating new users in haptic feedback and that it provides the necessary tools for haptic experience, interaction, and product design. Finally, we highlight the scalability and use cases of the toolkit in a variety of applications.

Fig. 1. The Stereohaptics toolkit is used in a variety of applications including wearables, handhelds, and structures.

2 Related Background

2.1 Haptic Hardware Technologies

A widely available haptic hardware technology in consumer use, often seen in cellphones and game controllers, is the unbalanced rotating-mass DC motor, which produces vibrations as a function of the applied DC voltage. Voice coils and linear resonant actuators (LRAs) are also popular due to their small packaging, reasonable cost, and simple drive circuitry. Piezo actuators and electroactive polymers (EAPs) have been used in wearables, toys, and handheld devices because of their small and flexible designs [3, 7, 27, 34]. Piezo actuators, as well as motors, are also used in skin-stretch devices to produce shear on the skin [10, 22], and electrostatic devices are designed to vary the friction between the skin and an interaction surface, or between two surfaces, to augment user interactions [4, 11]. These actuators, similar to speaker technology, give control over both output frequency and amplitude and operate on an alternating driving signal. This time-alternating haptic actuation is termed tactile or vibrotactile.

Vibrotactile devices are installed either in a single-actuator configuration (as in mobile phones) or in a "grid" configuration for varying and moving sensations on and across the body [13, 18, 20]. Surface-mounted transducers such as shakers [1, 23], vibrators [29], ultrasonic transducers [32], and electrostatic layers [4, 12] create haptic sensations by propagating acoustic waves in and away from a surface. These haptic technologies operate in bandwidths similar to those of audio channels and can therefore be driven by a typical audio framework. Other actuators, such as motors, pneumatic devices, and Peltier devices, may require special driving circuitry and control frameworks, as in [31].

2.2 Haptic Illusions

A typical vibrotactile stimulus of duration d is defined by a frequency f and an amplitude A. Other parameters that characterize the stimulus are the temporal onset interval (SOA, short for stimulus onset asynchrony) and the attack and decay fading functions, as shown in Fig. 2. Sequential stimulations at two locations on the skin create the perception of an illusory moving sensation between the two points [13, 14, 19, 30]. This moving illusion is known as Apparent Tactile Motion, shown in Fig. 2A. Simultaneous stimulations at two locations create a "phantom" percept whose intensity and location are determined by the intensities of the two real actuators. This static illusion is known as Tactile Phantom Sensations [2, 13], shown in Fig. 2B.
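For concreteness, these parameters map directly onto audio synthesis. The following minimal Python sketch (an illustrative analogue assuming the numpy and sounddevice packages; the toolkit's actual interface is built in Max) synthesizes one stimulus with linear attack/decay ramps and offsets two such stimuli by an SOA on a stereo output to form an apparent-motion pair. All parameter values are illustrative.

    import numpy as np
    import sounddevice as sd

    FS = 44100  # audio sampling rate (Hz)

    def stimulus(f=250.0, A=0.5, d=0.2, ramp=0.02):
        """Sinusoidal burst: frequency f (Hz), amplitude A, duration d (s),
        with linear attack/decay ramps of length `ramp` (s)."""
        t = np.arange(int(d * FS)) / FS
        x = A * np.sin(2 * np.pi * f * t)
        n = max(int(ramp * FS), 1)
        env = np.ones_like(x)
        env[:n] = np.linspace(0.0, 1.0, n)   # attack
        env[-n:] = np.linspace(1.0, 0.0, n)  # decay
        return x * env

    def apparent_motion(soa=0.1, **kw):
        """Stimulate the left channel, then the right, offset by `soa` (s)."""
        s = stimulus(**kw)
        total = int(soa * FS) + len(s)
        left, right = np.zeros(total), np.zeros(total)
        left[:len(s)] = s
        right[int(soa * FS):] = s
        return np.column_stack([left, right])

    sd.play(apparent_motion(soa=0.1), FS, blocking=True)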

Fig. 2. A set of sensory illusions in tactile perception.

Another illusion is Sensory Saltation, or the "cutaneous rabbit" illusion [9], shown in Fig. 2C. In this case, three brief stimuli are delivered to the first actuator, followed by three stimuli at the second actuator. The user feels a jumping sensation gradually moving from one actuator to the other. These illusions are popular in the HCI field and have been applied to developing new haptic technologies and interfaces (such as [15, 16, 17, 26, 29, 33]). Moreover, these illusions are typically evoked at contiguous body sites, such as the back, torso, forearm, palm, etc. In this paper, we show that these illusions persist on, across, and all around the user's body even when the actuators are not directly in contact with the skin. It has been shown that the frequency and amplitude of the stimulation do not affect illusory percepts [6, 14, 35], so we explore the other two control parameters, d and SOA, to evoke moving illusions in our experiments.
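Reusing stimulus() and the setup from the sketch above, a saltation pattern reduces to two burst trains (again a hedged illustration rather than the toolkit's Max implementation):

    def saltation(d=0.07, soa=0.12, **kw):
        """Three brief bursts on the left channel, then three on the right;
        `soa` is the onset-to-onset spacing between bursts (s)."""
        s = stimulus(d=d, ramp=0.01, **kw)
        gap = np.zeros(max(int(soa * FS) - len(s), 0))
        train = np.concatenate([s, gap, s, gap, s, gap])
        left = np.concatenate([train, np.zeros_like(train)])
        right = np.concatenate([np.zeros_like(train), train])
        return np.column_stack([left, right])

    sd.play(saltation(), FS, blocking=True)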

2.3 Audio and Haptic Software and Toolkits

A typical audio-based peripheral is shown in Fig. 3A. A personal laptop is equipped with audio input-output ports for microphone input and headphone output. This peripheral arrangement is standard in mobile devices (such as phones, wearables, and hand controllers), furniture, vehicles, toys, and so on. In terms of audio tools, options abound for a user to design and experience various audio content. Mainstream tools such as Maya, Pro Tools, Ableton, and Unreal Engine are only a small sample of the tools available to edit, route, store, and play back audio media. With this abundance of audio-based tools, users can easily work with their audio media to create diversified content.

Fig. 3. (A) Typical audio interfaces in a computer. (B) Framework of the Stereohaptics toolkit.

Compared to the variety and flexibility of audio interfaces and tools, haptic media tools are scarce. The few existing tools with custom APIs are mostly developed for commercial use and support only a specific kind of actuation technology. For example, Immersion Corp. has published developer tools that are compatible with haptic technologies by Immersion Corp. (www.immersion.com). Similarly, D-BOX tools and custom Maya plugins are optimized for motion platforms in theater experiences (www.d-box.com). Researchers have also proposed tools such as Tactons [5], Haptic Phonemes [8], FeelCraft [36], and Tactile Animations [28] to combine rich haptic experiences with events, activities, and media context, but each has been demonstrated with a specific haptic device.

Hapkit, another example, is a low-cost open-hardware device with one degree of freedom for introductory courses on haptics [24], and the TECHTILE toolkit by Minamizawa and colleagues [25] provides a platform for easy haptic content design. Both toolkits have raised awareness of the ease and applicability of haptics in a series of workshops and classroom lectures.

3 Stereohaptics Toolkit

Our framework utilizes audio tools and infrastructure for haptic media production and playback. A standard audio peripheral consists of two audio out-ports (stereo output) and one audio in-port (microphone input) (Fig. 3A). Paralleling this setup, our framework has outputs, inputs, and an interface (Fig. 3B). A laptop computer running an audio synthesizer tool (the interface) generates an analog waveform output from the audio out-port. The input encodes analog measurements through the audio in-port. These input-output analog signals are conditioned by our toolkit's hardware board, which comprises common audio conditioning and amplifier circuitry (Fig. 3B). The output of the board is connected to an off-the-shelf haptic actuator in contact with the user's body, and the input of the board is connected to an analog sensor that measures user activity.

The interface is a critical component for accessible and flexible design, and for routing and parsing input-output signals. The interface also implements parametric models of haptic effects, particularly models of static and moving tactile illusions (Fig. 2). As many computing devices are already equipped with two (stereo) audio out-ports, we present a toolkit with two haptic points to demonstrate the framework and user experiences.
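As one example of such a parametric model, a static phantom sensation between the two tactors can be controlled by a single normalized location. An energy-based panning rule of the kind used in the tactile-illusion literature (e.g. [13]; the exact model in our engine may differ) splits the drive amplitude as follows:

    import numpy as np

    def phantom_gains(p, A=1.0):
        """Energy-model panning: p in [0, 1] places the phantom between
        actuator 1 (p = 0) and actuator 2 (p = 1) at overall intensity A."""
        return A * np.sqrt(1.0 - p), A * np.sqrt(p)

    a1, a2 = phantom_gains(0.25)  # phantom felt a quarter of the way across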

Figure 4A shows an exemplary toolkit. The toolkit is light, inexpensive, and composed of common electronic components. It consists of a custom hardware board, two voice-coil tactors, a pair of stereo audio cables, a USB audio interface, a power adapter, and a MEMS accelerometer. The audio cables are plugged into the audio ports of a personal computer, and the computer interface was developed in the popular visual programming language Max (cycling74.com). Figure 4B shows the input-output flow of the toolkit and an exemplary use case. The accelerometer is placed under one cup and the tactor under another. The toolkit and a software interface are used to couple activity in the sensed cup to vibrations in the actuated cup [25].
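A minimal sketch of this sensor-to-actuator coupling, again in Python with numpy and sounddevice (the workshop version of this coupling is patched in Max; names and values here are illustrative): a duplex audio stream reads the accelerometer from the audio in-port and uses its envelope to modulate a vibration carrier on the out-ports.

    import numpy as np
    import sounddevice as sd

    FS = 44100
    CARRIER = 250.0  # vibration carrier frequency (Hz), illustrative
    phase = 0

    def callback(indata, outdata, frames, time, status):
        """Map the envelope of the sensed signal to vibration amplitude."""
        global phase
        env = np.abs(indata[:, 0]).mean()        # crude activity envelope
        t = (np.arange(frames) + phase) / FS
        outdata[:, 0] = env * np.sin(2 * np.pi * CARRIER * t)
        outdata[:, 1] = outdata[:, 0]            # drive both tactors alike
        phase += frames

    with sd.Stream(samplerate=FS, channels=(1, 2), callback=callback):
        sd.sleep(10_000)  # run the coupling for 10 s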

Fig. 4. (A) Exemplary toolkit using off-the-shelf parts. (B) A typical use case with the toolkit.

The tactors are enclosed in 3D-printed housings and operate at ~40–300 Hz and ~20–40 dB SL (sensation level) on the body. The board is also housed in a 3D-printed casing, and Velcro straps are used to attach the 3D-printed parts to the body and the environment. Figure 5 shows details of the hardware board. The board takes a 12 V input and drives the circuitry with regulated ±5 V. The stereo audio output passes through a pair of 1 W LM4889 audio amplifiers (Texas Instruments, USA) and outputs the waveform to a pair of Tectonic Elements exciters (model: TEAX13C02-8/RH, USA).

Fig. 5. Hardware board of the toolkit.

The toolkit also allows switching between two classes of sensor inputs with a 'sensor selection' slider switch. Measurements from microphones, accelerometers, and pulse sensors are alternating in nature and can be measured directly through the audio in-port after passing through a conditioning circuit that removes any DC bias in the sensor signal. Resistance-based sensors, such as stretch sensors, potentiometers, switches, dials, and force and pressure sensors, output a non-alternating signal that cannot be measured directly through the AC-coupled audio in-port. In this case, the sensor measurements are modulated with an oscillating signal, and the alternating sensor voltage is then digitized through the audio in-port. The bill of materials (BOM) and board layout files are accessible at: https://www.researchgate.net/publication/330672632_supplementaryzip.
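The modulation scheme can be sketched as follows (carrier frequency and filter values are illustrative assumptions; the board performs the equivalent in analog circuitry): the slowly varying sensor value amplitude-modulates an in-band carrier so that it survives the AC-coupled in-port, and the envelope is recovered digitally by rectification and low-pass filtering.

    import numpy as np

    FS = 44100
    FC = 1000.0  # modulation carrier (Hz), well inside the audio band

    def modulate(sensor):
        """Slowly varying sensor samples (0..1) amplitude-modulate a carrier."""
        t = np.arange(len(sensor)) / FS
        return sensor * np.sin(2 * np.pi * FC * t)

    def demodulate(x, cutoff=50.0):
        """Recover the envelope: rectify, then one-pole low-pass filter."""
        a = np.exp(-2 * np.pi * cutoff / FS)
        y, acc = np.empty(len(x)), 0.0
        for i, v in enumerate(np.abs(x)):
            acc = a * acc + (1.0 - a) * v
            y[i] = acc
        return y * (np.pi / 2)  # compensate the mean of a rectified sine (2/pi)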

4 Psychophysics of Stereohaptics Illusions

In this section, we demonstrate that a variety of haptic effects can be generated between two or more vibrating elements. Recent research has used vibrotactile grids to create moving illusory percepts on the palm and across the back and torso [13, 16, 17, 18, 26]. These sensory illusions can be parametrically modeled to create a variety of sensations that vary in size, direction, speed, and quality. We investigate the psychophysics of mechanical stimulation at two active points and approximate the control space of moving illusory percepts between them. We applied three popular sensory illusions (Fig. 2) to evoke static and moving tactile sensations between two voice-coil actuators placed (i) on the forearm, (ii) on each hand, (iii) across the head, and (iv) across an object in contact with the hand.

4.1 Methods

The toolkit described in the previous section is used in the experiments. The purpose of the experiments is to determine whether the moving tactile illusions (Fig. 2) can be evoked in the four actuator configurations shown in Fig. 1. A participant completes a session of eight blocks, and each block examines one of the two moving illusions (apparent motion or sensory saltation) in one of the four configurations. The illusion-configuration combinations are presented in random order.

A stimulus set is composed of three repetitions of two durations (70 ms and 200 ms) crossed with four SOAs. In the apparent tactile motion condition, one actuator fires, followed by the other after an SOA (Fig. 2A). The SOA values are 0 ms (simultaneous), 50 ms, 100 ms, and 1000 ms (completely discrete). Attack and decay fading functions are set to 10% of the stimulus duration. In the saltation condition, one actuator is triggered three times, followed by three stimulations of the other actuator (Fig. 2C). Each series of stimulations is separated by an SOA of d + 10 ms, d + 50 ms, d + 100 ms, or d + 500 ms. Attack and decay fading functions are set to 10 ms. These durations and SOAs are derived from previous research to elicit illusory, non-illusory, and no-motion sensations [6, 13, 35]. The frequency (70 Hz sinusoid) and amplitude (~25 dB SL) of the stimuli were kept constant throughout the experiment.
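The composition of one block can be written out explicitly (a sketch of the design with times in seconds; the experiment itself was run from the Max interface):

    from itertools import product
    import random

    DURATIONS = [0.07, 0.20]            # stimulus durations (s)
    AM_SOAS = [0.0, 0.05, 0.10, 1.00]   # apparent-motion SOAs (s)
    REPS = 3

    def saltation_soas(d):
        """Saltation SOAs are defined relative to the duration d."""
        return [d + 0.01, d + 0.05, d + 0.10, d + 0.50]

    # 3 repetitions x 2 durations x 4 SOAs = 24 trials per block
    trials = list(product(DURATIONS, AM_SOAS)) * REPS
    random.shuffle(trials)
    assert len(trials) == 24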

In each trial, participants report whether they feel one point, two, three, or more distinct points, or a "continuous motion" sensation from one point to the other. Participants respond verbally and the experimenter records the response. Participants wear headphones that play masking noise. Each block (24 trials) takes roughly 5 min to complete, and the whole experiment takes less than 45 min.

4.2 Results and Discussion

Ten participants (5 males, average age: 23 years) took part in the study after signing a consent form approved by an Institutional Review Board (IRB). Participants' responses were quantified by assigning a number corresponding to the number of distinct spatial points they felt. The response "more distinct points" was assigned the number 4 and "continuous motion" the number 5. The results are shown in Fig. 6. Each data point represents the average across participants, and the error bars show standard deviations. The results were further analyzed using repeated-measures ANOVAs.

Fig. 6. Approximate control-parameter space of apparent tactile motion (left) and sensory saltation (right) in the four test configurations.

Apparent tactile motion was evoked in all four actuator configurations. When the SOA was large (1000 ms), the two stimuli were perceived as two discrete vibration points. When the two stimuli were simultaneous (SOA = 0), participants reported a single vibration point in all configurations except on the hands. At the two intermediate SOAs (50 ms and 100 ms), participants consistently felt "motion" between the two points. An ANOVA reveals no significant effects of configuration [F(2.03,18.3) = 1.2, p = 0.33] or duration [F(1,9) = 4.7, p = 0.06] and a significant effect of SOA [F(1.5,13.8) = 214, p < 0.001]. All interaction terms failed to reach significance (p > 0.05). The results suggest that apparent tactile motion robustly persists on the body (arms), along a surface, and across the hands and the head.

Sensory saltation, on the other hand, was not perceived consistently across the four configurations. A three-way repeated-measures ANOVA shows significant effects of configuration [F(1.4,12.2) = 4.9, p = 0.04] and SOA [F(1.4,12.6) = 5.1, p = 0.032] and no significant effect of duration [F(1,9) = 3.3, p = 0.103]. All interactions involving the configuration factor were also significant (p < 0.05); therefore, we analyzed each configuration separately using two-way repeated-measures ANOVAs. On the hands and on the head, saltation failed to evoke illusory motion. On the arm, both duration [F(1,9) = 8.6, p = 0.017] and SOA [F(1.96,17.6) = 6.4, p = 0.009] showed significant effects; a closer examination of the factors indicated that saltation was strongest at small SOAs and short durations (70 ms). On the surface, only SOA had a significant effect [F(2.3,20.3) = 8.1, p = 0.002], indicating that sensory saltation was evoked with both long and short durations at low SOAs.

In summary, the experiments show that apparent tactile motion is robust and persists across the two test durations and four test configurations. Sensory saltation persists only on the arm and on surfaces, mostly at short durations and low SOAs, and phantom tactile illusions (the merging of two simultaneous vibrations into one virtual actuator [1, 13]) are evoked on the arm, the head, and the surface using quick (short-duration) stimulations.

5 Workshops and Activities

We have utilized the toolkit in a series of training and activity sessions, ranging from 6-hour studio workshops to 2-hour short interactive courses. The sessions are geared towards familiarizing designers, artists, engineers, and computer scientists with the state of the art in haptic feedback. The goal of each workshop is to utilize the toolkit in hands-on design activities and explore its applicability and accessibility. It is assumed that the attendees are novices in haptic feedback, with varying degrees of familiarity with audio tools and frameworks. Sample instructional material and activities are provided as supplementary material.

In a typical workshop, the session starts with a brief introduction to haptic perception and technologies, followed by demonstrations of the toolkit and activities (see the supplementary material). The activity sessions introduce attendees to the computer interface and the hardware and software tools, and prepare them to work independently or in small groups of 2-3 participants. The activities are: (i) create and modulate a haptic effect, including assembling the toolkit, gaining familiarity with the interface, and setting parameters such as frequency, amplitude, waveform, and location; (ii) create and modulate a moving illusion, including experimenting with the control spaces of the illusions in Fig. 6, experimenting with body locations, and becoming familiar with the quality of the sensory effects; (iii) explore and play with a library of tactile effects prepared for the attendees; and (iv) explore and play with the activity sensor.

All activities are prepared in the Max (cycling74.com) interface. Figure 7 shows screenshots of Activity 1 and Activity 3, in which the tactile illusions are parameterized and incorporated into the flow. Users set waveform parameters and tune the quality of both apparent tactile motion and saltation at different speeds. Finally, an interactive ideation, design, and demo session concludes the training.

Fig. 7. Screenshots of the interface during user activities.

Our workshops have been highly interactive, and participants prepared numerous everyday scenarios with haptic feedback. These ranged from guiding visually impaired shoppers with directional haptic feedback in a marketplace to broadcasting the actions of a lead rower to teammates in order to synchronize rowing. Some ideas were later pursued by participants in their academic work. Figure 8 shows some experiences from the workshops. All attendees were asked to fill out an online survey, though it was not mandatory (a survey printout is shown in the supplementary materials). Out of 54 participants, we received surveys from 22 attendees. The surveys were anonymous and asked for the user's background; familiarity with haptics, DIY tasks, open-source software, design, audio production tools, and hardware (1: beginner, 2: intermediate, 3: expert); and ratings (scale 1-5, 1: low and 5: high) of the learning experience, topics, activities, applicability, and usefulness. Finally, participants were asked for comments and feedback.

Fig. 8. Pictures from the workshop studios.

On average, attendees rated themselves as beginners in haptics and audio production tools (median = 1) and intermediate in DIY, open source, design, and hardware (median = 2). Attendees' ratings for learning (mean = 4.1, mode = 4), topics (4.2, 5), activities (4.2, 5), applicability (4.5, 5), and usefulness (4.5, 5) were generous. A few comments highlighting possible improvements were: "nice to have access to the hardware devices after the workshop, replicable either through low-cost or open hardware solutions", "the pacing feels a little uneven: lot's of time in the morning, rushed at the end", and "I expect more than vibration".

6 Applications and Use Cases

A key feature of our framework is that it is derived from audio-based tools that are abundantly available and familiar to the creative, professional, and academic workforce. We exploit this feature so that this workforce can readily apply its skills to construct new haptic technologies and applications using off-the-shelf components.

Perhaps the most common application is in entertainment settings, where coherent haptic feedback complements activities and events in video games, movies, sporting events, rides, and virtual and extended reality. Figure 9 shows a Haptic Cart that uses multiple actuators mounted on different elements of the cart and a potentiometer to measure the steering angle. The cart is integrated with a driving game and reacts to collisions, road pavement, and vehicle states. Similar frameworks can also be embedded in benches, couches, beds, and other furniture to create experiences of living environments, where accessories react to user actions as well as possess unique haptic characteristics [23].
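An event-to-effect mapping of this kind can be as simple as the following sketch, which reuses stimulus() from the sketch in Sect. 2.2 (event names and parameter values are hypothetical, not taken from the actual cart):

    # Hypothetical mapping from driving-game events to vibration bursts:
    # each entry is (carrier frequency Hz, amplitude 0..1, duration s).
    EFFECTS = {
        "collision":    (60.0, 1.0, 0.30),
        "rumble_strip": (120.0, 0.6, 0.08),
        "engine_idle":  (40.0, 0.3, 0.50),
    }

    def on_game_event(name):
        f, A, d = EFFECTS[name]
        sd.play(stimulus(f=f, A=A, d=d), FS)  # non-blocking burst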

Fig. 9. A Haptic Cart.

The toolkit can be employed in the educational domain in two ways. First, it can be used to construct academic material that enhances a user's ability to listen to, read, and comprehend stories, as in [33, 37]. Second, the toolkit can be made accessible to students enrolled in media design, engineering, psychology, and HCI courses to explore new applications, products, and experiences in the haptics domain.

The toolkit can also be used for therapeutic and assistive services. The feedback can complement relaxation and meditation, monitor and guide breathing patterns, and provide massage services, and it can be applied in smart vests, smart belts, and headgear to provide directional and awareness cues to motorcycle riders, professional athletes, and construction workers. In social domains, personal messages can be shared over the Internet and data networks on personal computers and mobile devices. We demonstrated an application in which two remote users interacting via Skype communicate personal and emotional haptic messages sensed with an accelerometer and rendered on tactors connected to our toolkit (Fig. 10).

Fig. 10. Sharing haptic experiences via the Internet.

One advantage of the Stereohaptics framework is that it applies to a variety of actuation technologies. We modified the hardware electronics to accommodate high-voltage, low-power electric pulses and created electrovibration [4] and electrotactile [21] feedback using audio-based software (see Fig. 11). Similarly, the framework can explore the full capacity of haptic technologies and trigger different sensory modes in touch, such as vibrations and electric stimulation combined, together with visuals and sound. Figure 12 shows a few extensions of the framework. Figure 12A couples hand motion to trigger a phantom object between the hands; the location and motion of the object are modeled in [35] and implemented in [15]. A head-mounted grid interface (Fig. 12B) creates 3D sensory illusions on, around, and inside the user's head. Figure 12C shows a scenario where Peltier devices are coupled with vibrotactile actuators in 3D-printed touch controllers to render an integrated thermal-vibratory tactile experience.

Fig. 11. Adaptation of the framework to electrical systems.

Fig. 12. Scalability of the framework. (A) Wearables, (B) head-mounted gear, (C) 3D-printed toys.

A key feature of the toolkit is that it senses user activities through high-speed, high-bandwidth audio in-ports. Our framework utilizes a closed-loop mechanism, in which user activities are monitored in real time and instantaneous feedback is triggered. One application of this closed-loop architecture is a haptic recorder that senses haptic features, records them as audio data, broadcasts them via audio streams, and plays them back on demand, all using audio tools. Such a recorder is useful for recording a user's activity during sports, haptic features of surfaces, impacts, and other dynamic behaviors, as shown in Fig. 13.
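Because the sensed signal is ordinary audio, such a recorder reduces to audio recording and playback. A minimal sketch under the same Python assumptions (adding the soundfile package; the file name is illustrative) is:

    import sounddevice as sd
    import soundfile as sf

    FS = 44100
    DURATION = 5  # seconds of activity to capture

    # Record the accelerometer signal arriving at the audio in-port.
    clip = sd.rec(int(DURATION * FS), samplerate=FS, channels=1, blocking=True)
    sf.write("haptic_clip.wav", clip, FS)  # store/broadcast as ordinary audio

    # Later, or on another machine: play the clip back through a tactor.
    data, fs = sf.read("haptic_clip.wav")
    sd.play(data, fs, blocking=True)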

Fig. 13. Applications of the haptic recorder.

6.1 Limitations

Although our toolkit has many advantages, it currently has limitations. First, the framework utilizes audio-based tools and input-output channels that are optimized for the audio bandwidth, roughly 20 Hz to 20 kHz, whereas haptic effects span roughly DC to 400 Hz. Therefore, low-frequency haptic effects are usually attenuated or filtered out. Second, the audio signal is generally sampled at 44.1 kHz, which is overkill for haptic signals: haptic systems construct clean feedback at a 4 kHz sampling rate, ten times slower than audio rates, so a large bandwidth resource goes unutilized. Our future work includes the development of a protocol that utilizes the broad bandwidth of an audio channel to drive multiple haptic actuators using simple multiplexing circuitry.
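One way such a protocol could work (a sketch of the idea only, not an implemented design) is frequency-division multiplexing: each actuator is assigned a carrier within the audio band, and simple band-pass and envelope circuitry on the board separates the channels.

    import numpy as np

    FS = 44100

    def multiplex(envelopes, carriers):
        """Sum several haptic envelopes, each on its own audio-band carrier,
        onto one audio channel; per-actuator band-pass filters recover them."""
        t = np.arange(len(envelopes[0])) / FS
        mix = sum(env * np.sin(2 * np.pi * fc * t)
                  for env, fc in zip(envelopes, carriers))
        return mix / len(carriers)

    # e.g. three actuators on well-separated carriers
    env = np.ones(FS)  # 1 s of constant drive (illustrative)
    one_channel = multiplex([env, 0.5 * env, 0.2 * env], [500.0, 1500.0, 3000.0])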

Third, a large variety of haptic actuators are not directly applicable in the exemplary framework, and custom plugins must be utilized in conjunction with it. For example, a separate interface was designed and used for Peltier devices and DC motors. Fourth, the current toolkit provides only two haptic outputs and one sensor input, which limits the production of rich haptic interactions that need multiple actuators and sensors. We have extended the toolkit to include mechanical as well as electrical actuators. Another extension includes modular haptic units, in which the user adds the desired number of actuator and sensor channels to a central motherboard unit. These scenarios are briefly discussed in the applications section; the possibilities, however, are limitless.

Finally, almost all computing devices are equipped with a single audio input-output pair, which is intended for sound. Our framework appropriates these channels for haptic production and thus compromises sound input and output. A proposed solution is to add another set of audio channels to computing devices or to deliver the sound through the same haptic transducers; indeed, many of the actuators utilized here are essentially speakers.

7 Conclusions

We propose a framework for designing rich and dynamic haptic interactions using audio tools. The audio infrastructure is already established and available in consumer electronics and everyday objects. This framework allows designers, artists, engineers, and everyday users to integrate and design haptic content in their projects. The framework is simple to adopt and easy to use, and it accommodates a wide variety of actuation technologies.

Utilizing the audio infrastructure for haptic experience design is not novel; it has been used in research and products to enhance user experience (see e.g. [25]). However, such haptic experiences are usually limited to homogeneous synthetic vibrations presented at one location. We present a framework that utilizes the psychophysics of moving tactile illusions between two (or more) vibrating points and extends to vibrotactile grid arrays for moving tactile sensations on, across, and around the body. We show evidence that sensory illusions of objects and motion exist between two actuators placed at proximal skin locations, across the head, on the two hands, and on surfaces external to the human body.