1 Haptic Perception

This section gives a short summary of relevant topics from the scientific disciplines dealing with haptic perception. It is intended to reflect the current state of the art to the extent necessary for an engineer designing a haptic system. Physiologists and psychophysicists are therefore asked to forgive simplifications and imprecision. For all engineers, Fig. 2.1 gives a general block diagram of haptic perception that forms a conscious \(\hookrightarrow \) percept from a \(\hookrightarrow \) stimulus.

Fig. 2.1

© Springer Nature, all rights reserved

Block diagram of haptic perception and the corresponding scientific areas investigating the relationships between single parts as defined in [1, 2]

Going through the blocks of this diagram: the mechanical properties of the skin as the stimulus-transmitting apparatus are dealt with in Chap. 3. Section 2.1.1 deals with the characteristics of mechanoreceptors in the skin and the locomotion system, while Sect. 2.1.2 introduces the psychophysical methods used to evaluate these characteristics. In Sects. 2.1.3 and 2.1.4, thresholds and supra-threshold parameters of human haptic perception are presented.

1.1 Physiological Basis

This section deals with the physiological properties of the tactile and kinaesthetic receptors as defined in the previous chapter (Sect. 1.4.2). We will not cover neural activity in detail, but only look at a general model that is useful for a closer look at multimodal systems.

1.1.1 Tactile Receptors and Their Functions

From a histological view, there are four different sensory cell types in glabrous skin and two additional sensory cell types in hairy skin. They are located in the top 2 mm of the skin as shown in Fig. 2.2. The sensory cells in glabrous skin are named after their discoverers, while the additional cells in hairy skin have functional names [3]. Because of the complex mechanical transmission properties of the skin and other body parts like vessels and bone, compression and shear forces in the skin and—for high-frequency stimuli—surface waves are expected in the skin as a reaction to external mechanical stimuli. These lead to various pressure and tension distributions in the skin, which are detected differently by the individual sensory cells. In general, sensory cells near the skin surface react only to adjacently applied stimuli, while more deeply located cells like the Ruffini endings and Pacinian corpuscles also react to stimuli applied farther away. These differences are presented in the following for the well-researched receptors in glabrous skin. For hairy skin, less information is available. While tactile disks are assumed to exhibit properties similar to Merkel cells because of the similar histology of the receptor, hair follicle receptors are assumed to detect movements on the skin surface. The following sections concentrate on tactile receptor cells in glabrous skin because of the higher technical relevance of these areas.

Fig. 2.2

Figure adapted from [4] © Springer Nature, all rights reserved

Histology of tactile receptors in a glabrous and b hairy skin.

To investigate the behavior of an individual sensory cell, a single nerve fiber is contacted with an electrode and the electrical impulses in the fiber are recorded as shown in Fig. 2.3 [5]. The results presented in the following paragraphs are based on such measurements, which are very complicated to conduct on a living organism or a human test person.

Fig. 2.3

Figure adapted from [5] © Springer Nature, all rights reserved

Recording of electrical impulses of a single mechanoreceptor with microneurography.

From Sensory Cells to Mechanoreceptors and Channels

When reviewing literature about the physiology of haptic perception, several terms are used seemingly interchangeably. Since microneurography often does not allow a distinct mapping of a sensory cell to the contacted nerve fiber, a formal separation between a single sensory cell as given in Fig. 2.2 and the term mechanoreceptor has been established [4]. A \(\hookrightarrow \) mechanoreceptor is defined as

Definition  Mechanoreceptor  An entity consisting of one or more sensory cells, the corresponding nerve fibers and the connection to the central nervous system.

The classification of a mechanoreceptor is based on the size of the \(\hookrightarrow \) receptive fields and the adaptation behavior of the receptor when a constant pressure stimulus is applied. The receptive field denotes the area on the skin in which an external mechanical stimulus will evoke a nervous impulse on a single nerve fiber. The size of the receptive field depends on the number of sensory cells that are connected to the investigated nerve fiber. Tactile mechanoreceptors exhibit either small (normally indicated with I) or large receptive fields (indicated with II). The adaptation behavior is classified as slowly adapting (SA) or rapidly adapting (RA, sometimes also called fast adapting (FA)). With these designations, four mechanoreceptor types can be defined, which are shown in Table 2.1. This nomenclature is based on a biological view. In addition to these biologically motivated terms, the term channel is found in the psychophysical literature to describe the connection between sensory cells and brain. A channel is defined as

Table 2.1 Receptive fields of the tactile mechanoreceptors in glabrous skin. The table gives size and form of the receptive fields based on data from [4, 6–9]

Definition  channel  “Functional/structural pathway in which specific information about the external world is passed to some location in the brain where the perception of a particular sensory event occurs” (Quote from [10, p. 49])

The difference from the definition of mechanoreceptors is the integration of functional processes like masking into the channel model. In general, the terms for channels and mechanoreceptors are used synonymously. The channels are named NP-I (RA-I receptor, NP standing for Non-Pacinian), NP-II (SA-II receptor), NP-III (SA-I receptor) and PC (RA-II receptor) [11]. There is experimental evidence for the presence and involvement of four channels in haptic perception in glabrous skin, but only three channels in hairy skin [12]. When using the channel model to describe haptic interaction, one has to be aware that certain aspects of interaction like surface properties and reactions to static stimuli cannot be fully explained by it [13, Chap. 4]. In this book, this discrepancy is not discussed in detail in favor of a primitive-based description of interactions that involves perception and motion control as well. An overview of the different terms for the description of tactile perception is given in Fig. 2.4.

Fig. 2.4

Relation of the terms sensory cell, mechanoreceptor and channel using the NP-I channel as an example. The NP-I channel consists of RA-I mechanoreceptors, which are based on Meissner corpuscles as sensory cells. NP-I and NP-III channels process signals of multiple sensory cells, while NP-II and PC channels are based on the signals of single sensory cells [11]

Spatial Distribution of Mechanoreceptors

The spatial distribution of the different mechanoreceptors depends on the skin region considered. For the skin of the hand there is a varying distribution depending on the depth of the mechanoreceptors in the skin: near-surface receptors (RA-I, SA-I) show a higher density in the fingertips than in the palm, while deeper localized receptors show only a slight dependency on the skin region. This is shown in Fig. 2.5.

Fig. 2.5

Figure based on [4] © Springer Nature, all rights reserved

Innervation density of mechanoreceptors near and far from the skin surface. The greater innervation density of Merkel and Meissner receptors leads to a higher spatial resolution of quasi-static stimuli.

The highest density of receptors is found at the fingertips and adds up to 250 \({\text {receptors}}/\mathrm{{cm}}^2\) [14] (primary source [15]). Thereof 60% are Meissner corpuscles, 60% are Merkel disks and 5% Ruffini endings and Pacinian corpuscles respectively [16]. Because of the high spatial density it can be assumed that a mechanical stimulus will always stimulate several receptors of different types. However, it is not the density but the absolute number of mechanoreceptors that is approximately the same for different users [17]. Because of that, small hands are more sensitive than large hands. An inverse study was done by Miller et al. [18], in which a simulated population of receptors was exposed to a virtual stress and the nervous signals were calculated and qualitatively compared to known neurological processes. The study was able to confirm a multitude of hypotheses from neurobiology and is a fascinating read.

Functions of Receptors and Channels

In addition to the physiological and histological differences of the mechanoreceptors described above, channels differ in further functional properties [10, 11]:

Frequency Dependency:

Channels are sensitive in different frequency ranges. While NP-II and NP-III channels are sensitive to (quasi-)static and low-frequency stimuli (up to about 20–50 Hz), the PC channel becomes sensitive at about 50 Hz and detects stimuli up to a frequency of 10 kHz. The main reason for the frequency dependency lies in the biomechanical structure of the receptors [5, 19]. To investigate the frequency dependence of channels, psychophysical measurements using masking schemes are employed. This leads to different results for the sensitivity and frequency selectivity of the individual channels depending on the measurement procedure.

Thresholds:

Each channel exhibits thresholds that are independent from the other channels. An aggregation of the information from different channels takes place in the central nervous system [10]. There is also no evidence of crosstalk between channels [20]. Recent studies find evidence for a linear behavior of the channels and the aggregation process (Sect. 2.1.4).

Summation Properties:

Summation describes the property of a channel to consider several temporally or spatially contiguous stimuli as a single stimulus. The reasons for summation lie in the neural mechanisms conducting impulses through the nerve fibers [21]. There are also assumptions of summation mechanisms in the central nervous system that compensate for sensitivity differences of the different receptors in a channel [22, 23]. Studies show no spatial summation for (quasi-)static stimuli [24], but only for dynamic ones [25].

Temperature Dependency:

Thresholds and frequency dependency are influenced by temperature. A stronger dependency on temperature is attributed to the NP-II and PC channels compared to NP-I and NP-III [11]. Other studies assess temperature-induced threshold changes at the glabrous skin of the hand starting at a frequency of 125 Hz, but find no effect on the hairy skin of the forearm [26]. In addition, the mechanical properties of the skin themselves exhibit a temperature dependency [27].

Table 2.2 gives an overview of the discussed properties for each channel. It also includes information about the coding of the channels referring to kinematic measures like deflection, velocity (change of deflection) and acceleration (change of velocity) [5, 6].

Table 2.2 Selection of relevant parameters of tactile sensory channels. Averaged values or common ranges are given when more than one source is considered. For sizes of receptive fields see Table 2.1

Based on these properties, the functions of the different channels in perception and interaction can be identified [30–33].

 

NP-I (RA-I, Meissner corpuscle):

Most sensory cells in human skin belong to the NP-I channel. They exhibit a lower spatial resolution than SA-I receptors, but have a higher sensitivity and a slightly larger bandwidth. The corresponding sensory cells are called Meissner corpuscles and exhibit a biomechanical structure that makes them insensitive to quasi-static stimuli.

The RA-I receptors are sensitive to stimuli acting tangentially to the skin surface. They are important for the detection of slip of hand-held objects and the associated sensorimotor control of grip forces. Together with the PC channel they are relevant for the detection of vibration frequencies [34, 35].

The NP-I channel can detect bumps with a height of just \({2}\; \upmu \mathrm{{m}}\) on an otherwise flat surface, if there is a relative movement between surface and skin. This movement leads to a deformation of the papillae of the skin by the object. The resulting reaction forces and deformations lie in a frequency band that activates the RA-I receptors [36]. Similarly, filter properties of surface structures are used for the design of force sensors [37].

NP-II (SA-II, Ruffini ending):

The SA-II receptors in this channel are sensitive to lateral elongation of the skin. They detect the direction of an external force, for example while holding a tool. The NP-II channel is more sensitive than the NP-III channel, but has a much lower spatial resolution.

This channel also transmits information about the position of limbs, when joint flexion induces skin elongation. The SA-II receptors are therefore also relevant for kinaesthetic perception. With specific stimulation of the NP-II channel, an illusion about the position of limbs can be generated [38, 39].

NP-III (SA-I, Merkel disk):

 The NP-III channel with Merkel disks right under the skin surface is sensitive to strains in normal and shear directions. Because of the slow adaptation of the channel, the high density of sensory cells and the high spatial resolution (less than 0.5 mm although the receptive field is larger than that) it is used to detect elevations, corners and curvatures. It is therefore the basis for the detection of object properties like form and texture.

Because of the coding of intensity and intensity changes, this channel is responsible (together with the RA-I channel) for reading \(\hookrightarrow \) Braille. Studies also show an effect of the channel when wrist and elbow forces are to be controlled [40].

PC (RA-II, Pacinian corpuscle):

The PC channel with rapidly adapting RA-II receptors exhibits the largest receptive fields and the largest sensitivity bandwidth. It is mainly used to detect vibrations arising in the usage of tools. These vibrations originate in the contact of the tool with the environment and are transmitted by the tool itself. They allow, for example, the identification of surface properties with a stiff tool [41]. Because of the very high sensitivity (vibration amplitudes of just a few nanometers can be detected by the PC channel [11]), sensory cells located further away from the point of stimulation also contribute to perception by reacting to propagating surface waves [3, 42]. To suppress the influence of dynamic forces arising from the movement of limbs on the perception of the PC channel, the Pacinian corpuscles exhibit a strong high-pass characteristic with slopes up to 60 dB per decade. This is realized by the biomechanical structure of the sensory cells.

In interactions, RA-II receptors signal that something is happening, but do not necessarily contribute to the actual interaction. In addition, contributions of the PC channel to the detection of surface roughness and texture are assumed [7].

1.1.2 Kinaesthetic Receptors and Their Functions

For kinaesthetic perception there are two known receptor groups [29, 43, 44]. The so-called neuromuscular spindles consist of muscle fibers with nerve fibers wound around them; they are placed parallel to the skeletal muscles. Because of that placement, strain of the skeletal muscles can be detected. Histologically, they consist of two systems, the nuclear bag fibers and the nuclear chain fibers, which react to intensity change and intensity, respectively [45].

The second group of receptors are the Golgi tendon organs. These are located mechanically in series with the skeletal muscles and detect mechanical tension. They are used to control the force the muscle exerts as well as the maximum muscle tension. Special forms of the Golgi organs exist in joints, where extreme joint positions and ligament tension are detected [29]. They react mostly to intensity. Figure 2.6 shows these three types of receptors.

Fig. 2.6

Figure adapted from [46] © Wiley, all rights reserved

Kinaesthetic Sensors for muscular tension (Golgi Tendon Organ) and strain (Bag and Chain Fibers).

The dynamic requirements on kinaesthetic sensors are lower compared to tactile sensors, since the extremities exhibit a low-pass behavior. The requirements with regard to relative resolution are comparable to the tactile system. Proximal joints exhibit a higher absolute resolution than distal joints: the hip joint can detect angle changes as small as \({0.22}^\circ \), while the smallest detectable angle change at the finger joints increases to \({4.4}^{\circ }\) [43]. This is because of the greater influence of proximal joints on the position error of an extremity. The position accuracy increases with increasing movement velocity [44].

Kinaesthetic perception is supported by information from the NP-II channel, from the vestibular organ responsible for body balance, and from visual control by the eye. In contrast to the tactile system, the kinaesthetic system does not code intensities or their changes, but exhibits some sort of sense for the effort needed to perform a movement [47, 48]. For applications such as rotary knobs this means that a description based on movement energy as a function of rotation angle correlates better with user ratings than the widespread description based on torque as a function of rotation angle.
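Expressed as a simple relation (the notation here is chosen for illustration and is not taken from the cited studies), such an energy-based description integrates the torque characteristic \(\tau \) over the rotation angle \(\varphi \):

$$\begin{aligned} E(\varphi ) = \int _{0}^{\varphi } \tau (\tilde{\varphi })\, {\text {d}}\!{\tilde{\varphi }} \end{aligned}$$

According to the cited result, knobs with similar energy characteristics \(E(\varphi )\) would be expected to receive similar user ratings, even if their torque profiles differ.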

1.1.3 Other Sensory Receptors

The skin also includes sensory receptors for thermal energy [49] as well as pain receptors. The latter are attributed a protective function, signaling pain when tissue is mechanically damaged [50]. Both aspects are not discussed in detail in this book because of their minor technical importance.

1.1.4 Neural Processing

Haptic information detected by the tactile and kinaesthetic mechanoreceptors is coded and transmitted by action potentials on the axons of the involved neurons to the central nervous system. The coding of the information corresponds to the properties given in Table 2.2 and is illustrated in Fig. 2.7.

Fig. 2.7

Figure adapted from [5] © Springer Nature, all rights reserved

Action potentials of sensory cells when a single stimulus is exerted on the skin.

The biochemical processes taking place in the cells are responsible for the temporal summation (when several action potentials reach a dendrite of a neuron within a short interval) and the spatial summation (when action potentials of more than one receptor arrive at the same neuron) of mechanoreceptor signals. Each individual action potential would not be strong enough to evoke a relay of the signal through the neuron [51]. For the rest of this book, further neurophysiological considerations are of minor importance; they can be found in any standard physiology or neurophysiology textbook.

A more interesting question for the design of haptic systems is the synthesis of the different information from the haptic, visual and acoustic senses into an unconscious or conscious \(\hookrightarrow \) percept and a resulting action. While the neural processes are investigated in depth [52], there is no comprehensively confirmed theory about the processing in the central nervous system (spinal cord and brain).

Current research favors a Bayesian framework, which also incorporates former experiences when assessing information [53, 54]. Figure 2.8 gives a schematic description of this process, which is supported by current studies [55, 56].

Fig. 2.8

Sensory integration according to Helbig [57]. Prior knowledge is combined with current sensory impressions into a percept of the situation. Based on a gain/loss analysis, a decision is made and an interaction using the effectors (limbs, speech) is initiated

1.2 Psychophysical Description of Perception

The investigation of perception processes, that is, of the link between an objectively measurable \(\hookrightarrow \) stimulus and a subjective \(\hookrightarrow \) percept, is the task of psychophysics, a branch of experimental psychology. It was established by G. T. Fechner in the middle of the 19th century [58]. As shown in Fig. 2.1, there are several branches of psychophysical study. Inner psychophysics deals with the connection between neural activity and the formation of percepts, while outer psychophysics investigates the reaction to an external stimulus. These branches were already established by Fechner [1]. Nowadays, modern technologies also allow the investigation of neurophysiological problems linking outer stimuli with neural activity and the analysis of correlations between neurophysiology, inner psychophysics and outer psychophysics.

For the design of haptic systems we will concentrate on outer psychophysics, since only physical properties of stimuli and the corresponding subjective percepts allow the derivation of design parameters and design goals. Therefore the remainder of this chapter only deals with procedures and parameters from outer psychophysics. It describes the main principles that should be understood by every system engineer in order to interpret psychophysical studies correctly.

1.2.1 The Psychometric Function

Regardless of the kind of sense, the description of perception is not possible with general engineering tools. Perception processes are neither linear nor stationary, because the perception process of inner psychophysics cannot be described that way. Looking at Fig. 2.8 this is obvious, since the weighting and decision processes and the risk assessment cannot be described in a universal way.

Because of that, perception processes in outer psychophysics are not described by specific values but by probability functions. From these functions, specific values can be extracted. Figure 2.9 gives an example of such a \(\hookrightarrow \) psychometric function. On the x-axis the intensity of an arbitrary stimulus \(\varPhi \) is plotted, while the y-axis gives the probability p that a test person detects a stimulus of that intensity.

Fig. 2.9

Psychometric function based on a normal distribution with \(c_\theta = 50\) stimulus units, guess rate \(p_\text {G}\) and lapse rate \(p_\text {L}\)

According to [59] the psychometric function \(p_\varPsi \) has a general mathematical description according to Eq. (2.1).

$$\begin{aligned} p_\varPsi (\varPhi ,c_\theta ,c_\sigma ,p_\text {G},p_\text {L})&= p_\text {G} + (1 - p_\text {G} - p_\text {L}) \cdot f(\varPhi ,c_\theta ,c_\sigma ) \end{aligned}$$
(2.1)

with a stimulus \(\varPhi \) and the following parameters:  

Base Function f:

The base function determines the form of the psychometric function. In the literature, different approaches for the base function can be found. Often a cumulative normal distribution (Eq. (2.2)), a sigmoid function (Eq. (2.3)), or a Weibull distribution (Eq. (2.4)) is used:

$$\begin{aligned} f_\text {cdf}(c_\theta , c_\sigma , \varPhi )&= \frac{1}{c_\sigma \sqrt{2\pi }} \int _{- \infty }^{\varPhi }{e^{\frac{-(t-c_\theta )^2}{2 c_\sigma ^2}} {\text {d}}\!{t}} \end{aligned}$$
(2.2)
$$\begin{aligned} f_\text {sig}(c_\theta , c_\sigma , \varPhi )&= \frac{1}{1 + e^{-\frac{c_\sigma }{c_\theta }(\varPhi -c_\theta )}} \end{aligned}$$
(2.3)
$$\begin{aligned} f_\text {wei}(c_\theta , c_\sigma , \varPhi )&= 1- e^{-(\frac{\varPhi }{c_\theta })^{c_\sigma }} \end{aligned}$$
(2.4)

Nowadays there is no computational limit on calculating these functions and extracting values from them; therefore the choice of a base function depends on the prior experience of the experimenter. When investigating the visual sense, a Weibull distribution will fit data better [60]; when working with \(\hookrightarrow \) Signal Detection Theory (SDT), a normal distribution is assumed [61]. Sigmoid functions are often used in early simulation studies because of the low computational effort needed to calculate psychometric functions. The current state of the art in mathematics would also allow a non-model-based description of the psychometric function. In psychophysics these approaches have rarely been used so far, although first studies show a performance comparable to model-based descriptions [62].

Base Function Parameters \((c_\theta ,~c_\sigma )\):

These two parameters can be interpreted as the perception threshold and the sensitivity of the test person. \(c_\theta \) gives the threshold, denoting the stimulus \(\varPhi \) with a detection probability of 0.5. The sensitivity parameter \(c_\sigma \) is reflected in the slope of the psychometric function at the threshold level. A high sensitivity yields a steep slope, while a low sensitivity yields a flatter curve, resulting in many false detections.

Guess Rate \(p_\text {G}\):

The guess rate gives the portion of false-positive answers for very low stimulus intensities that normally cannot be detected by the test person. Such false-positive answers can arise from guessing, erroneous answers, or distraction of the test person. In simulating psychometric functions, the guess rate is used to model forced-choice answering paradigms.

Lapse Rate \(p_\text {L}\):

The lapse rate models false-negative answers, i.e. cases in which a large stimulus intensity is not detected by a test person. The main reason for lapses is inattentiveness of the test person during a psychophysical procedure.

To find a psychometric function, psychophysics provides a number of procedures, which are addressed in the next section. It has to be kept in mind that all of the above and the following is valid not only for a stimulus with changing intensity, but also for any other kind of changing signal parameter such as frequency, energy, or proportions between different signals.
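As an illustration, the following minimal Python sketch evaluates Eq. (2.1) with the sigmoid base function from Eq. (2.3). The parameter values are arbitrary examples for demonstration, not measured data.

import numpy as np

def f_sig(phi, c_theta, c_sigma):
    # Sigmoid base function according to Eq. (2.3)
    return 1.0 / (1.0 + np.exp(-(c_sigma / c_theta) * (phi - c_theta)))

def p_psi(phi, c_theta, c_sigma, p_guess=0.0, p_lapse=0.0):
    # Psychometric function according to Eq. (2.1): guess and lapse rate
    # compress the base function into the range [p_guess, 1 - p_lapse]
    return p_guess + (1.0 - p_guess - p_lapse) * f_sig(phi, c_theta, c_sigma)

# Example: threshold at 50 stimulus units, 2AFC answering paradigm (guess rate 0.5),
# 2 % lapses caused by inattentive answers
phi = np.linspace(0.0, 100.0, 11)
print(p_psi(phi, c_theta=50.0, c_sigma=10.0, p_guess=0.5, p_lapse=0.02))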

1.2.2 Psychometric Procedures

The general goal of a psychometric procedure is the determination of a psychometric function \(p_\varPsi \) as a whole or of a single point \((\varPhi |p_\varPsi (\varPhi ))\) defined by a given probability \(p_\varPsi \). In general, each run of a psychometric procedure consists of several trials, in which a stimulus is presented to a test person and a reaction is recorded in a predefined way. Figure 2.10 gives a general taxonomy to classify psychometric procedures.

Fig. 2.10
figure 10

Figure is based on classifications in [63]

Taxonomy of psychometric procedures.

Each procedure consists of a measuring method and an answering paradigm. The method determines the course of a psychometric experiment, particularly the start intensity of the stimulus and its changes during each run, the conditions to stop a trial, and the calculation rule to obtain a psychometric function from the measured data. The answering paradigm defines the way a test person is presented with the stimulus and the available answer options. The choice of a suitable procedure is not a topic of this book, but an interesting one nevertheless. Further information about the simulation of procedures and the definition of suitable quality criteria can be found in [64, 65].

Methods

The first, nowadays called “classical”, methods were developed back in the 19th century. The most familiar are the Method of Constant Stimuli, the Method of Limits and the Method of Adjustment. They have barely any practical relevance in today’s experiments, so they are not detailed here any further; please refer to [66] for a more detailed explanation. Modern methods are derived from these classical methods, but are generally adaptive in the sense that the progression rule depends on the answers of the test person in the course of the experiment [67]. They can be classified into heuristic and model-based methods.

Heuristic Based Methods:

These methods are based on predetermined rules that are applied to the answers of the test person to change the stimulus intensity in the course of the psychophysical experiment (change of the progression rule). Stopping and starting rules are normally fixed beforehand for each experiment (by the total number of trials, for example). The most widely used heuristic method is the so-called staircase method. It is based on the classic Method of Limits and tries to bracket the investigated threshold with the intensities of the test stimulus. Figure 2.11 gives two examples of a staircase method with different progression rules. It becomes clear that the name of this method originates in this kind of display of the test stimulus intensities over the trials.

The definition of progression rules is based on the number of correct answers of the test person leading to a lower test stimulus and the number of false answers leading to a higher test stimulus. The original staircase method, also called Simple Up-Down Staircase (upper part of Fig. 2.11), changes the stimulus intensity after every trial, thus converging at a threshold with a detection probability of 0.5 [68]. The need for other detection probabilities led to another form of the staircase method, the so-called Transformed Up-Down Staircase Method. For these methods, the progression rule is changed and requires, for example, more than one correct answer for a downward change of the test stimulus. Figure 2.11 gives an example of a 1up-3down progression rule, lowering the test stimulus after three correct answers and raising it after every false answer.

In [68], Levitt calculates the convergence probability for several progression rules. Table 2.3 gives the convergence probabilities for common progression rules. To interpret studies about haptic perception incorporating experiments with staircase methods, the convergence probability has to be taken into account (a simulation sketch of such a staircase is given after Table 2.3). However, newer studies cast some doubt on this interpretation of the progression rule [69], arguing that the amount of intensity change is much more relevant for the convergence of a staircase than the progression rule. For system design, one therefore has to resort to larger assessment factors in the interpretation of this kind of data. The calculation of a threshold is normally carried out as the mean of the stimulus intensities at the last reversals of the staircase direction. Typical values are, for example, 12 reversals for the calculation of the threshold and 16 reversals as a stopping criterion for the whole experiment run.

Another important heuristic method is the so-called PEST (Parameter Estimation by Sequential Testing) method [70]. The heuristic keeps a given stimulus intensity until some assessment of the reliability of the answers can be made. The method was designed to yield a high accuracy with a small number of trials. One of the main disadvantages of this method is the calculation rule, which only considers the very last stimulus. However, several modern adaptations like ZEST and QUEST try to overcome some of these disadvantages [71].

Model Based Methods:

A model of the psychometric function is the basis of these methods, which measure or estimate the parameters of the function as given in Eq. (2.1). Most methods incorporate some kind of prior knowledge of the psychometric function from experience or previous experiments (Bayesian approach) and use different kinds of estimators (maximum likelihood, probit estimation). An example is the ML-Test [60], which uses a maximum-likelihood estimator to determine the function parameters. The end of each experiment run is determined by the confidence interval of the estimated parameters: if the interval is smaller than a given value, the run is stopped.

In 1999, Kontsevich and Tyler introduced the \(\varPsi \)-Method [72], combining promising elements from several other methods. This method is not very prominent in haptics research, but is considered the most sophisticated method in psychophysics in general [66, 73]. It is able to estimate a threshold in as little as about 30 trials and the sensitivity in about 300 trials.

One of the general advantages of a model-based method is the calculation of a whole psychometric function, not only of a single threshold. Therefore, more than one psychometric parameter can be calculated from a single experiment. On the other hand, one has to have some confidence in the model used in the method for the investigated sense. As said above, data show a slight advantage for Weibull-based models for the visual sense [60], while studies of the author of this chapter yield better results with a logistic function for the assessment of force perception thresholds [74].

Fig. 2.11

Simulated runs of common psychometric methods. The upper graph shows a simple up-down staircase, theoretically converging at \(c_\theta = 50\), the middle graph shows a transformed up-down staircase with a 1up-3down progression rule and a theoretical convergence level of \(c_\theta = 57.47\). The lower graph shows a run from a \(\varPsi \)-method, also converging at \(c_\theta = 50\). Simulated answers of the subject are shown in green circles (correct answer) and red squares (incorrect answer), staircase reversals are circled. The dotted line indicates the calculated threshold

Table 2.3 Probability of convergence of adaptive staircase methods with different progression rules. Table based on [68]
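To make the progression rules described above concrete, the following Python sketch simulates a transformed 1up-3down staircase run against the example psychometric function used earlier. Subject behavior, step size, and stopping rule are illustrative assumptions, not values from a real experiment.

import numpy as np

rng = np.random.default_rng(0)

def detection_probability(phi, c_theta=50.0, c_sigma=10.0):
    # Sigmoid psychometric function of the simulated subject
    return 1.0 / (1.0 + np.exp(-(c_sigma / c_theta) * (phi - c_theta)))

def transformed_staircase(start=80.0, step=2.0, n_down=3, max_reversals=16):
    # 1up-n_down rule: n_down correct answers in a row lower the stimulus, a single
    # incorrect answer raises it. The staircase stabilizes where an up and a down
    # step are equally likely, i.e. at a detection probability of 0.5**(1/n_down)
    # (about 0.794 for n_down = 3, cf. Table 2.3).
    phi, correct_run, last_direction = start, 0, 0
    reversals = []
    while len(reversals) < max_reversals:
        answer_correct = rng.random() < detection_probability(phi)
        if answer_correct:
            correct_run += 1
            if correct_run < n_down:
                continue                      # keep the stimulus until n_down correct answers
            correct_run, direction = 0, -1    # lower the stimulus
        else:
            correct_run, direction = 0, +1    # raise the stimulus
        if last_direction and direction != last_direction:
            reversals.append(phi)             # store the intensity at each reversal
        last_direction = direction
        phi += direction * step
    return np.mean(reversals[-12:])           # threshold estimate from the last 12 reversals

print(transformed_staircase())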

Paradigms

Answering paradigms describe the way a test person gives the answer to a stimulus, such that the procedure can react according to its inherent rules. The theoretical basis for answering paradigms is given by the \(\hookrightarrow \) SDT, a statistical approach to describe arbitrary decision processes that is often applied to sensory processes. It is based on the assumption that not only stimuli, but also noise contributes to perception. In the perception continuum, this is represented by a noise distribution (mainly Gaussian). If no stimulus is present, only the noise distribution is present in neural activity and processing; if a stimulus is present, the noise distribution is added to the stimulus. Figure 2.12 shows this theoretical basis of \(\hookrightarrow \) SDT.

Fig. 2.12

Baseline model of the Signal Detection Theory (SDT). The model consists of two neural activity distributions for Noise and Signal + Noise. The noise distribution is always present and can be interpreted as sensory background noise. If an additional stimulus is presented, the whole distribution shifts upwards. The subject decides based on a decision criterion \(c_\lambda \) whether a neural activity is based on a stimulus or not. The subject shown here exhibits a badly placed, i.e. conservative, decision criterion: many signals are missed (horizontal striped area \(p_\text {M}\)), but only a few false-positive answers are recorded (vertical striped area \(p_\text {FP}\)). The detectability \(d'\), defined as the span between the midpoints of both distributions, is independent of the decision criterion of the subject and can be used as an objective measure of the difficulty of a detection experiment

Near the absolute detection threshold, both distributions overlap. In this area of the perception continuum it is indistinguishable whether a neural activity originates from a stimulus or just from inherent noise. To decide whether a stimulus is present or not, the test person constructs a decision criterion \(c_{\lambda }\). If an input signal is greater than this criterion, a stimulus is reported; smaller inputs are neglected. Unfortunately, this decision criterion varies with time and other external conditions. Therefore, one aim of \(\hookrightarrow \) SDT is to investigate the behavior of a test person regarding this criterion. The detectability \(d'\) arising from the signal detection theory can be used to calculate comparable sensitivity parameters for different test persons and to compare studies with different psychometric procedures.
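A minimal sketch of how \(d'\) and the decision criterion can be estimated from a yes/no experiment, assuming the equal-variance Gaussian model of Fig. 2.12; the hit and false-alarm rates used here are made-up example values.

from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    # Equal-variance Gaussian SDT: d' = z(hit rate) - z(false-alarm rate)
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

def criterion(hit_rate, false_alarm_rate):
    # Decision criterion c; positive values indicate a conservative observer
    z = NormalDist().inv_cdf
    return -0.5 * (z(hit_rate) + z(false_alarm_rate))

# Example: 80 % hits and 10 % false alarms yield d' of about 2.1 and a slightly
# conservative criterion of about 0.2
print(d_prime(0.8, 0.1), criterion(0.8, 0.1))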

With these concepts, one can differentiate liberal (low decision criterion) from conservative (high decision criterion) test persons. For example, studies show that the consumption of alcohol will not change the sensitivity of a person, but will influence the decision criterion to become more liberal. This leads to a better detection of smaller stimuli, but will also produce more false-positive answers. Based on this stochastic approach of decision theory, answering paradigms can be defined that minimize the influence of varying decision criteria. The most common paradigms are described in the following.

Yes/No-Paradigm:

The easiest paradigm is the simple Yes/No-Paradigm. A test person will, for example, answer “yes” or “no” to the question whether a stimulus was present or not. Obviously, a varying decision criterion will affect this answer. One has to weigh this disadvantage against the shorter time this paradigm needs for presenting a stimulus compared to other paradigms.

Forced-Choice-Paradigm:

These paradigms arise directly from \(\hookrightarrow \) SDT to find an objective measure of a subjective assessment. To achieve this, each trial includes more than one alternative with a test stimulus and the test person is compelled to give a positive answer in every trial, for example which interval contained a stimulus or which stimulus had the largest intensity. This paradigm can be combined with most of the methods mentioned above. In general, forced-choice paradigms are denoted by xAFC (x Alternative Forced Choice) or xIFC (x Interval Forced Choice). The first abbreviation is used when several alternatives are presented to a test person, while the term interval is used for a temporal sequence of the alternatives. x denotes the number of alternatives or intervals, respectively. Naturally, forced-choice paradigms introduce a guess rate of \(\frac{1}{x}\) into the psychometric function. This has to be considered in the mapping of experimental results back to a psychometric function (a worked relation is given at the end of this overview of paradigms).

An example experiment setup would include five reference stimuli and one unknown test stimulus given to the test person. The test person would have to decide which reference stimulus corresponds to the test stimulus. If the stimuli were miniature golf balls with different compliance, this experiment would be classified as a 5AFC paradigm.

Unforced-Choice-Paradigm:

In [75] Kärnbach describes an adaptive procedure that does not require a forced choice, but also allows an “I don’t know” answer. This procedure leads to more reliable results from test persons without extensive experience in such experiments. Especially in experiments incorporating a comparison of different stimuli, this answering paradigm could provide a more intuitive approach to the experiment for the test person and could therefore lead to more motivation and better results. Based on Fig. 2.10 this unforced-choice option belongs to the paradigm definition, but has to be incorporated in the method rules as well. Therefore this paradigm is only found in a limited number of studies, but it is also applied in recent studies of haptic perception [76].
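Tying the forced-choice paradigms back to the psychometric function of Eq. (2.1): assuming a negligible lapse rate, an xAFC task fixes the guess rate at \(\frac{1}{x}\), so that the observed proportion of correct answers is

$$\begin{aligned} p_\text {correct}(\varPhi ) = \frac{1}{x} + \left( 1 - \frac{1}{x}\right) \cdot f(\varPhi ,c_\theta ,c_\sigma ) \end{aligned}$$

For a 2AFC experiment, for example, a convergence level of 75% correct answers therefore corresponds to a detection probability of 0.5 on the underlying base function.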

1.2.3 Psychometric Parameters

In most cases, not the whole psychometric function but only characteristic values are needed for the design of haptic systems. The most important parameters are described in this section.

Absolute Thresholds

These parameters describe the human ability to detect a stimulus at all. They are defined as the stimulus intensity \(\varPhi \) with a detection probability of \(p_\varPsi = 0.5\) [77, Chap. 5]. However, since many psychometric procedures do not converge at this probability, most studies call their results thresholds regardless of the convergence probability.

For the design of haptic systems, absolute thresholds give margins for noise-induced and other errors of sensors and actuators: a vibration that is “detected” by a sensor because of inherent noise in the sensor signal processing, or that is displayed by an actuator, is acceptable as long as the user of the haptic system cannot feel it. Therefore a reliable assessment of these thresholds is important to define suitable requirements. On the other hand, absolute thresholds define a lower limit in communication applications: each coded information has to be at least as intense as the absolute threshold to be detectable, even if one will probably choose a considerably higher intensity level to ensure detection even in distracting environments.

Differential Thresholds

Differential thresholds describe the human ability to differentiate between two stimuli that differ in only one property. The first differential thresholds were recorded by E. H. Weber in the first half of the 19th century [78]. He investigated the differential threshold of weight perception by placing a mass (reference stimulus \(\varPhi _0\)) on a test person's hand and adding additional mass \(\varDelta \varPhi \) until the test person reported a higher weight. The additional mass needed to evoke this perception of a higher weight was called the difference limen (DL).

Further studies showed that the quotient of \(\varDelta \varPhi \) and \(\varPhi _0\) is constant over a wide range of reference stimulus intensities. This behavior is called Weber’s Law, and the Weber fraction given in Eq. (2.5) is also called the \(\hookrightarrow \) Just Noticeable Difference (JND).

$$\begin{aligned} \text {JND} := \frac{\varDelta \varPhi }{\varPhi _0+a} \end{aligned}$$
(2.5)

The \(\hookrightarrow \) JND is generally given in percent (%) or decibel (dB) with respect to the reference stimulus \(\varPhi _0\). Since further studies of Weber’s Law showed an increase in JNDs for low reference stimuli near the absolute threshold, the additional parameter a was introduced. It is generally interpreted as sensory background noise in the perception process [79, Chap. 1], which parallels the basic assumption of the \(\hookrightarrow \) SDT. The resulting change of the JND near absolute thresholds is so large that it should be considered in the design of technical systems.
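As a numerical illustration of Eq. (2.5) with assumed round values (a 10% Weber fraction and \(a = {0.05}\,\mathrm{{N}}\) are chosen for demonstration only), the just detectable increment follows from

$$\begin{aligned} \varDelta \varPhi = \text {JND} \cdot (\varPhi _0 + a) \end{aligned}$$

For a reference force of \(\varPhi _0 = {1}\,\mathrm{{N}}\) this yields a detectable increment of about 0.105 N, i.e. roughly 10% of the reference. For a reference of only 0.1 N, the increment is 0.015 N, i.e. 15% of the reference, which illustrates the rise of the effective JND near the absolute threshold.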

It is generally agreed that the JND denotes the amount of stimulus change that is detected as greater half of the time. In the literature, two different approaches to measure a JND in a psychophysical experiment can be found. It has to be noted that these approaches do not necessarily measure the 50% point of the psychometric function:

  • Direct comparison of a reference stimulus \(\varPhi _0\) with a test stimulus \(\varPhi _0 + \varDelta \varPhi \). The stimulus controlled by the psychometric procedure is necessarily \(\varDelta \varPhi \); test persons have to assess whether the test stimulus is greater than the reference stimulus. The JND is calculated according to the procedure's calculation rule; the convergence probability has to be taken into account when interpreting and using the JND.

  • According to [77, Chap. 5], the JND can also be determined by using two points of a psychometric function as given in Eq. (2.6)

    $$\begin{aligned} \text {JND} := \varPhi (p_\varPsi = 0.75) - \varPhi (p_\varPsi = 0.25) \end{aligned}$$
    (2.6)

    This definition is useful if one cannot control the stimulus intensity freely during the test and has to measure a complete psychometric function with fixed stimuli (for example real objects with defined texture, curvature or roughness) or with long adaptation times of the test person.
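A short sketch of this second approach, reusing the sigmoid psychometric function assumed in the earlier examples; the parameter values are again purely illustrative.

import numpy as np
from scipy.optimize import brentq

def p_psi(phi, c_theta=50.0, c_sigma=10.0):
    # Sigmoid psychometric function without guess and lapse rates
    return 1.0 / (1.0 + np.exp(-(c_sigma / c_theta) * (phi - c_theta)))

def stimulus_at(p, lo=0.0, hi=200.0):
    # Numerically invert the psychometric function: find phi with p_psi(phi) = p
    return brentq(lambda phi: p_psi(phi) - p, lo, hi)

# JND according to Eq. (2.6): distance between the 75 % and 25 % points
jnd = stimulus_at(0.75) - stimulus_at(0.25)
print(jnd)   # about 11 stimulus units for these example parameters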

It has to be noted that both approaches do not necessarily lead to the same numerical value of a JND. For certain classes of experiments, special terms for the differential thresholds have been coined. They are briefly described in the following:  

Point of Subjective Equality (PSE):

In experiments with a fundamental difference between test and reference stimulus in addition to the stimulus change, the differential threshold with \(p_\varPsi = 0.5\) is also called the Point of Subjective Equality (PSE). At this stimulus configuration, the test person cannot discriminate the two stimuli. An example could be the assessment of the intensity of two vibrations that are coupled into the skin normally and laterally. The fundamental difference is the coupling direction; the intensities of both stimuli are adjusted such that they are perceived as equally intense by the test person.

Successiveness Limen (SL):

If two stimuli are presented to a test person, they are only perceived as two different stimuli if there is a certain time period between them. This time period is called the Successiveness Limen (SL). For mechanical pulses, the SL is about 5 ms, while direct stimulation of nerve fibers exhibits an SL of about 50 ms [13, Chap. 4].

Two-point threshold:

The two-point threshold describes the distance between the application points of two stimuli that is needed to make these stimuli distinguishable from one another. The smallest two-point thresholds can be found at the tongue and lips (< 1 mm for static stimuli); at the fingertip, thresholds of \(1\ldots {2}\,\mathrm{{mm}}\) can be found. Other body areas exhibit two-point thresholds of several centimeters, as shown in Fig. 2.13 [5]. The spatial resolution is the reciprocal value of the two-point threshold.

Just Tolerable Difference (JTD):

The Just Tolerable Difference denotes the difference between two stimuli that is noticeable, but still tolerable for the test subject. It is also termed the Quality JND and depends more on the individual appraisal and judgment of the subjects than on the abilities of the sensory system. This measure can be used to determine system properties that are acceptable to a large number of users, as is done in various other sensory modalities like taste [80] or vision [81].

Fig. 2.13

Figure adapted from [5] © Springer Nature, all rights reserved

Two-point thresholds at various body locations.

The knowledge of differential thresholds is of major importance. JNDs give the range of signals and stimuli that cannot be distinguished by the user, i.e. a limit for the reproducibility of a system. Proper consideration of JNDs in product design will yield systems with good user ratings and minimized technical requirements, as shown for example in [82].

Description of Scaling Behavior

In the middle of the 19th century, Fechner formulated a relation between the objectively measurable stimulus \(\varPhi \) and the subjective percept \(\varPsi \) based on Weber’s Law in Eq. (2.5). He set the JND equal to a non-measurable increment of the subjective percept and integrated over several stimulus intensities defined by increments of the JND. This leads to Fechner’s Law as given in Eq. (2.7):

$$\begin{aligned} \varPsi = c \log {\varPhi } \end{aligned}$$
(2.7)

In Eq. (2.7), c is a constant depending on the investigated sensory system. However, Fechner’s Law is based on two assumptions that were rendered invalid in further studies: the use of a not universally valid variant of Weber’s Law, and the assumption that an increment as large as the current JND will evoke a constant increment in perception [79, Chap. 1]. In the middle of the 20th century, S. S. Stevens proposed the Power Law, a new formulation of the relation between objective stimuli and subjective percepts, based on experimental data that could not be explained by Fechner’s Law:

$$\begin{aligned} \varPsi = c \varPhi ^a \end{aligned}$$
(2.8)

In Eq. (2.8), c is a scaling parameter as well, which is often neglected in further analysis. The parameter a denotes the coupling between subjective perception and objectively measurable stimuli and depends on the individual experiment. By taking the logarithm of Eq. (2.8), it can be calculated as the slope of the resulting straight line \(\log \varPsi =\log c + a\log \varPhi \) [79, Chap. 13]. The Power Law can be summarized shortly as “a constant percentage change in the stimulus produces a constant percentage change in the sensed effect” [83, p. 16]. To analyze these changes, particular psychometric procedures can be used, as found for example in [83]. Typical values in haptics include \(a = 3.5\) for the intensity of electrical stimulation at the fingertip (a 20% increase will double the perceived intensity) and \(a = 0.95\) for a vibration with a frequency of 60 Hz (i.e. a slightly degressive relation).
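The origin of the parenthetical statements above can be checked directly from Eq. (2.8), since the exponent relates stimulus ratios to ratios of percepts:

$$\begin{aligned} \frac{\varPsi _2}{\varPsi _1} = \left( \frac{\varPhi _2}{\varPhi _1}\right) ^a, \qquad 1.2^{3.5} \approx 1.9, \qquad 1.2^{0.95} \approx 1.19 \end{aligned}$$

For \(a = 3.5\), a 20% increase of the stimulus thus roughly doubles the perceived intensity, while for \(a = 0.95\) the percept grows slightly more slowly than the stimulus.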

1.2.4 Factors Influencing Haptic Perception

From different studies of haptic perception, external influencing factors are known that affect absolute and differential thresholds. They originate in the properties of the different blocks given in Fig. 2.1. For the design of haptic systems they have to be considered as disturbance variables, or they can be used to purposefully manipulate the usage conditions. An example may be the design of a grip or the control of a minimum or maximum grip force at the end effector of a haptic system. The following list gives the technically relevant influencing factors.

Temperature:

Temperature will influence the mechanical properties of the skin [27]. Furthermore, the perception channels exhibit a temperature dependence as given in Table 2.2. The absolute perception threshold is affected by temperature; the lowest thresholds are observed at about body temperature [84, 85]. This effect increases for higher frequencies [26, 86], denoting a higher temperature dependence of mechanoreceptors with greater receptive fields.

Age:

With increasing age, the ability to perceive high-frequency vibrations decreases. This is observed in different studies for finger and palm [87,88,89,90,91,92]. The change of form and spatial distribution of mechanoreceptors, especially of the Pacinian corpuscles, is deemed the cause of this effect.

Contact Area:

Because of the different receptive fields of the mechanoreceptors, the contact area is an important influencing parameter on haptic perception. With small contact areas, the thresholds for high-frequency vibrations increase (higher intensities are needed for detection), while the lowest thresholds can only be measured with large contact areas of about the size of a fingertip or greater.

Furthermore, it is the absolute number and not the density of mechanoreceptors that is approximately constant among test persons. Therefore the size of the hand is relevant for the perception capabilities; smaller hands are more sensitive [17].

Since there is a (slight) correlation between sex and hand size, this is the reason for some contradictory studies about the dependency of haptic perception on the sex of the test person [89, 91, 93,94,95].

Other Factors:

Several other factors influencing haptic perception thresholds can be identified, such as the menstrual cycle [85, 88, 96], diseases like bulimia or anorexia nervosa [97], skin moisture [98], and the consumption of certain drinks and tobacco. In the design of haptic systems these factors cannot be incorporated in a meaningful way, since they can neither be controlled nor influenced in system design or usage.

1.2.5 What Do We Feel?

To investigate perception, an exact physical representation of the stimulus must be known. In auditory perception this is the sound pressure, which affects the eardrum and is conducted via the middle ear to the nerve fibers in the cochlea and the organ of Corti. Visual perception is based on the detection of photons of a particular wavelength by the cones and rods in the retina.

In haptics, one will find different physical representations of stimuli, namely forces F and kinematic measures like acceleration a, velocity v or deflection d. The usage of a certain representation mainly depends on the purpose of the study or the system: forces are sometimes easier to describe and measure because of their characteristics as a flux coordinate defined at a single point. Kinematic measures exhibit the characteristics of a differential coordinate, i.e. they can only be measured in relation to a previously defined reference. Many studies (especially of dynamic stimuli) are based on kinematic measures, since their definitions do not depend on the mechanical properties of the test person. Greenspan showed psychophysical measurements with less variation when stimuli were defined by kinematic measures compared to forces [99].

However, there is evidence that humans do not simply feel forces or kinematic measures. Perception is most likely based on the distribution of mechanical energy in the skin, where the mechanoreceptors are located. This distribution cannot be described in detail with reasonable effort (although there are some attempts at FE-modeling of the human skin [100,101,102,103]); furthermore, it cannot be produced as a controlled stimulus for psychophysical experiments.

A common approach is to consider the human skin as a mechanical system whose properties are not changed by haptic interaction. This is supported by studies conducted by Hogan, who showed that a human limb can be modeled as a passive mechanical impedance for frequencies higher than the maximum frequency of human motion capabilities. In that case, forces and kinematic measures coupled into the skin are related via the mechanical impedance \(\underline{z}_{\text {user}}\) according to Eq. (2.9)

$$\begin{aligned} \frac{\underline{F}_{\text {}}}{\underline{v}_{\text {}}} = \underline{z}_{\text {user}} \end{aligned}$$
(2.9)

with \(v = \frac{{\text {d}}\!{d}}{{\text {d}}\!{t}} = \int {a {\text {d}}\!{t}}\). Applied to perception, this means that each force perception threshold could be calculated from other thresholds defined by kinematic measures via the mechanical impedance of the test person. This relation is used in a couple of studies [104, 105] to calculate force perception thresholds from deflection-based measurements. The author's own studies used force-based measurements to experimentally confirm the relation given in Eq. (2.9) [106].
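A minimal sketch of this conversion for a sinusoidal stimulus, assuming that the magnitude of the user impedance at the test frequency is known; the numerical values (160 Hz, \({0.25}\; \upmu \mathrm{{m}}\) deflection amplitude, an impedance magnitude of 10 Ns/m) are placeholders, not measured data.

import math

def force_threshold(deflection_threshold_m, frequency_hz, impedance_magnitude_ns_per_m):
    # Eq. (2.9) for a sinusoid: F = |z_user| * v with velocity amplitude v = 2*pi*f*d
    velocity_amplitude = 2.0 * math.pi * frequency_hz * deflection_threshold_m
    return impedance_magnitude_ns_per_m * velocity_amplitude

# Placeholder example: 0.25 um deflection threshold at 160 Hz and |z_user| = 10 N s/m
print(force_threshold(0.25e-6, 160.0, 10.0))   # force amplitude in newtons (about 2.5 mN)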

One can therefore conclude that perception is based on the complex distribution of mechanical energy in the skin. For the design of haptic systems, a simplified consideration of the user as a source of mechanical energy with their own mechanical parameters, as given by the mechanical impedance \(\underline{z}_{\text {user}}\), is applicable. Furthermore, this model is also valid for the description of perception, linking perception parameters by the mechanical impedance as well. Some important psychometric parameters are given in the next section of this chapter; a detailed view on the modeling of the user is given in Chap. 3. Meanwhile, modern and fast imaging technologies have revealed that dynamic mechanical stimulation reaches much further than the area of interaction. Shao et al. showed that oscillations induced at the fingertip result in responses reaching almost as far as the wrist [107, 108]. The contribution and importance of such afferent vibrations to the overall perception is the subject of ongoing research.

Hayward asked in [109]: Is there a “plenhaptic” function? The question is still unanswered. But it is a tempting assumption that, with the right understanding of perceptual dimensions, we could translate the physical domain into a perceptual domain in all the temporal, macro- and micro-dynamics we know.

1.3 Characteristic Values of Haptic Perception

There is a vast number of studies investigating haptic perception. For the design process of haptic systems the focus does not lie on single biological receptors but rather on the human's combined perception resulting from the sum of all tactile and kinaesthetic mechanoreceptors. As outlined in the following chapters, a dedicated investigation of perception thresholds is probably advisable for the selected grip configuration of a haptic system. This section gives results of some of the most important studies, but cannot claim to be complete. It is ordered according to the type of psychometric parameter. To interpret the results of the different studies correctly, Fig. 2.14 explains the anatomical terms for skin locations and skeleton parts.

Fig. 2.14

Anatomical terms for skin areas and skeleton parts of the human hand

1.3.1 Absolute Thresholds

Some of the most advanced studies of haptic perception were carried out by the group of Gescheider et al. Probably the most popular result is the absolute perception threshold of vibrotactile stimuli defined by deflections of the skin at the thenar eminence, as given in Fig. 2.15 [110]. Since the channel model arises from the work of this group, a lot of their studies deal with these channels and their properties. In Fig. 2.15 some properties of this model can be seen: the thresholds are influenced by the receptive fields, the highly sensitive RA-II receptors are only excited with large contact areas, and the most sensitive channel is responsible for the detection of a stimulus. Other work, not shown here, includes the investigation of the perception properties of the fingertip [19] and intensive studies of masking and summation properties [88].

Fig. 2.15
figure 15

Figure is adapted from [110]

Absolute threshold of tactile perception channels at the thenar eminence with respect to contact size. Measurements were conducted with closed-loop velocity control of the stimuli. To address individual channels, combinations of frequencies, intensities, contact areas and masking effects are employed. The psychometric procedure used converges at a detection probability of \(p = 0.75\).

Other relevant studies were conducted by Israr et al., investigating vibrotactile deflection thresholds of hands holding a stylus [104] and a sphere [105], two quite common grip configurations of haptic interfaces. They investigated seven frequencies in the range of 10–500 Hz with an adaptive staircase (1up-3down progression rule) and a 3IFC paradigm and found absolute thresholds of 0.2–0.3 \(\upmu \)m at 160 Hz. The studies include the calculation of the mechanical impedance and force perception thresholds as well. Brisben et al. investigated the perception thresholds of vibrotactile deflections tangential to the skin, a condition becoming more and more important when dealing with tactile feedback on touch screen displays. Whole-hand grasps and single digits were investigated with an adaptive staircase (different progression rules) and 2- and 3IFC paradigms. They additionally investigated perception thresholds for 40 and 300 Hz stimuli at different locations on the hand and with different contact areas. Newer studies by Gleeson et al. investigate the influence of several stimulus parameters like velocity, acceleration and total deflection [111] on the perception of shear stimuli. They found the accuracy of direction perception to depend on both speed and total displacement of the stimulus, with accuracy rates greater than 95% occurring at a tangential displacement of 0.2 mm and a displacement speed of \({1} \,\mathrm{{mm s}}^{-1}\). The study further includes an analysis of priming and learning effects and the application to skin-stretch-based communication devices.
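
For readers unfamiliar with such procedures, the following Python sketch simulates a 1up-3down staircase with a 3IFC paradigm against a virtual observer with an assumed logistic psychometric function; according to Levitt's transformed up-down rules, a 1up-3down staircase converges near a detection probability of 0.794, while a 1up-2down rule converges at 0.707. All parameter values are illustrative.

import math
import random

def psychometric(level_db, threshold_db=0.0, slope=1.0):
    # Assumed logistic detection probability of a virtual observer.
    return 1.0 / (1.0 + math.exp(-slope * (level_db - threshold_db)))

def three_ifc_trial(level_db):
    # 3IFC: a detected stimulus gives a correct answer; otherwise the
    # observer guesses one of the three intervals (chance rate 1/3).
    detected = random.random() < psychometric(level_db)
    return detected or random.random() < 1.0 / 3.0

def staircase_1up_3down(start_db=10.0, step_db=2.0, reversals_wanted=12):
    level, correct_in_row, last_direction = start_db, 0, 0
    reversal_levels = []
    while len(reversal_levels) < reversals_wanted:
        if three_ifc_trial(level):
            correct_in_row += 1
            if correct_in_row == 3:          # three correct answers -> level down
                correct_in_row = 0
                if last_direction == +1:
                    reversal_levels.append(level)
                last_direction = -1
                level -= step_db
        else:                                # one wrong answer -> level up
            correct_in_row = 0
            if last_direction == -1:
                reversal_levels.append(level)
            last_direction = +1
            level += step_db
    # Threshold estimate: mean level of the last eight reversals.
    return sum(reversal_levels[-8:]) / 8.0

print(staircase_1up_3down())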

One of the most important effects on haptic perception originates in the size of the contact area. All of the above-mentioned studies show lower perception thresholds for frequencies around 200 Hz with larger contact areas. However, this effect seems to be limited by the minimum area required to arouse mechanoreceptors in the PC channel, which is probably about \({3}\,\mathrm{{cm}}^2\), corresponding to a contactor diameter of about 20 mm. When more than one finger is involved in the interaction, [24] did not find a summation effect of thresholds.

Regarding the perception of forces, the corresponding absolute thresholds can be calculated according to Eq. (2.9). There are only a few studies dedicated to the absolute perception of forces. Thornbury and Mistretta investigated the sensitivity to tactile forces applied by a modified version of von Frey filaments. They found a significant influence of age on the absolute threshold that is most likely related to the decrease of mechanoreceptor density. Young subjects (mean age 31 years) exhibit absolute thresholds of 140 \(\upmu \)N, while older subjects have higher thresholds of about 660 \(\upmu \)N, measured with a staircase method, constant stimulus intensities and a 2IFC paradigm. Since the stimuli were applied manually by the experimenter, the application dynamics cannot be determined from the study but probably contribute to the very low reported thresholds. Abbink and van der Helm investigated absolute force perception thresholds at the foot with different footwear (socks, sneaker, bowling shoe) for low-frequency stimuli (< 1 Hz) and a static preload of 25 N. They find the lowest perception threshold of 8 N in the sock condition, where the perception threshold is defined with a detection probability greater than 0.98. Also motivated by the small number of studies, the author of this chapter measured perception thresholds for vibrotactile forces up to 1000 Hz as shown in Fig. 2.16.

Fig. 2.16
figure 16

Data taken from [74, 106] © Springer Nature, all rights reserved

Absolute force perception threshold based on experiments with 27 test persons, measured with a quasistatic preload of 1 N. Thresholds were obtained with an adaptive staircase procedure converging at a detection probability of 0.707 with a 3IFC paradigm. Data are given as boxplots, since not all data for each frequency are normally distributed. The boxplot denotes the median (horizontal line), the interquartile range (IQR, closed box defined by the 0.25- and the 0.75-quantile), data range (dotted line) and outliers (data points more than 1.5 IQRs from the 0.25- or the 0.75-quantile). The notch denotes the confidence interval of the median (\(\alpha = 0.05\)).

In summary, one can find a large number of studies determining absolute thresholds for the perception of stimuli defined by deflections. Fewer studies have been conducted regarding the absolute perception of forces. Table 2.4 summarizes some values of absolute perception thresholds for the human hand.

Table 2.4 Selected absolute thresholds of the human hand

1.3.2 Differential Thresholds

For haptics, several studies furnish evidence about the applicability of Weber's Law as stated in Eq. (2.5). Gescheider et al. [121] as well as Verrillo et al. [122] measure \(\hookrightarrow \) JNDs of 1 ... 3 dB for deflection-defined stimuli with reference stimuli of 5 ... 40 dB above absolute threshold for frequency ranges exceeding 250 Hz. The measurements of Gescheider et al. are based on broadband and single-frequency stimulus excitation. They show an independence of channels for the JND, although no fully constant JND was determined for high reference levels. This is addressed as “a near miss to Weber's Law” by the authors [121], but this observation should not have a significant impact on the design of haptic systems.

Regarding the JND of forces, several studies were conducted with an active exertion of forces by the test person. Jones measures JNDs of about 7% from matching force experiments of the elbow flexor muscles [123], a value that is confirmed by Pang et al. [124]. However, one cannot determine the measurement dynamics from the experimental setup; based on Fig. 1.9 a maximum bandwidth of 10 ... 15 Hz seems likely. From other studies evaluating the perception of direction and perception-inspired compression algorithms (Sect. 2.4.4), estimations of the JND for forces can be made. This is summarized in Table 2.5. All studies show JNDs over 10% for reference stimuli well above the absolute threshold and increasing JNDs for reference stimuli near the absolute threshold.
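
Since some studies report JNDs as Weber fractions and others in decibels, the following small helper (a sketch, assuming amplitude-like quantities compared as \(20 \log _{10}\) ratios) converts between the two representations.

import math

def weber_fraction_to_db(weber_fraction):
    # A Weber fraction dI/I corresponds to 20*log10((I + dI)/I) dB
    # when amplitude-like quantities are compared.
    return 20.0 * math.log10(1.0 + weber_fraction)

def db_to_weber_fraction(jnd_db):
    return 10.0 ** (jnd_db / 20.0) - 1.0

print(weber_fraction_to_db(0.07))  # the 7% force JND corresponds to ~0.6 dB
print(db_to_weber_fraction(1.0))   # 1 dB corresponds to a Weber fraction of ~0.12
print(db_to_weber_fraction(3.0))   # 3 dB corresponds to ~0.41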

Table 2.5 Relevant parameters and results of studies of dynamic force JNDs. Table based on [74]

The author's own studies evaluated the JND for dynamic forces in the range from 5 ... 1000 Hz. As reference stimuli, the individual perception threshold and fixed values of 0.25 N and 0.5 N were used. The results are given in Fig. 2.17. They show no channel dependence (apart from a significantly higher value for the JND at 1000 Hz) and confirm the increasing JND for reference stimuli near the absolute threshold. However, with about 4 ... 8 dB for frequencies less than 1000 Hz, the JND in the 0.25 and 0.5 N conditions is higher than the previously reported values.

Fig. 2.17
figure 17

Data taken from [74, 132]. © Springer Nature, all rights reserved

Just noticeable differences of dynamic forces. JNDs were calculated with an adaptive staircase procedure converging at a detection probability of 0.707 and a 3IFC paradigm from studies conducted with 29 test persons (absolute threshold reference) and 36 test persons (0.25 and 0.5 N reference conditions), respectively. The test setup is described in [131] © Elsevier, all rights reserved; a static pre-load of 1 N was used.

Jones and Hunter investigated the perception of stiffness and viscosity and found JNDs of 23% for stiffness [133] and 34% for viscosity [134] with a matching procedure using both forearms and stimuli generated by linear motors. The JND for stiffness is similar to that of other studies as reported in [135, 136]. Further differential thresholds for the perception of haptic measures by the human hand are given in Table 2.6.

Table 2.6 Selected differential thresholds of the human hand

1.3.3 Object Properties

The properties of arbitrary objects are closely related to the interaction primitives. Typical exploration techniques to detect object properties are dealt with in the following section. Besides the basic perception of form, size, texture, hardness and weight of an object, there are a couple of other properties relevant to the design of haptic systems. Bergmann Tiest reviews a large number of studies regarding the material properties roughness, compliance, temperature, and slipperiness. The results are relevant for the design of devices to display such properties; the representation of compliance is especially relevant for the interaction with virtual realities. Key points of the analysis are outlined in the following based on [143], while primary sources and some other references are cited as well. Klatzky et al. also review the perception of object properties and algorithms to render these properties in engineering applications [144]. The work of Samur, summarizing several studies about the perception of object properties, could be of further interest [145].

Roughness:

Roughness is one of the most studied object properties in haptic perception. The perception of roughness is based on an uneven pressure distribution on the skin surface for static touch conditions and on the vibrations arising when stroking a surface or object dynamically. It was shown that finer textures with particle sizes smaller than 100 \(\upmu \mathrm{{m}}\) can only be detected in dynamic touch conditions, while coarser textures can be detected in static conditions, too. Active and passive touch conditions have no effect on the perceived roughness. This is called the duplex theory of roughness perception [146]. However, not only the sensitive bandwidth and the touch condition have an influence on the human ability to perceive roughness. Other studies found influences of the contact force, other stimuli in the tested set and the friction between surface and skin. Regarding differential thresholds, Knowles et al. found JNDs of 10 ... 20% for friction in rotary knobs [147], and Provancher et al. recorded JNDs of 19 ... 28% for sliding virtual blocks against each other [148].

Scaling experiments showed that roughness can be identified as the opposite of smoothness. In similar experiments, no effect of visual cues was found and a power-function exponent (Eq. (2.8)) of 1.5 was measured. In a nutshell, the perception of roughness appears to be a complex interplay not only of material properties, but also of interaction conditions like friction and contact force. This makes the modeling of roughness challenging; on the other hand, there are a vast number of possibilities to display roughness properties in technical systems [149].
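
Assuming Eq. (2.8) denotes Stevens' power law \(\psi = k \cdot \phi ^{a}\), the reported exponent of 1.5 implies that perceived roughness grows faster than the physical stimulus, as the following two-line sketch illustrates.

def perceived_magnitude(stimulus_intensity, exponent=1.5, k=1.0):
    # Stevens' power law: psi = k * phi**exponent.
    return k * stimulus_intensity ** exponent

# Doubling the physical roughness almost triples the perceived roughness (2**1.5 ~ 2.83):
print(perceived_magnitude(2.0) / perceived_magnitude(1.0))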

Compliance:

This property describes the mechanical reaction of a material to an external force. It can be described by the Young's modulus or—technically more relevant—the stiffness of an object, which combines material and geometric properties of an object as shown further on in Eq. (2.10). When comparing physical stiffness with the perceived compliance, a power-function exponent of 0.8 was calculated and softness and hardness were identified as opposites. For the perception of softness, cutaneous and kinaesthetic cues are used, while cutaneous information is both necessary and sufficient. Studies by Bergmann Tiest and Kappers determined that soft materials were mostly judged by the stiffness information, i.e. the relationship of material deformation to exerted force, while harder stimuli are judged by the surface deformation [150].

Several other studies show that the perception of the hardness, i.e. the opposite of compliance, of an object is better modeled by the relation of the temporal change of forces to the penetration velocity than by the normally employed relation of force to velocity [151]. This has to be considered in the rendering of such objects and is therefore dealt with in Chap. 12. To render a haptic contact perceived as undeformable, necessary stiffnesses from 2.45 \(\mathrm{{Nm}}^{-1}\) [138] to 0.2 \(\mathrm{{Nm}}^{-1}\) [152] are reported.

Slipperiness:

Slipperiness has not been researched very deeply so far. It is physically strongly related to friction and roughness. The detection of slipperiness is important for the adjustment of grip forces when interacting with objects. While an accurate perception of slipperiness requires some relative movement, micro-slip movements of a grasped object sensed by cutaneous receptors are considered responsible for the adjustment of grip forces [153]. Studies show grip forces just 10% higher than the minimum force needed to prevent slip. The adjustment occurs with a reaction time of 80 ... 100 ms, which is faster than a deliberate adjustment [154].

Viscosity:

While not necessarily an object property, the ratio between shear stress and shear rate is relevant for the virtual representation of fluids and visco-elastic materials. Based on real viscous fluids stirred with the finger and a wooden spatula, Weber fractions of about 0.3 were determined for high viscosities, with increasing values for low viscosities [155]. Regarding scaling parameters, power-function exponents of 0.42 are reported for stirring silicone fluids [83, Chap. 1].

Curvature:

While curvature itself is not necessarily relevant for the design of haptic systems, the detection capabilities of humans are quite astonishing. In [156] subjects were able to report a curvature with a base-to-peak height of just 90 \(\upmu \mathrm{{m}}\) on a strip with a length of 20 mm. However, the researchers suggest a measure of base-to-peak height in relation to half the strip length to obtain a robust measure of curvature perception that can be interpreted as the perceivable gradient. This measure leads to a unit-less parameter with a value of 0.09. Differential thresholds are reported to be about 10% for convex curvatures with radii starting from about 3 mm [157]. In the same study, convex curvatures with a radius of 204 mm could be discriminated from flat surfaces; for concave curvatures a threshold of 185 mm was assessed with a detection probability of 0.75.

Temperature:

Although not only an object property, basic properties of temperature perception are summarized here, partly based on [13]. Humans can detect intensity differences in warming pulses as low as 0.1% (base temperature of \({34}\,^{\circ }\mathrm{{C}}\), warming pulse with base intensity of \({6}\,^{\circ }\mathrm{{C}}\)) [158]. Changes of \({0.16}\,^{\circ }\mathrm{{C}}\) for warmth and \({0.12}\,^{\circ }\mathrm{{C}}\) for cold from a base temperature of \({33}\,^{\circ }\mathrm{{C}}\) can be detected at the fingertip, and the thresholds are even lower for the thenar eminence. When skin temperature changes slowly, with less than \({0.1}\,^{\circ }\mathrm{{Cs}}^{-1}\), changes of up to \({6}\,^{\circ }\mathrm{{C}}\) in a range of 30–\({36}\,^{\circ }\mathrm{{C}}\) cannot be detected. More rapid changes make even small differences perceivable. The technical use of temperature perception is limited by a temperature of \({44}\,^{\circ }\mathrm{{C}}\), above which damage is done to the human skin [159].

Perceptually more relevant is the thermal energy transfer from the skin into the object. Humans are able to discriminate a change in heat transfer of about 35–45%. Because of different thermal conductivities and specific heat capacities, different materials can be identified by static touch alone. Based on these heat transfer mechanisms, the modeling, rendering and display of thermal information is discussed in a number of studies cited in [143]. For technical applications, Jones and Ho discuss known models and the implications for the design of thermal displays [160].

Besides these general properties, there is a vast number of more complex object properties that arise largely in the interpretation of the user. It is difficult to find clear technological terms for these interpretations. In the literature one can find one approach to describe them: users are asked to rate objects on different scales called semantic differentials. Based on these ratings, a multi-dimensional scaling procedure identifies similar scales [161]. Regarding surface properties, Hollins showed three general dimensions perceived by a user: rough \(\leftrightarrow \) smooth, hard \(\leftrightarrow \) soft and a third, not exactly definable dimension (probably elastic compressibility) [162]. This approach is also successfully used in the evaluation of passive haptic interfaces [163]. The accurate display of surface properties is still a relevant topic in haptic system design. Readers interested in this topic are pointed to the work of Wiertlewski [164] and the results of the Haptex project [165].

1.3.4 Scaling Parameters

Another important psychophysical measure is the interpretation of the intensity of different stimuli by the user, normally termed scaling. Especially for tactile applications, the perceived intensity of normally and tangentially applied stimuli is of importance. One of the first comparisons of the perception of tangential and normal stimuli was carried out by Biggs and Srinivasan. They found a \(1.7\,\ldots \,3.3\) times higher sensitivity for tangential displacements compared to normal stimulation at both the forearm and the fingerpad. They conclude that tangential displacement is the better choice for actuators limited in peak displacement, while normal displacement should be chosen for actuators limited in peak force. One has to note that this is not caused by a higher sensitivity, but by differences in the mechanical impedance for normal and tangential stimuli [166].

A classical psychophysical evaluation of scaling behavior was reported by Hugony in the last century. Figure 2.18 shows the result as curves of equal perceived intensity, denoting stimulus amplitudes for different frequencies that are perceived as equally intense by the user. Such curves can be applied to generate targeted intensity changes of complex stimuli: a slight amplitude increase for low-frequency components will evoke the same change in perceived intensity as a much larger amplitude change of mid- and high-frequency components. This behavior can be exploited to optimize the energy consumption of the actuators in a haptic system.

Fig. 2.18
figure 18

Figure is adapted from [118]

Curves of equal perceived intensity at the fingertip. The lowest curve denotes the measured absolute threshold, while the upper curve is perceived as a nuisance.

The results further imply perception dynamics as high as 50 dB (defined as the difference between the absolute threshold and the nuisance level), which are confirmed by newer studies like [122] stating a dynamic range of 55 dB. Other results from the study imply an amplitude JND of \(10\,\ldots \,25\%\) and a JND of \(8\,\ldots \,10\%\) for frequency. This agrees well with the results reported above.

1.3.5 Some Words About the Quality of Studies About Haptic Perception

Studies of haptic perception are conducted by scientists with various backgrounds. Depending on the formal training and customs in different disciplines, the author has experienced a large variation in the quality of haptic perception studies. Based on his own training in measurement and instrumentation and his own studies dealing with haptic perception, the following hints are given on how to assess the quality of a perception study for further use in the design of haptic systems.

Measurement Goal and Hypothesis:

A hypothesis should be stated for each experiment. Hypotheses formulated in terms of well-established psychophysical properties like the ones described above (Sect. 2.1) are preferable for the later comparability of the study results. Further, external influencing variables should be considered in the hypothesis formulation. In general, one can differentiate dependent, independent, controllable and confounding variables as shown in Table 2.7 for investigations of haptic perception.

Independent variables are addressed in the formulation of the hypothesis and are varied during the experiment. Depending on the hypothesis, known influencing variables can be considered as independent or controllable variables. Controllable variables have a known effect on the result of the experiment and should therefore be measured or closely watched. Possible means are keeping the test setup at a constant temperature, a pre-selection of test subjects based on age, body height and weight, etc. Confounding variables contribute to the measurement error and cannot be completely controlled.

Measurement Setup and Errors:

The measurement setup of a haptic perception study should be well suited to the investigated perception parameter or intended result. This means, for example, that all parts of the measurement setup should exhibit adequate frequency response, rated ranges and sampling rates for the values expected in the experiment.

The design and construction of the setup should be careful enough to prevent unwanted effects and errors such as electromagnetic disturbance by other equipment in the lab. Setups should preferably be fully automated to prevent errors induced by the experimenter.

The setup should be documented including all procedures and measurements of systematic and random errors. Based on a model of the measurement setup and its components, an analysis of systematic error propagation should be conducted, as well as a documented calibration of the setup and its components with known input signals and a null signal. Long-term stability, reproducibility, external influences and random errors should be analyzed and documented. The application of standardized methods like the \(\hookrightarrow \) Guide to the Expression of Uncertainty in Measurement (GUM) [167] is preferable. If possible, systematic errors should be corrected.
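
A minimal sketch of such a first-order uncertainty propagation in the spirit of the GUM is given below; the force measurement example and all numerical values are hypothetical.

import math

def combined_standard_uncertainty(sensitivities, standard_uncertainties):
    # First-order GUM propagation: u_c = sqrt(sum_i (c_i * u_i)**2),
    # with sensitivity coefficients c_i = df/dx_i.
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, standard_uncertainties)))

# Hypothetical example: a force F = k * x derived from a stiffness k = 500 N/m
# (u_k = 5 N/m) and a deflection x = 1 mm (u_x = 0.01 mm).
# Sensitivities: dF/dk = x, dF/dx = k.
print(combined_standard_uncertainty([1e-3, 500.0], [5.0, 1e-5]))  # u_F ~ 7e-3 N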

Measurement Procedure:

There should be a considerable number of test persons in a study; a dedicated statistical analysis with fewer than 15 subjects seems questionable and should at least be justified in the study, explicitly addressing the type II error of the experiment design [168]. Larger numbers of 30 and more subjects are probably advisable.
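
As an orientation, the required sample size can be estimated from the expected effect size, the significance level and the intended statistical power. The following sketch uses the common normal approximation for a two-group comparison; an exact t-test calculation yields slightly larger numbers.

import math
from scipy.stats import norm

def sample_size_per_group(effect_size_d, alpha=0.05, power=0.8):
    # Normal approximation for a two-sample comparison with the effect size
    # given as Cohen's d.
    z_alpha = norm.ppf(1.0 - alpha / 2.0)
    z_beta = norm.ppf(power)
    return math.ceil(2.0 * ((z_alpha + z_beta) / effect_size_d) ** 2)

# A medium effect (d = 0.5) already calls for roughly 63 subjects per group
# with this approximation (exact t-test calculations give about 64):
print(sample_size_per_group(0.5))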

Regarding psychophysical procedures, a previously reported and preferably adaptive procedure should be used. Newer studies should only use non-adaptive procedures in the case of non-changeable stimuli (like gratings on real objects). Pre-tests and their impact on the design of the final study should be discussed in the documentation. Interactions with other sensory modalities like vision and audition should be kept in mind and, if necessary, controlled, for example by ear plugs and masking noise.

Analysis:

Data sets not included in the analysis should be addressed and the criteria for this decision must be reported. All results should be analyzed statistically and the location parameters of the results should be given (i.e. mean and standard error for normally distributed results, and median and IQR for results that are not normally distributed). If external parameters are included in the study, an analysis of variance (ANOVA) as well as post-hoc tests for the significance of treatment group averages should be conducted and reported. If other analysis tools like, for example, a confusion matrix are used, effort should be put into a statistical analysis of the significance of the result. Errors of the measurement setup should be addressed in the analysis.
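
A possible analysis skeleton following these recommendations is sketched below with SciPy; the group data, the random seed and the chosen significance level are placeholders.

import numpy as np
from scipy import stats

def describe(sample, alpha=0.05):
    # Choose location and spread parameters based on a Shapiro-Wilk normality check.
    _, p = stats.shapiro(sample)
    if p > alpha:   # no evidence against normality
        return {"mean": np.mean(sample), "standard_error": stats.sem(sample)}
    return {"median": np.median(sample), "iqr": stats.iqr(sample)}

# Hypothetical treatment groups and a one-way ANOVA
# (post-hoc tests, e.g. Tukey HSD, would follow):
rng = np.random.default_rng(0)
group_a, group_b, group_c = rng.normal([1.0, 1.2, 1.5], 0.3, (20, 3)).T
f_value, p_value = stats.f_oneway(group_a, group_b, group_c)
print(describe(group_a), f_value, p_value)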

If possible, results should be compared to other studies with similar setups and intentions. When large differences occur, a detailed discussion of these differences and suggestions for further studies are advisable. To enable further studies based on the experiment results, test results for all effects (not only the significant ones) should be reported, as they can be used to determine effect sizes (useful for sample size calculations, see [168]) and to conduct meta-studies [169].

If all of the above hints were strictly followed, most conference proceedings would not report results, but only measurement setups and their characterization. However, keeping the criteria for good measurement setups in mind will improve the quality of results and broaden the usage possibilities of the study results.

Table 2.7 Possible variables in haptic perception experiments

1.4 Further Aspects of Haptic Perception

Besides the classic psychophysical questions (detection, discrimination, identification, scaling), there are a couple of other aspects relevant for the design of haptic systems. Some of them are discussed briefly in the following.

1.4.1 Effects of Multiple Stimulation

When more than one stimulus is applied in close temporal or spatial proximity to the first stimulus, several effects of multiple stimulation are known. The following list is based on Verrillo [122]:  

Masking:

This effect describes the decline of the detection ability for a stimulus when an additional, disturbing stimulus, the so-called masker, is present at the same location in temporal proximity. Masking can occur when masker stimuli are present before, during and after the actual stimulus is presented. The masking properties depend on the frequency and amplitude of the masker as well as on the age of the test person and the receptor channel involved. When the masker is presented right before the test stimulus (< 1 s), the amount of masking depends on the time offset between masker and stimulus [170, 171], which is specific for each receptor channel. If a dedicated masker is used, specific receptor channels can be addressed. This is an important procedure to investigate properties of individual channels [172]. Masking finds application in the perception-based compression of data streams; for acoustics this is one of the main elements of the MP3 format [173].

Enhancement:

This effect occurs when a conditioning stimulus causes a stimulus in temporal succession to appear to be of greater intensity.

Summation:

When two or more stimuli are presented closely in time, the combination of their sensation magnitudes is described as summation.

Suppression:

This effect is basically a masking effect occurring when both stimuli are presented at different locations.

For haptics, especially masking effects were investigated, mainly by the group of Gescheider et al. [104, 171, 172, 174,175,176,177]. Studies of the other effects are not known to the author. At the moment, these multiple stimulation effects have to be considered as side effects in haptic interaction. Except for the analysis of receptor channels, no direct technical use of them is known to the author.

1.4.2 Linearity of Haptic Perception

Recent studies imply that the channels of haptic perception do not only have independent thresholds [178], but resemble a linear system. Cholewiak et al. investigated spatially displayed gratings and found that each spatial frequency harmonic has to exceed the perception threshold at its frequency to be perceived by the user [179]. These results allow error margins and detection thresholds to be considered independently for each frequency in the design of haptic systems [180].
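
Under this linearity assumption, a stimulus can be checked for perceivability component by component, as in the following sketch; the threshold curve used here is a crude illustrative stand-in, not data from [179] or Fig. 2.15.

import numpy as np

def perceivable_components(signal, sample_rate, threshold_curve):
    # Compare each spectral component of the stimulus with a
    # frequency-dependent detection threshold (same amplitude unit).
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    amplitudes = 2.0 * np.abs(spectrum) / len(signal)  # single-sided amplitude spectrum
    return [(f, a) for f, a in zip(freqs, amplitudes) if f > 0 and a > threshold_curve(f)]

# Crude illustrative threshold: 10 um below 80 Hz, 0.5 um above.
threshold = lambda f: 10e-6 if f < 80.0 else 0.5e-6
t = np.arange(0, 1.0, 1.0 / 1000.0)
stimulus = 5e-6 * np.sin(2 * np.pi * 20 * t) + 1e-6 * np.sin(2 * np.pi * 200 * t)
print(perceivable_components(stimulus, 1000.0, threshold))  # only the 200 Hz component exceeds its threshold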

A first application of this property of haptic perception was presented by Allerkamp et al. in the design of a haptic system to describe surface properties of textiles: analogous to the spectral decomposition of an arbitrary color into red, green and blue, textures were analyzed to be represented by two dedicated vibration frequencies addressing single receptor types [181]. This approach minimizes hardware and data storage effort to present complex surface properties.

1.4.3 Anisotropy of Haptic Perception

Besides the above-mentioned differences in the scaling of normal and tangential stimuli on the skin, there is also an anisotropy of kinaesthetic perception and interaction capabilities [126, 182, 183]. The perception and control of proximal movements (towards the body) is worse than that of movements in the distal direction (away from the body). This property can be relevant to the ergonomic design of workplaces with haptic interfaces and to tests and evaluations based on Cartesian coordinates.

1.4.4 Fooling the Sense of Touch

As in acoustics and vision, there are a couple of haptic illusions. They are generated by anatomic properties, neural processing or the misinterpretation of percepts, for example a conflict of visual and haptic perception [140]. Since many visual illusions can be found in haptics, too, and because of the similar neural processing and interpretation mechanisms, an explanation analogous to the visual system is anticipated [184]. As Hayward puts it, “Perceptual illusions have furnished considerable material for study and amusement” [185]. Two examples of basic haptic illusions are given in Fig. 2.19. The Müller-Lyer illusion on the left side is borrowed from visual perception, but can also be proven for haptic stimuli: both lines are perceived as being of different length because of the arrow heads, even though they have the same length. The Aristoteles illusion can be reproduced easily by the reader: touching an object like a pencil with crossed fingers will evoke the illusion of two objects. If a wall is touched instead of an object, a straight wall will be perceived as a corner and vice versa. Further illusions can be found in the works of Hayward and Lederman [185, 186].

Fig. 2.19
figure 19

Examples of haptic illusions a Müller-Lyer-illusion, b Aristoteles-illusion

Example: Kooboh

An application of haptic illusions in the design of haptic systems was presented by Kildal in 2012 [187]. Kooboh consists of a solid, non-deformable box with an integrated force sensor and a vibration actuator as shown in Fig. 2.20. The control software simulates an internal system model containing a spring connected with a (massless) object sliding on a rough surface.

Fig. 2.20
figure 20

Kooboh, a outer form and internal components, b internal system model. Picture courtesy of J. Kildal, Nokia Research Center, Espoo, FIN, used with permission

The user applies a force \(F_\text {a}\) to the system, which would normally result in a deflection \(d = \frac{F_\text {a}}{c}\) of the spring c. When the object is moved because of the applied force, a frictional force \(F_\text {f}\) is generated depending on the texture of the rough surface and the position of the object. Since the box is non-deformable, the reaction of the (virtual) spring cannot be felt. But because the applied force is measured by the force sensor, the theoretical deflection of the object and the resulting friction force \(F_\text {f}\) can be calculated. Depending on the structure of the rough surface, \(F_\text {f}\) will exhibit periodic, high-frequency contents that can be displayed by the integrated actuator. The user interprets these two contradictory percepts as a fully functional model as shown in Fig. 2.20, effectively neglecting that the system does not move.
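
A possible implementation of this principle is sketched below; the stiffness, texture period and friction amplitude are illustrative assumptions, not the values used in [187].

import math

def kooboh_actuator_force(applied_force_n, stiffness_n_per_m=500.0,
                          texture_period_m=0.002, friction_amplitude_n=0.3):
    # Map the measured force onto a virtual spring deflection (d = F_a / c) and
    # derive a position-dependent, texture-like force for the vibration actuator.
    virtual_deflection_m = applied_force_n / stiffness_n_per_m
    return friction_amplitude_n * math.sin(2.0 * math.pi * virtual_deflection_m / texture_period_m)

# Called once per control cycle with the current force sensor reading:
print(kooboh_actuator_force(2.0))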

Pseudo-Haptic Feedback

An important technical application of another kind of haptic illusion is the use of conflicting information on the visual and the haptic channel. Termed pseudo-haptic feedback, it is used in virtual environments to simulate properties like stiffness, texture or mass with limited or distorted haptic feedback and accurate visual feedback [54, 188]. A simple example is given by Kimura et al. in [189] as depicted in Fig. 2.21. A visual representation of a spring is displayed on a mobile phone equipped with a force sensor. The deformation of the visual representation depends on the applied force and the virtual stiffness of the displayed spring. Changing the virtual stiffness leads to a different visual representation and a feeling of different springs—although the user always presses the same rigid mobile phone case.

Fig. 2.21
figure 21

Pictures courtesy of T. Nojima, University of Electro-Communication, Tokyo, JP

Pseudo-haptic feedback in a mobile application [189] © Springer Nature, all rights reserved. The force exerted on the mobile device by the user is measured with pressure sensors. Based on this force, the deformation shown on the screen is calculated from a virtual stiffness, leading to the impression of a compliant device.
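
The underlying mapping can be sketched in a few lines; the stiffness values and the pixel scaling below are assumptions for illustration, not parameters from [189].

def visual_deformation_px(measured_force_n, virtual_stiffness_n_per_m, pixels_per_m=5000.0):
    # The rigid device does not deform; the displayed spring is compressed
    # according to d = F / c_virtual and converted to screen pixels.
    return measured_force_n / virtual_stiffness_n_per_m * pixels_per_m

# The same 3 N press "feels" different depending on the virtual stiffness:
print(visual_deformation_px(3.0, 300.0))   # soft virtual spring: large visual compression
print(visual_deformation_px(3.0, 3000.0))  # stiff virtual spring: small visual compression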

1.4.5 Haptic Icons and Categorized Information

All of the above is based on continuous stimuli and their perception. Another important aspect is the perception of categorized information, which is mainly used in communication applications. Probably the most prominent example is the vibration alarm of a smartphone, which can be configured with different patterns for signaling a message or a call. Several groups investigated basic properties of such haptic icons (sometimes also called tactons or hapticons) [190,191,192]. They found different combinations of waveform, frequency, pattern and spatial location suitable to create a set of distinguishable haptic icons based on multi-dimensional scaling analysis.

The use of categorized information in haptic systems introduces another measure of human perception, the information transfer (IT) [193]. This measure describes how much distinguishable information can be displayed with haptic signals defined by combinations of the above-mentioned signal properties. However, it is not a pure measure of perception, but also depends on the haptic system used. Because of that, it qualifies as an evaluation measure for haptic communication systems as detailed in Chap. 13. Reported information transfer ranges from 1.4–1.5 bits for the differentiation of force magnitude and stiffness [135] up to 12 bits for multi-axis systems especially designed for the haptic communication of deaf-blind people [171, 194].
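
The IT is typically estimated from a stimulus-response confusion matrix as the transmitted (mutual) information. The following sketch computes this estimate, here for the idealized case of four perfectly identified icons.

import numpy as np

def information_transfer_bits(confusion_matrix):
    # Mutual information between presented stimuli (rows) and responses (columns).
    joint = confusion_matrix / confusion_matrix.sum()
    p_stimulus = joint.sum(axis=1, keepdims=True)
    p_response = joint.sum(axis=0, keepdims=True)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = joint * np.log2(joint / (p_stimulus * p_response))
    return np.nansum(terms)

# Four haptic icons identified without confusion transfer log2(4) = 2 bits:
print(information_transfer_bits(25.0 * np.eye(4)))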

2 Concepts of Interaction

In daily life, only very few haptic interactions of humans with their environment can be classified as solely passive, i.e. pure perception procedures. Most interactions are a combination of motion and perception to implement a previously defined intention. For the design of haptic systems, generally agreed-on terms are needed to describe the intended functions of a system. In this section, some common approaches for this purpose are described. The section ends with a list of motion capabilities of the human locomotor system.

The taxonomy of haptic interaction by Samur as given in Sect. 1.4.2 is one of these possibilities. It was developed for the evaluation of systems interacting with virtual environments and is therefore most suitable for the description of such systems. Other interactions can be described by combinations of the taxonomy elements as well, but this lacks some intuition when describing everyday interactions. Stepping a little away from the technical basis of Samur's taxonomy and turning towards the functional meaning of haptic interaction for humans and their environment, one will find the exploration theory of Lederman and Klatzky outlined in the following section. Further concepts like active and passive touch are described, as well as gestures, which are commonly used as an input modality on touch screens and other hardware with similar functionality.

2.1 Haptic Exploration of Objects

One of the most important tasks of haptic interaction is the exploration of unknown objects to assess their properties and usefulness. Not only tactile information, but also kinaesthetic perception contributes to these assessments. One of the most relevant sources for the evaluation of surfaces is the relative movement between the skin and the object.

In [195], Lederman and Klatzky identify different exploratory procedures that are used to investigate unknown objects. Figure 2.22 shows the six most important procedures [196]. Table 2.8 gives an insight into the costs and benefits when assessing certain object properties.

Fig. 2.22
figure 22

Figure adapted from [196]

Important exploratory procedures

Table 2.8 Correlation of exploratory procedures to ascertainable object properties according to [13]. Properties are marked either as optimally ascertainable by the respective exploration technique or as ascertainable in a sufficient way

2.2 Active and Passive Touch

The above-described combination of movement action and perception is of such fundamental importance that two terms have been established to describe this type of interaction.

Definition  Active Touch  Active touch describes the interaction with an object, where a relative movement between user and object is controlled by the user.

Definition  Passive Touch  Passive touch describes the interaction with an object, when relative movement is induced by external means, for example by the experimental setup.

Both conditions can be summarized as dynamic touch, while the touch of objects without a relative movement is defined as static touch [146]. This differentiation is, however, seldom used. Active touch is generally considered superior to passive touch in its performance. Lederman and Klatzky attribute this to the different focus of the observer [196, p. 1439]:

Being passively touched tends to focus the observer’s attention on his or her subjective bodily sensations, whereas contact resulting from active exploration tends to guide the observer’s attention to properties of the external environment.

Studies show that the assessment of material and system properties is independent of the exploration type (active or passive touch condition) [197, 198]. Active touch delivers a better performance for the exploration of geometric properties [196, 199]. From a technical view, the implementation of active exploration techniques is a challenge, since transmitted signals have to be synchronized with the relative movement.

2.3 Gestures

Gestures are a form of non-verbal communication that is studied in a large number of scientific disciplines like the social sciences, history, communication and rhetoric and—quite lately—human-computer interaction. Concentrating on the latter, one can find gestures when using pointing devices like mice, joysticks and trackballs. More recently, gestures for touch-based devices have become more prominent. Some examples are given in Fig. 2.23; for further information see [200] for a taxonomy of gestures in human-computer interaction. An informative list of all kinds of gestures can be found in Wikipedia under the entry “List of gestures”.

Fig. 2.23
figure 23

Pictures by Gestureworks, used with permission

Gesture examples for touch input devices. a Horizontal flicker movement, b two-finger scaling, c input gesture for the letter h.

Gestures can be used as a robust input means in complex environments, for example in the car, as shown with touch-based gestures in Sect. 14.1 or based on a camera image [201]. For the use in haptic interaction, gestures have further meaning when interacting with virtual environments, as discrete input options in mobile applications, and in connection with specialized haptic interfaces like AIREAL [202], which combines a 3D camera with haptic feedback through an air vortex, or the UltraHaptics project, which generates haptic feedback in free air by superposing the signals from a matrix of ultrasound emitters [203]. In 2017 a standard on the usage of gestures in tactile and haptic interaction (ISO 9241-960) was published, covering these among other items.

2.4 Human Movement Capabilities

Since users will interact with haptic systems, their movement capabilities have to be taken into account.

2.4.1 Dynamic Properties of the Locomotor System

While anatomy will answer questions regarding the possible movement ranges ([204] for example), there are only a few studies dealing with the dynamic abilities of humans. Tan et al. conducted a study to investigate the maximum controllable force and the average force control resolution [138]. They found maximal controllable forces that could be maintained for at least 5 s in the range of 16–51 N for the joints of the hand and forces in the range of 35–102 N for wrist, elbow and shoulder joints. Forces about half as large as the maximum force could be controlled with an accuracy of 0.7–3.4%. This study is based on just three test persons, but other studies find similar values, for example when grasping a cylindrical grip with forces ranging from 7 N (proximal phalanx of the little finger) to 99 N (tip of the thumb) [205]. An et al. find that female hand strength is in the range of 60–80% of male hand strength [206].

Regarding velocities, Hasser derives velocities of \(60\,\dots \,{105}\,\mathrm{{cm s}}^{-1}\) for the tip of the extended index finger [205] and about \({17}\, \mathrm{{rad s}}^{-1}\) for the MCP and PIP joints. Brooks reports maximum velocities of \({1.1}\, \mathrm{{m s}}^{-1}\) and maximum accelerations of \({12.2}\, \mathrm{{m s}}^{-2}\) from a survey of 12 experts of telerobotic systems [114].

2.4.2 Properties of Interaction with Objects

When touching a surface, users show exploration velocities of about \({2}\,\mathrm{{cm s}}^{-1}\) (with a range of \(1\,\dots \,{25}\,\mathrm{{cm s}}^{-1}\)) and contact forces ranging from 0.3 ... 4.5 N [207]. Other studies confirm this range for tapping with a stylus [208] and when evaluating the roughness of objects [209]. Smith et al. found average normal forces of 0.49–0.64 N for exploring raised and recessed tactile targets on surfaces with the index finger. Recessed targets were explored with slightly larger forces and lower exploring speed (\({7.67}\,\mathrm{{cm s}}^{-1}\) compared to \({8.6}\,\mathrm{{cm s}}^{-1}\) for raised targets), and increased friction between finger and explored surface led to higher tangential forces. While the average tangential force in the normal condition was 0.42 N, it rose to 0.65 N in the increased friction condition (realized by a sucrose coating of the fingertip) [210].

For minimally invasive surgery procedures with a tool-mediated contact, radial (with respect to the endoscopic tool axis) forces up to 6 N and axial forces up to 16.5 N were measured by Rausch et al. High forces of about 4 N on average were recorded for tasks involving holding, pressing and pulling of tissue, while low forces were used for tasks like laser cutting and coagulation, all measured with a force-measuring endoscope operated by medical professionals as reported in [211, 212]. Tasks were carried out with movement frequency components of up to 9.5 Hz, which is in line with the values reported above (Fig. 1.9).

Hannaford created a database with measurements of force and torque for activities of daily living such as writing and dialing a cell phone, among others [213].

3 Interaction Using Haptic Systems

In this section, interactions using haptic systems are discussed and the nomenclature for haptic systems is derived from these interactions. The definitions are derived from the general usage in the haptics community and a number of publications by different authors [214,215,216,217,218] and are logically extended based on the interaction model shown in Fig. 2.24.

While used in a general way up to this point, the term haptic systems will be defined as follows:

Definition  Haptic Systems  Systems interacting with a human user using the means of haptic perception and interaction. Although modalities like temperature and pain belong to the haptic sense, too, haptic systems refers only to pure mechanical interaction in this book. In many cases, the term haptic device is synonymously used for haptic systems.

In that sense, haptic systems will not only cover the fundamental haptic inputs and outputs, but also the system control instances needed to drive actuators, read out sensors and take care of data processing. This is in accordance with known definitions of mechatronic systems like the one by Cellier [219]:

A system is characterized by the fact that we can say what belongs to it and what does not, and by the fact that we can specify how it interacts with its environment. System definitions can furthermore be hierarchical. We can take the piece from before, cut out a yet smaller part of it and we have a new system.

The terms system, device and component are not defined clearly on an interdisciplinary basis. Depending on one's point of view, the same object can be a “device” for a hardware designer, a “system” for a software engineer or just another “component” for another hardware engineer. These terms are therefore also used in different contexts in this book.

Compared to other perception modalities, haptics offers the only bidirectional communication means between the human user and the environment [220, p. 94]. A user is defined as

Definition  User  A person interacting (haptically) with a (haptic) system. The user can convey intentions to the system, receiving (haptic) information depending on the application of the system. In that sense, a test person or subject in a psychophysical experiment is a user as well, but not all users can be considered as subjects.

In this book, a haptic system is always considered to have a specific application as for example the ones outlined in Sect. 1.5. We therefore also define this term as follows:

Definition  Application  Intended utilization of a haptic system.

One has to keep in mind that this definition includes \(\hookrightarrow \) commercial off-the-shelf (COTS) haptic interfaces coupled to a computer with a software program to visualize biochemical components as well as the use of a specially designed haptic display as a physical interface. Especially in this section, the term application therefore has to be considered context-sensitive.

Figure 2.24 gives a schematic integration of an arbitrary haptic system in the interaction between a human user and a (virtual) environment. Based on this, one can identify typical classes of haptic systems.

Fig. 2.24
figure 24

Haptic interaction between humans and environment. a Direct haptic interaction, b Utilization of haptic systems. The interaction paths are denoted as follows: I—Intention, P—Perception, M—Manipulation, S—Sensing, C—Comanipulation/other senses

3.1 Haptic Displays and General Input Devices

Probably the most basic haptic system, shown in Fig. 2.25, is a

Definition  Haptic Display  A haptic display solely addresses the interaction path P with actuating functions. Mechanical reactions of the human user have no direct influence on the information displayed by the haptic display, since user actions are not recorded and cannot be provided to the application.

Haptic displays are used to convey information originating in status information of the system incorporating the display. Typical applications are \(\hookrightarrow \) Braille row displays and—of course—the vibration alarm in mobile devices. Since the overlap with the next class of systems is somewhat fuzzy, for the rest of this book a haptic display is defined as a device that only incorporates actuating functions but no sensory functions (except the internal ones needed for the correct functionality of the actuating part). This type of device is mainly used in communication applications, as shown for example in Fig. 1.17, subfigures (a), (c) and (d). Often, a haptic display can be seen as a mechatronic component of a haptic system with additional functionality, for example an assistive system described in the next section.

Fig. 2.25
figure 25

Interaction scheme of a haptic display

For completeness, systems addressing only the interaction path I can also be identified. These are basically general input devices like buttons, keyboards, switches, touch screens and mice, which record intentions of the user mechanically and convey them to an application. Being mechanical components themselves, they naturally exhibit mechanical reactions felt as haptic feedback by the user, but these are normally independent of the application. For example, the haptic feedback from a computer keyboard is the same for the F1 key and the Return key, while the effects of these inputs are quite different. Therefore, general input devices are defined as devices with a predominant input functionality that can be used in different applications and a subordinate haptic feedback that is independent of the application and results unintentionally from the mechanical design of the input device. With the focus on generality, specialized input devices like emergency stop buttons are excluded, since they exhibit a defined haptic feedback to convey the current state of the input device.

3.2 Assistive Systems

This class of haptic systems, shown in Fig. 2.26, is based on haptic displays, but also includes an application-dependent sensory function.

Definition  Haptic Assistive System  System that adds haptic information on path P, based on sensory input on path S, to a natural, non-technically mediated haptic interaction.

Assistive systems are a main application area for haptic displays. The sensory input of assistive systems is not necessarily of a mechanical kind. However, compared to a haptic display as described above, an assistive system adds to an existing, natural haptic interaction (i.e. one without any technical means).

Fig. 2.26
figure 26

Interaction scheme of an assistive system, that adds haptic feedback to an existing direct interaction

3.3 Haptic Interfaces

If an intention-recording function and an intended haptic feedback functionality are combined, another class of haptic systems can be defined as shown in Fig. 2.27:

Definition  Haptic Interface  Haptic interfaces address the interaction path P with actuating functions, but also record the user's intentions along the interaction path I with dedicated sensory functions. These data are fed to the application and evoke commands to the system or visualization under control; depending on the application, a mechanical user input can result in direct haptic feedback.

Fig. 2.27
figure 27

Relevant interaction paths of a haptic interface

Haptic interfaces are mostly used as universal operating devices to convey interactions with different artificial or real environments. Typical applications with task-specific interfaces include stall-warning sidesticks in aircraft and force-feedback joysticks in consumer applications. Another application is the interaction with virtual environments, which is normally achieved with one of the large number of available \(\hookrightarrow \) COTS haptic interfaces. These can also be used in a variety of other interaction tasks; some applications were outlined in Sect. 1.5. Some \(\hookrightarrow \) COTS haptic interfaces are shown in Fig. 2.28 as well as an example of a task-specific haptic interface for driving assistance. Other task-specific haptic interfaces are developed for the use in medical training systems.

Fig. 2.28
figure 28

(AFFP, © 2022 Continental Automotive, Hannover, Germany). Both images used with permission

Two haptic interfaces, (PHANToM Premium 1.5, © 2022 3D Systems geomagic Solutions, Rock Hill, SC, USA) and Accelerator Force Feedback Pedal

In general, \(\hookrightarrow \) COTS devices support input and output at only a single point in the workspace. The position of this \(\hookrightarrow \) Tool Center Point (TCP) in the workspace of the device is sent to the application and all haptic feedback is generated with respect to this point. Since the interaction with a single point is somewhat unintuitive, most devices supply contact tools like styluses or pinch grips that mediate the feedback to the user. This grip configuration is a relevant design parameter and is further addressed in Sect. 3.1.3. Figure 2.29 shows some typical grip situations of \(\hookrightarrow \) COTS devices with such tool-mediated contact.

Fig. 2.29
figure 29

© 2022 3D Systems geomagic Solutions, (Rock Hill, SC, USA). All images used with permission

Realizations of tool-mediated contact in commercial haptic interfaces. a omega.6 with a stylus interface (Force Dimension, Nyon, Switzerland), b Falcon with a pistol-like grip for gaming applications (Novint, Rockville Centre, NY, USA), c and d pinch and scissor grip interfaces for the PHANToM Premium

3.3.1 System Structures

To fulfill the requirement for independent channels for input (user intention) and output (haptic feedback) of the haptic interface and the physical constraint of energy conservation, one can define exact physical representations of the input and output of haptic interfaces. This leads to two fundamental types of haptic systems that are defined by their mechanical inputs and outputs as follows:

Definition  Impedance-Type System  Impedance-type systems (or just impedance systems) exhibit a mechanical input in the form of a kinematic measure and a mechanical output in the form of a force or torque. In the case of a haptic interface, the mechanical input (in most cases the position of the device's \(\hookrightarrow \) TCP) is conveyed as an electronic output to be used in other parts of the application.

Definition  Admittance-Type System  Admittance-type systems exhibit a mechanical input in the form of a force or a torque, which is conveyed as an electronic output in most cases as well. The mechanical output is given by a kinematic measure, for example deflection, velocity or acceleration.

The principal differentiation between impedance-type and admittance-type systems is fundamental to haptic systems. It is therefore further detailed in Chap. 6.
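
As an illustration of the impedance-type behavior, the following sketch shows one control cycle of rendering a virtual wall (position in, force out) with assumed stiffness and damping values; an admittance-type system would invert the roles, taking a measured force as input and commanding a motion.

def impedance_step(position_m, velocity_m_per_s, wall_position_m=0.0,
                   stiffness_n_per_m=1000.0, damping_n_s_per_m=2.0):
    # One cycle of an impedance-type rendering: kinematic measure in, force out.
    penetration_m = position_m - wall_position_m
    if penetration_m <= 0.0:   # outside the virtual wall: no reaction force
        return 0.0
    # Spring-damper reaction pushing the user out of the virtual object.
    return -(stiffness_n_per_m * penetration_m + damping_n_s_per_m * velocity_m_per_s)

# Typical use: read position -> compute force -> command actuator, at about 1 kHz.
print(impedance_step(0.002, 0.05))  # 2 mm penetration at 50 mm/s -> about -2.1 N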

3.3.2 Force Feedback Devices

The term force feedback is often used for the description of haptic interfaces, especially in the advertising of force-feedback joysticks, steering wheels and other consumer products. A more detailed analysis of these systems yields the following characteristics for the majority of such systems:

System Structure:

Because of the output of forces, they resemble impedance-type systems as defined above.

Dynamics:

Force feedback systems address the whole dynamic range of haptic interaction, but do not convey spatially distributed (tactile) information.

Contact Situation:

Most force feedback systems do not allow for natural exploration, but will convey information via a tool.

These characteristics show a quite deep level of detail. The only other comparable term with a similar depth of detail is perhaps tactile feedback, mostly defining spatially distributed feedback in the dynamic range of passive interaction (Fig. 1.9). However, these terms are used so widely in technical and non-technical applications, with differing and not agreed-on definitions, that they will not be used in this book in favor of other, clearly defined terms. In that case, force feedback devices are probably better described as impedance-type interfaces with tool-mediated haptic feedback. Since this is a scientific book, the longer term is preferred to an unclear definition.

3.4 Manipulators

There is only a limited number of systems from outside the haptic community that can be classified as impedance systems. For admittance systems, one can find haptic interfaces (for example, the Haptic Master interface shown in Fig. 1.14 is an admittance-type interface) as well as mechanical manipulators from other fields. For example, industrial robots are normally designed as admittance systems that can be commanded to a certain position and measure reaction forces if equipped properly. In the nomenclature of haptic system design presented here, such robots can be defined as manipulators:

Definition  Manipulator  Technical system that uses interaction path M to manipulate or interact with an object or (remote) environment. Sensing capabilities (interaction path S) are used for the internal system control of the manipulator and/or for generating haptic feedback to a user.

Figure 2.30 shows the corresponding interaction scheme.

Fig. 2.30
figure 30

Interaction scheme of a manipulator

3.5 Teleoperators

The combination of a haptic interface and a manipulator yields the class of teleoperation systems with the interaction scheme shown in Fig. 2.31.

Definition  Teleoperation Systems  A combined system recording the user intentions on path I, transferring them via the manipulation path M to a real environment, measuring interactions on the sensing path S and providing haptic feedback to the user via the perception path P.

Fig. 2.31
figure 31

Teleoperation interaction scheme

An extension of teleoperators is the class of \(\hookrightarrow \) telepresence and teleaction (TPTA) systems, which include additional feedback from other senses like vision and/or audition. Both terms are sometimes used synonymously. Teleoperation systems allow a spatially separated interaction of the user with a remote physical environment. The simplest system is achieved by coupling an impedance-type haptic interface with an admittance-type manipulator, since inputs and outputs correspond directly. Often, couplings of impedance-impedance systems are used because of the availability of components, which places higher demands on the system controller.

3.6 Comanipulators

If additional mechanical interaction paths are present, telepresence systems turn into a class of systems called comanipulators [214]:

Definition  Comanipulation System  Telepresence system with an additional direct mechanical link between user and the environment or object interacted with.

Comanipulator systems are often used in medical applications, since they minimize the technical effort compared to a pure teleoperator because of less moving mass, fewer active \(\hookrightarrow \) DOF and smaller workspaces, but they also induce new challenges for the control and stability of a system. In an application, the user will move the reference frame of the haptic system.

Compared to the above-mentioned assistive systems, comanipulators exhibit a full teleoperation interaction scheme with additional direct feedback, while assistive systems add haptic feedback to a non-technically mediated interaction between user and application. This is shown in Fig. 2.32.

Fig. 2.32
figure 32

Interaction scheme for comanipulators with an additional direct feedback via comanipulation path C

3.7 Haptic System Control

To make the systems described above usable in an application, another definition of a more technical nature has to be introduced:

Definition  Haptic System Control  The haptic system control is that part of a real system that not only controls the individual mechanical and electrical components to ensure proper sensing, manipulation and display of haptic information, but also takes care of the connection to other parts of the haptic system. This may be, for example, the connection between a haptic interface and a manipulator, or the interface to some virtual reality software.

While the pure control aspects are addressed in Chap. 7, one also has to consider other design tools and information structures like Event-Based Haptics (Sect. 11.3.4), Pseudo-Haptics (Sect. 2.1.4.4) and the general connection to software (Chap. 12) using a real interface (Chap. 11). In this book and in other sources, one will also find the term haptic controller used synonymously for the whole complex of the haptic system control described here.

4 Engineering Conclusions

Based on the above, one can conclude a general structure of the interaction with haptic systems and assign certain attributes to the different input and output channels of a haptic system. This is shown in Fig. 2.33, which extends Fig. 2.24. In the figure, the output channel of the haptic system towards the user is separated into mainly tactile and mainly kinaesthetic sensing channels. This is done with respect to the explanations given in Sect. 1.4.1 and with the knowledge that many haptic interfaces fit this classification, which will also be used occasionally further on in this book. The parameters given in Fig. 2.33 provide an informative basis for the interaction with haptic systems.

Fig. 2.33
figure 33

General input and output ports for a haptic system in interaction with the human hand. Figure is based on [6, 221] © Wiley, all rights reserved; values from [221] are based on surveys among experts and are labeled with an asterisk (*), other values are taken from the different sources stated above

In the remaining part of this section, several conclusions for the design of task-specific haptic systems are given based on the properties of haptic interaction.

4.1 A Frequency-Dependent Model of Haptic Properties

Haptics, and especially tactile feedback, is a dynamic impression; there are little to no static components. Without going into scientific findings, a simple impression of the dynamic range covered by haptic and tactile feedback can be gained by taking a look at different daily interactions (Fig. 2.34).

Fig. 2.34
figure 34

Concept of modalities and their frequency dependency

When handling an object, the first impression to be explored is its weight. There is probably no one who was not caught by surprise at least once when lifting an object that turned out to be lighter than expected. The impression is usually of comparably low frequency and typically directly linked to the active touch and movement applied to the object.

Exploring an object with the finger to determine its fundamental shape is the next interaction type in terms of its dynamics. When touching objects like that, a global deformation of the finger and a tangential load on its surface are relevant to create such an impression. Research performed by Hayward has shown that indeed the pure inclination of the finger's surface already creates an impression of shape. However, since the stimulus is still quite global, the dynamic information coded in this property is not very rich.

Dynamics increase when it becomes urgent to react. One of the most critical situations our biology of touch is well prepared for is the detection of slippage. Constant control of normal forces on the object prevents it from slipping out of our grasp. Being highly sensitive to shear and stick-slip, this capability enables us to gently interact with our surroundings.

When it comes to slippage, textured surfaces and their dynamics must be mentioned, too. Their frequency content of course depends on geometrical properties; however, their exploration during active touch typically generates signals in the range above 100 Hz. Within this range, the discrimination of textures is naturally best, as the vibrotactile sensitivity of the human finger reaches its highest level.

Whether gratings differ from textures is something that can be discussed endlessly. The principal excitation of the tactile sensory orchestra may be identical; however, gratings are more like a Dirac pulse, whereas textures are more comparable to a continuous signal.

Last but not least, hard contacts and the properties they reveal about an object reflect the most dynamic signal processing a haptic interaction may involve. Surprisingly, a strong impact on an object reveals more about its volume and structural properties than any gentle interaction can ever show. Therefore, stiffness is worth its own set of thoughts in the following section.

4.2 Stiffnesses

Already the initial touch of a material gives us information about its haptic properties. A human is able to immediately discriminate whether he or she is touching a wooden table, a piece of rubber or a concrete wall with his or her fingertip. Besides the acoustic and thermal properties, especially the tactile and kinaesthetic feedback plays a large role. Based on the simplified assumption of a plate fixed on both sides, its stiffness k can be calculated using Young's modulus E according to Eq. (2.10) [222].

$$\begin{aligned} k=2\,\frac{b\,h^3}{l^3}\cdot E \end{aligned}$$
(2.10)

Figure 2.35a shows the calculated stiffnesses of plates with an edge length of 1 m and a thickness of 40 mm made of different materials. In comparison, the stiffnesses of commercially available haptic systems are given in Fig. 2.35b. It is obvious that the stiffnesses of haptic devices are orders of magnitude lower than the stiffnesses of concrete, everyday objects like tables and walls. However, stiffness is just one criterion for the design of a good haptic system and should not be overestimated. The wide range of stiffnesses reported to be needed for the rendering of non-deformable surfaces, as shown in Sect. 2.1.3, is strong evidence of the interdependency of several different parameters. The comparison above should make us aware of the fact that a pure reproduction of solid objects can hardly be realized with a single technical system. It rather takes a combination of stiff and dynamic hardware, since especially the dynamic interaction at high frequencies dominates the quality of haptics, as was discussed extensively in the last section.
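
As a minimal sketch, the following Python snippet evaluates Eq. (2.10) for a plate of the dimensions used in Fig. 2.35a; the Young's moduli are rough literature values and serve only as assumptions for illustration.

```python
# Minimal sketch of Eq. (2.10): k = 2 * b * h^3 / l^3 * E
# for a plate fixed on both sides; b, h, l in metres, E in pascal.

def plate_stiffness(b, h, l, E):
    """Stiffness k in N/m according to Eq. (2.10)."""
    return 2.0 * b * h**3 / l**3 * E

# Approximate Young's moduli (assumed literature values, Pa)
materials = {"rubber": 0.05e9, "wood": 11e9, "concrete": 30e9, "steel": 210e9}

b, h, l = 1.0, 0.04, 1.0   # plate as in Fig. 2.35a: 1 m x 1 m x 0.04 m
for name, E in materials.items():
    print(f"{name:9s}: k = {plate_stiffness(b, h, l, E):.2e} N/m")
```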

Fig. 2.35
figure 35

a Comparison between stiffnesses of a 1 \(\times \) 1 \(\times \) 0.04 m\(^3\) plate of different materials and b realizable stiffnesses by commercial haptic systems

4.3 One Kilohertz—Significance for the Mechanical Design?

As stated above, haptic perception ranges up to a frequency of 10 kHz, whereby the area of highest sensitivity lies between 100 Hz and 1 kHz. This wide range of haptic perception enables us to perceive microstructures on surfaces just as well as it enables us to identify the point of impact when drumming with our fingers on a table.

For a rough calculation, the model according to Fig. 2.36 is considered: a parallel connection of a mass m and a spring k. Assuming an identical "virtual" volume V of material and taking the individual density \(\rho \) for a qualitative comparison, the border frequency \(f_b\) for a step response can be calculated according to Eq. (2.11).

$$\begin{aligned} f_b&=\frac{1}{2\pi } \sqrt{\frac{k}{m}} = \frac{1}{2\pi } \sqrt{\frac{k}{V \rho }} \end{aligned}$$
(2.11)

Figure 2.36 shows the border frequencies for a selection of materials. Only in the case of rubber and soft plastics do border frequencies below 100 Hz appear. Harder plastic materials (Plexiglas) and all other materials show border frequencies above 700 Hz. One obvious interpretation is that any qualitatively good simulation of such a collision demands at least this bandwidth from the signal conditioning elements and the mechanical system.
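
A short sketch of Eq. (2.11) is given below; the volume, stiffness and density values are purely illustrative assumptions and are not meant to reproduce the exact parametrization behind Fig. 2.36.

```python
import math

def border_frequency(k, V, rho):
    """Border frequency f_b = 1/(2*pi) * sqrt(k / (V * rho)) of the
    parallel mass-spring model according to Eq. (2.11);
    k in N/m, V in m^3, rho in kg/m^3."""
    return math.sqrt(k / (V * rho)) / (2.0 * math.pi)

# Purely illustrative parameters (assumptions, not the values of Fig. 2.36)
V = 1e-6                     # 1 cm^3 "virtual" volume of material
examples = {                 # (assumed stiffness in N/m, density in kg/m^3)
    "rubber":    (3e2, 1100),
    "Plexiglas": (1e5, 1190),
    "steel":     (1e6, 7850),
}
for name, (k, rho) in examples.items():
    print(f"{name:9s}: f_b = {border_frequency(k, V, rho):7.1f} Hz")
```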

Fig. 2.36
figure 36

3 dB border frequency \(f_b\) of an excitation of a simple mechanical model parametrized as different materials

As a consequence, a frequent recommendation for the design of haptic systems is the transmission of a full bandwidth of \(1\,\text {kHz}\) (and in some sources even up to \(10\,\text {kHz}\)). This requirement is valid with respect to software and communications engineering, as sampling systems and algorithms can achieve such rates easily today. Considering the mechanical part of the design, however, dynamics of \(1\,\text {kHz}\) are enormous, maybe even utopian. Figure 2.37 gives another rough calculation of the oscillating force amplitude according to Eq. (2.12).

$$\begin{aligned} F_0&= \left| \underline{x} \cdot (2 \pi f)^2 \cdot m \right| \end{aligned}$$
(2.12)

The basis of the analysis is a force source generating an output force \(\underline{F}_0\). The load of this system is a mass (e.g. a knob) of only 10 g. The system does not have any additional load, i.e. it does not have to generate any haptically active force on a user. A periodic oscillation of frequency f and amplitude \(\underline{x}\) is assumed. With an expected oscillation amplitude of 1 mm at 10 Hz, a force of approximately 10 mN is necessary. At a frequency of 100 Hz, a force of 2–3 N is already needed. At a frequency of 700 Hz, the force increases to 100 N, and this is what happens when moving a mass of just 10 g. Of course, in combination with a user impedance as load, the oscillation amplitude will decrease to values below 100 \(\upmu \mathrm{{m}}\), proportionally decreasing the necessary force. But this calculation should make us aware of the simple fact that the energetic design and power management of electromechanical systems for haptic applications needs to be done very carefully.
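
The following sketch simply evaluates Eq. (2.12) for a 10 g mass; the chosen amplitude-frequency pairs are illustrative assumptions, since the exact amplitude at each frequency follows the equipotential lines of Fig. 2.37.

```python
import math

def required_force(x_amp, f, m):
    """Force amplitude |F_0| = x * (2*pi*f)^2 * m according to Eq. (2.12);
    x_amp in m, f in Hz, m in kg. No user impedance or friction included."""
    return x_amp * (2.0 * math.pi * f) ** 2 * m

m = 0.010  # 10 g knob, as in the example above
# Illustrative amplitude-frequency pairs (assumptions)
for f, x_amp in [(10, 1e-3), (100, 1e-3), (700, 1e-4)]:
    F = required_force(x_amp, f, m)
    print(f"f = {f:3d} Hz, x = {x_amp * 1e3:4.1f} mm -> F_0 = {F:6.2f} N")
```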

Fig. 2.37
figure 37

Equipotential line of necessary forces in dependency of amplitude and frequency of the acceleration of a mass with 10 g

The design of a technical haptic system is always a compromise between bandwidth, stiffness, dynamics of signal conditioning and maximum force amplitudes. Even with simple systems, the design process leads the engineer to the borders of what is physically possible. Therefore, it is necessary to have a good model of the user, both as a load to the mechanical system and in terms of his or her haptic perception. This model enables the engineer to carry out an optimized design of the technical system, which is the focus of Chap. 3. However, there is also the option to use psychophysical knowledge to trick perception by technical means.

4.4 Perception-Inspired Concepts for Haptic System Design

At the end of this chapter, two examples shall illustrate the technical importance of an understanding of perception and interaction concepts. The chosen examples present two technical applications that purposefully use unique properties of the haptic sensory channel to design innovative and better haptic systems.

Example: Event-Based-Haptics

Based on the bidirectional view of haptic interactions (Sect. 1.4.2) with a low-frequency kinaesthetic interaction channel and a high-frequency tactile perception channel, Kontarinis and Howe published a new combination of kinaesthetic haptic interfaces with additional sensors and actuators for higher frequencies. Tests included the use in the virtual representation and exploration of objects [223] as well as the use in teleoperation systems.

Based on this work, Niemeyer et al. proposed Event-Based Haptics as a concept for increasing realism in virtual-reality applications [224]. By superposing the kinaesthetic reactions of a haptic interface with high-bandwidth transient signals for certain events, like touching a virtual surface, the haptic quality of this contact situation can be improved considerably [225]. The superposed signals are recorded using accelerometers and played back open-loop when a predefined interaction event takes place.
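
A minimal sketch of the idea is given below; the spring law, the decaying-sinusoid transient and all parameters are hypothetical placeholders and do not represent the original implementation from [224, 225].

```python
import math

def contact_transient(t, amp=2.0, freq=300.0, tau=0.01):
    """Hypothetical decaying sinusoid used as a high-frequency contact
    transient; in practice this signal is recorded with accelerometers."""
    return amp * math.exp(-t / tau) * math.sin(2.0 * math.pi * freq * t)

def rendered_force(penetration, t_since_contact, k=500.0, t_max=0.05):
    """Superposes the quasi-static kinaesthetic response (spring law,
    k in N/m) with the open-loop transient triggered at the contact event."""
    f_kin = k * max(penetration, 0.0)
    in_contact = penetration > 0.0 and t_since_contact < t_max
    f_evt = contact_transient(t_since_contact) if in_contact else 0.0
    return f_kin + f_evt
```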

This concept has proven to be a valuable tool for the rendering of haptic interactions with virtual environments. Rendering quality is increased with comparatively small hardware effort in the form of additions to (existing) kinaesthetic user interfaces. Technically not an addition to an existing kinaesthetic system, but still based on the Event-Based Haptics approach, the VerroTouch system by Kuchenbecker et al. was developed as an addition to the DaVinci Surgical System. It adds tactile and auditory feedback based on vibrations measured at the end of the minimally invasive instrument attached to the robot [226]. These vibrations are processed and played back using vibratory motors attached to the DaVinci controls and additional loudspeakers.

The system shown in Fig. 2.38 is able to convey the properties of rough surfaces and contact events with manipulated objects. The augmented interaction was evaluated positively in a study with 11 surgeons [227]. Objective task metrics showed neither an improvement nor an impairment of the tested tasks.

Fig. 2.38
figure 38

Figure adapted from [226] © Springer Nature, all rights reserved

Integration of VerroTouch into the DaVinci Surgical System.

Example: Perceptual Deadband Coding

Perceptual Deadband Coding (PD) is a perception-oriented approach to minimize the amount of haptic data that has to be transmitted in real-time applications such as teleoperation [127, 228]. To achieve this data reduction, new data is only transmitted from the slave to the master side if the change compared to the preceding data point is greater than the \(\hookrightarrow \) JND. Perceptual Deadband Coding is illustrated for the one-dimensional case in Fig. 2.39, but can be extended easily to multi-dimensional so-called dead-zones [229].
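
A minimal sketch of the transmission logic for a scalar signal is shown below; modelling the JND as a constant Weber fraction is an assumption for illustration.

```python
def deadband_transmit(samples, weber_fraction=0.1):
    """Lossy compression with a perceptual deadband: a sample is only
    transmitted if it deviates from the last transmitted value by more
    than the assumed JND (here a constant Weber fraction of that value).
    Returns the (index, value) pairs that would actually be sent."""
    sent, last = [], None
    for i, value in enumerate(samples):
        if last is None or abs(value - last) > weber_fraction * abs(last):
            sent.append((i, value))
            last = value
    return sent

# Example: a slowly changing force signal; only three samples are sent.
print(deadband_transmit([1.00, 1.02, 1.05, 1.20, 1.22, 1.50, 1.51, 1.49]))
```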

Fig. 2.39
figure 39

Lossy compression based on the perceptual deadband [127], own illustration

Recommended Background Reading

 

[73]:

Attention, Perception, and Psychophysics, Volume 63, Issue 8, 2001. Special edition about the psychometric function and psychophysical procedures.

[230]:

Jones, L. & Tan, H. Z.: Application of Psychophysical Techniques to Haptic Research. in IEEE Transactions on Haptics, vol. 6, no. 3, pp. 268-284, July-Sept. 2013. https://doi.org/10.1109/TOH.2012.74.

An expert review and recommendation on psychophysical methods with the focus on human haptic interaction.

[66]:

Prins, N. & Kingdom, F. A. A.: Psychophysics: a Practical Introduction. Academic Press, Maryland Heights, MO, USA, 2010.

Psychophysics textbook with a good overview about modern psychometric procedures and the underlying statistics.

[61]:

Wickens, T. D.: Elementary Signal Detection Theory. Oxford University Press, Oxford, GB, 2002.

A modern and quite entertaining book about signal detection theory.