1 Research Disciplines

Haptics, in a non-scientific understanding, refers to the sense of touch and everything connected with it. On closer consideration, it becomes clear that touch always requires interaction: the perception of touch cannot take place without contact, and consequently not without something being touched or doing the touching. Following this basic concept, it is obvious that haptics requires interaction. This statement sounds simple, but for research and technical tasks it adds complexity to the subject, because, in contrast to vision and hearing, haptic interaction always has an effect on the touched object itself, and the classification of interactions varies depending on the physical properties of body and object. If one is further aware that the sense of touch is relevant to every mechanical part of the body that interacts with the environment, and in particular to every area covered with skin, each with different sensory capabilities, the challenges in this field become clear.

Consequently, with haptics research still growing, the field is restructured frequently. A snapshot of the core disciplines is given in Fig. 1.1. Whereas 20 years ago there were perhaps eight or ten haptic research areas, the diversification of research changed drastically in the last decade, due to an increased understanding of interdependencies but also to more specialization and to the specific needs of industry. One main direction can be found in the group of perception-based research, covering psychophysical and neuroscience-related topics. This field has a strong influence on all application-based research areas, which themselves require several components and subsystems and are used in different applications.

Fig. 1.1  Concept map of haptic disciplines (own visualization)

The topic of this book is the engineering of haptic devices. With regard to Fig. 1.1, we are therefore in the bluish device and yellow application areas. Of course, the book does not ignore the interlinked areas and gives the details required to understand the influences from these interfaces.

2 Some Broad Scope on Haptics

But what is haptics in the first place? A common and general definition is given as

Definition  Haptics   Haptics describes the sense of touch and movement and the (mechanical) interactions involving these.

but this will probably not suffice for the purpose of this book. This chapter will give some more detailed insight into the definition of haptics (Sect. 1.4) and will introduce four general classes of applications for haptic systems (Sect. 1.5) as the motivation for the design of haptic systems and, ultimately, for this book. Before that, we give a short summary of the philosophical and social aspects of this human sense (Sect. 1.3). These topics will not be addressed any further in this book, but should be kept in mind by every engineer working on haptics.

3 Philosophical and Social Aspects

An engineer tends to describe haptics primarily in terms of forces, elongations, frequencies, mechanical tensions and shear forces. This of course makes sense and is important for the technical design process. However, haptics starts earlier than that. Haptic perception ranges from minor interactions in everyday life, e.g. drinking from a glass or writing this text, to a means of social communication, e.g. shaking hands or giving someone a pat on the shoulder, and to very personal and private interpersonal experiences. Touch has a conscious, but also a very relevant unconscious component, as demonstrated e.g. by a study by Crusco et al. [1] showing that tips given to a waitress were on average 10% higher when the customer had been touched lightly. This effect is known as the Midas Touch and is surprisingly independent of gender and age on both sides. This section looks at the spectrum and influence of haptics on humans beyond technological descriptions. It is also a reminder to the development engineer to deal responsibly and consciously with the possibilities of outwitting the haptic sense.

3.1 Haptics as a Physical Being’s Boundary

Haptics is derived from the Greek term “haptios” and describes “something which can be touched”. In fact, awareness and understanding of the haptic sense have changed many times in the history of humanity. Aristotle puts the sense of touch in the last place when naming the five senses:

  1. sight

  2. hearing

  3. smell

  4. taste

  5. touch

Nevertheless, as early as 350 B.C., he attests to the high importance of this sense with regard to its indispensability [2]:

Some classes of animals have all the senses, some only certain of them, others only one, the most indispensable, touch.

The social estimation of the sense of touch has gone through all imaginable phases. Frequently it was afflicted with the stigma of squalor, since lust is transmitted by it [3]:

Sight differs from touch by its virginity, such as hearing differs from smell and taste: and in the same way their lust-sensation differs

It was also called the sense of excess [4]. In a general subdivision between lower and higher senses, touch was almost constantly placed in the lower class. In Western civilization the church once stigmatized this sense as forbidden due to the pleasure that can be gained by it. However, in the 18th century public opinion changed, and Kant is cited with the following statement [5]:

This sense is the only one with an immediate exterior perception; due to this it is the most important and the most instructive one, but also the roughest. Without this sensing organ we would not be able to grasp any physical shape, to whose perception the other two first-class senses (sight and hearing) have to be referred in order to generate some knowledge from experience.

Kant thus emphasizes the central function of the sense of touch. It is capable of teaching the spatial perception of our environment. Only touch enables us to feel and classify impressions collected with the help of other senses, to put them into context and to understand spatial concepts. Although stereoscopic vision and hearing develop early, the first-time interpretation of what we see and hear requires a connection between both independently perceived impressions and information about the distances between objects. This can only be provided by a sense that can bridge the space between a being and an object. Such a sense is the sense of touch. The skin, being a part of this sense, covers a human’s complete surface and defines his or her physical boundary, the physical being.

3.2 Formation of the Sense of Touch

As shown in the prior section, the sense of touch has numerous functions. Knowledge of these functions enables the engineer to formulate demands on the technical system. It is helpful to consider the whole range of purposes the haptic sense serves. At this point, however, we do not yet choose an approach based on measuring its characteristics, but observe the properties of the objects discriminated by it.

The sense of touch is not only specialized in the perception of the physical boundaries of the body, as said before, but also in the analysis of the immediate surroundings, including the objects contained in them and their properties. Human beings and their predecessors had to be able to discriminate, e.g., the structure of fruits and leaves by touch in order to identify their ripeness or whether they were edible or not, like a furry berry among smooth ones. The haptic sense enables us to identify a potentially harmful structure, such as a spiny seed, and to be careful when touching it, in order to obtain its content despite its dangerous needles.

For this reason, the sense of touch has been optimized for the perception and discrimination of surface properties such as roughness. Surface properties may range from smooth ceramic-like or lacquered surfaces with structural widths in the area of some \(\upmu \textrm{m}\), to somewhat structured surfaces like coated tables, and to rough surfaces like coarsely woven cord textiles with mesh apertures in the range of several millimeters. Humans have developed a very typical way of interacting with these surfaces, enabling them to draw conclusions based on the underlying perception mechanism. A human being moves his or her finger along the surface (Fig. 1.2), allowing shear forces to be coupled into the skin. The level of these shear forces depends on the quality of the frictional coupling between the object surface and the skin. It results from a combination of the tangential elasticity of the skin, the normal pre-load \(F_{\text {norm}}\) resulting from the touch, the velocity \(v_{\text {explr}}\) of the movement, and the coupling factor \(\mu \).
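As a rough first-order sketch (a simplification added here for illustration, not a model taken from the cited literature), the coupled shear force can be written as

\[
F_{\text {shear}} \approx \mu \left( F_{\text {norm}}, v_{\text {explr}}\right) \, F_{\text {norm}},
\]

where the effective coupling factor \(\mu \) lumps together the influence of the skin’s tangential elasticity, the pre-load and the exploration velocity.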

Fig. 1.2  Illustration of the interaction of movements, normal forces on the finger pad and frictional coupling

Everyone who has ever designed a technical frictional coupling mechanism knows that, without additional structures or adhesive materials, viscous friction between two surfaces can hardly reach a factor of \(\mu _r \ge 0.1\). Nevertheless, in order to couple shear forces more efficiently into the skin, nature has “invented” a special structure at the most important body part for touching and exploration: the fingerprint. The epidermal ridges couple shear forces efficiently into the skin, since the ridges transmit a bending moment into its upper layers. Additionally, these ridges allow form closure with structures of similar width, which means nothing other than an interlocking between the handled object and the hand’s skin. At first glance this is a surprising function of this structure; at second glance, it is simply a reminder that nature does not introduce any structure without a deeper purpose.

Two practical facts result from this knowledge: first, the understanding of shear-force coupling into the skin has come into the focus of current research [6] and has improved the design process of tactile devices. Second, this knowledge can be applied to improve the measuring accuracy of commercial force sensors by adding ridge-like structures [7].

Another aspect of the haptic sense, and probably an evolutionary advantage, is the ability to use tools. Certain mechanoreceptors in the skin (see Sect. 2.1 for more details) detect the high-frequency vibrations that occur when handling a (stiff) tool. Detection of these high-frequency vibrations allows one to identify different surface properties and to detect contact situations and collisions [8].

3.3 Touchable Art and Haptic Aesthetics

Especially in the 20th century, art has dealt with the sense of touch and played with its meaning. The furry cup (Fig. 1.3) drastically makes one aware of the significance of haptic texture for the perception of surfaces and surface structures. Whereas the general form of the cup remains visible and recognizable, the originally plain ceramic surface is covered with fur.

Fig. 1.3  Meret Oppenheim: furry cup, 1936 [9, 10]. Digital image ©2022, The Museum of Modern Art/Scala, Florence

In 1968, the “Tap and Touch Cinema” (Fig. 1.4) allowed visitors to touch Valie Export’s naked skin for 12 s through a box that was covered by a curtain at all times. According to the artist, this was the only valid approach to experiencing sexuality without the aspect of voyeurism [9]. These are just a few examples of how art and artists have played with the various aspects of haptic perception.

Fig. 1.4  Valie Export: TAPP und TASTKINO, 1968. B/w photography ©Valie Export, Bildrecht Wien, 2022, photo ©Werner Schulz, courtesy Valie Export, http://80.64.129.152:8080/share.cgi?ssid=0vdjJr7

As with virtual worlds and environments, haptic interaction also has artistic facets. In 2004, Ishii from the MIT Media Laboratory and Iwata from the University of Tsukuba demonstrated startling exhibits of “tangible user interfaces” based on bottles that are opened to “release” music.

Meanwhile, human-triggered touch has been extended to devices that touch back, with Marc Teyssier very actively exploring the limits of what is socially acceptable in the largely unexplored field between art and robotics (Fig. 1.5).

Fig. 1.5  MobiLimb project with a device touching back [11]. ©2022 Marc Teyssier, used with permission

Apart from the artistic aspect of such installations, recent research evaluates new interaction possibilities for \(\hookrightarrow \) Human-Computer-Interaction (HCI) based on such concepts:

  • In [12], picture frames are used as tangible objects to initiate a video call to relatives and friends, when placed on a defined space on a special table cloth.

  • With Touché, Disney Research presents a capacitive sensing principle to use almost every object as a touch input device [13]. It is intended to push the development of immersive computers that disappear in objects.

  • Even for everyday objects, touch-enhanced functions can be built in and demonstrated: the company Playtronica, for example, focuses on touch-enhancing everyday objects by translating capacitance into MIDI signals and synth music (Fig. 1.6).

Fig. 1.6  Playtronica products playtron and Touch ME with capacitive measurement and MIDI sound generation based on touch intensity. ©2022 Daria Malysheva, used with permission

In technical applications, the personal feeling of haptic aesthetics is a distinguishing factor. Car manufacturers work on objective quality schemes for the perceived quality of interfaces [14, 15] with the goal of creating a touchable brand identity; there are whole companies claiming to “make percepts measurable” [16]; designers provide toolkits to evaluate the characteristics of knobs and switches [17, 18]; and meanwhile even design packages are proposed and commercialized to evaluate typical vibrational feedback [19]. However, the underlying mechanisms of the assessment of haptic aesthetics are not fully understood. While the general approach of most studies is basically the same, using multidimensional scaling and regression algorithms to combine subjective assessments and objective measurements [20], details of the perceptual dimensions are the subject of ongoing research [21] and sophisticated data models [22].
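To illustrate this general analysis approach, the following minimal sketch embeds pairwise dissimilarity ratings of surface samples with multidimensional scaling and regresses an objective measurement onto the resulting perceptual space. It is an illustration only; the dissimilarity matrix, the roughness values and all variable names are invented and not taken from the cited studies.

# Minimal sketch: perceptual space from subjective dissimilarity ratings,
# followed by a regression of an objective surface measurement onto it.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.manifold import MDS

# Hypothetical mean pairwise dissimilarity ratings for four surface samples
# (symmetric, zero diagonal), e.g. averaged over all participants.
dissimilarity = np.array([
    [0.0, 2.1, 5.0, 6.2],
    [2.1, 0.0, 4.2, 5.5],
    [5.0, 4.2, 0.0, 1.8],
    [6.2, 5.5, 1.8, 0.0],
])

# Hypothetical objective measurement per sample, e.g. roughness Ra in micrometers.
roughness = np.array([0.4, 0.9, 7.5, 9.1])

# Embed the subjective ratings in a two-dimensional perceptual space.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
perceptual_space = mds.fit_transform(dissimilarity)

# Regress the physical measurement onto the perceptual coordinates to check
# how well it aligns with the perceptual dimensions.
regression = LinearRegression().fit(perceptual_space, roughness)
print("perceptual coordinates:\n", perceptual_space)
print("R^2 of roughness vs. perceptual space:", regression.score(perceptual_space, roughness))

Real studies use far more samples and participants; the sketch only shows how subjective and objective data are combined in one model.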

Carbon and Jakesch published a comprehensive approach based on object properties and the assessment of familiarities [23]. This topic still remains a fascinating field of research for interdisciplinary teams from engineering and psychology and is applied to regular product design [24].

4 Technical Definitions of Haptics

To use the haptic sense in a technical manner, some agreements about terms and concepts have to be made. This section deals with some general definitions and classifications of haptic interactions and haptic perception and is the basis for the following Chap. 2, which will dig deeper into topics of perception and interaction.

4.1 Definitions of Haptic Interactions

The haptic system empowers humans to interact with real or virtual environments by means of mechanical, sensory, motor and cognitive abilities [25]. An interaction consists of one or more operations, which can generally be classified into motion control and perception [26]. The operations in these classes are called primitives, since they cannot be divided or classified any further.

The perception class includes the primitives detection, discrimination, identification and scaling of haptic information [27]. The analysis of these primitives is conducted by the scientific discipline called \(\hookrightarrow \) psychophysics. To further describe the primitives of the perception class, the term \(\hookrightarrow \) stimulus has to be defined:

Definition   Stimulus (pl. stimuli)   Excitation or signal that is used in a psychophysical procedure. It is normally denoted with the symbol \(\varPhi \). The term is also used in other contexts, when a (haptic) signal without further specification is presented to a user.

Typical stimuli in haptics are forces, vibrations, stiffnesses, or objects with specific properties. With this definition, we can have a closer look at the perception primitives, since each single primitive can only be applied to certain haptic stimuli, as explained below.  

Detection:

The detection primitive describes how the presence of a stimulus is detected by a human or, respectively, a user. Depending on the interaction conditions, stimuli may or may not be detected. This depends not only on the sensory organs involved (see Sect. 2.1) but also on the neural processing. Only if a stimulus is detected can the other perception primitives be applied.

Discrimination:

If more than one stimulus is present and detected, the discrimination primitive describes how information is perceived that is contained in different properties of the signal (such as the frequency or amplitude of a vibration) or of an object (such as hardness, texture or mass).

Identification:

Like the discrimination primitive, the identification primitive is based on more than one present and detected stimulus. These stimuli are, however, not compared to each other, but to practical or abstract knowledge, allowing a classification of the information contained in the stimuli. An example of such a task is the identification of geometric properties of objects, such as size and global form.

Scaling:

Scaling is the fourth primitive of perception as generally described by psychophysicists. This primitive describes the behavior of scales when properties of stimuli and objects are rated [28]. While scaling is only of secondary importance for the description of interactions, it can provide useful information about signal magnitudes in the design process.

 

The motor control class can be divided into different operations as well. In this class, the primitives travel, selection and modification exist [29]. They can be explained better if they are linked to general interaction tasks [29, 30]:

Travel:

The movement or travel of limbs, the whole body or virtual substitutes (avatars) is used to search for or reach a destination or an object, to explore (unknown) environments or to change one’s own position. Changing a movement already in progress is included in this primitive.

Selection:

Especially in virtual environments, marking and/or selecting an object or a function is a vital primitive. It is what allows direct interaction in these environments in the first place.

Modification:

The modification primitive is based on the selection of a function or an object. It describes a change in the orientation, position or other properties of an object, as well as the combination of several objects into a single one.

 

When using motor control primitives, not only the operation itself but also the aim of the operation has to be considered for an accurate description of an interaction. If, for example, a computer is operated with a mouse as an input device and an icon on the screen is selected, this interaction could be described as a travel primitive or as a selection primitive. A closer look will probably reveal that the travel primitive is used to reach an object on the screen, which is then selected in a following step. If this interaction is to be executed with a new kind of haptic device, the travel primitive is probably considered subordinate to the selection primitive.

Based on these two classes of interaction primitives, Samur introduces a \(\hookrightarrow \) taxonomy of haptic interaction [31]. It is given in Fig. 1.7 and allows the classification of haptic interactions. Such a classification is useful for the design of new haptic systems: requirements can be derived more easily (see Chap. 5), analogies can be identified and used in the design of system components, and the evaluation is simplified (see Chap. 13).

Fig. 1.7  Taxonomy of haptic interaction. Figure based on [27, 31]

Besides the analysis of haptic interaction based on interaction primitives, some more psychophysically motivated approaches exist:

  • Lederman and Klatzky propose a classification of haptic interaction primitives into two operation classes: Identification (the What-System) and Localization (the Where-System) [32].

  • Hollins proposes a distinction of primitives based on the spatial and temporal resolution of perception (and the combinations thereof) on the one side and a class of “haptic” interactions on the other side [33]. The latter corresponds roughly to the motion control primitives mentioned above.

Applying the taxonomy of haptic interactions given in Fig. 1.7 to the development of task-specific haptic systems seems much more straightforward than applying the approaches by Lederman and Klatzky and by Hollins listed above. Therefore, the latter are not pursued any further in this book.

4.2 Taxonomy of Haptic Perception

Up to now, one of the main taxonomies in the haptic literature has not been addressed: the classification based on \(\hookrightarrow \) kinaesthetic and \(\hookrightarrow \) tactile perception. It is physiologically based and defines perception solely by the location of the sensory receptors. It is defined in the standard ISO 9241-910 [30] and given in Fig. 1.8.

Fig. 1.8  Taxonomy of haptic perception as defined in [30]

With this definition, tactile perception is based on all \(\hookrightarrow \) cutaneous receptors. These include not only mechanical receptors, but also receptors for temperature, chemicals (i.e. taste) and pain. Compared to the perception of temperature and pain, mechanical interaction is on the one hand much more feasible for task-specific haptic systems in terms of usability and generality; on the other hand, it is technically much more demanding because of the complexity of the mechanoreceptors and the inherent dynamics. Therefore, this book will focus on mechanical perception and interaction.

For processes leading to the perception of pain, the authors refer to specialized literature [34] dealing with that topic, since an application of pain stimuli in a haptic system for everyday use seems unlikely. The perception of temperature and possible applications are given, for example, in [35, 36]. Whereas some technical applications of thermal displays are known [37,38,39], these seem to be inferior to mechanical interaction in terms of information transfer and dynamics. Therefore, temperature is primarily considered as an influencing factor on mechanical perception capabilities and is discussed in more detail in Sect. 2.1.2.

With this confinement to mechanical stimuli, we can define kinaesthetic and tactile perception as follows:

Definition   kinaesthetic   Kinaesthetic perception describes the perception of the operational state of the human locomotor system, particularly joint positions, limb alignment, body orientation and muscle tension. For kinaesthetic perception, there are dedicated sensory receptors in muscles, tendons and joints, as detailed in Sect. 2.1. Regarding the taxonomy of haptic interactions, kinaesthetic sensing is primarily involved in the motion control primitives, since signals from kinaesthetic receptors are needed in the biological control loop for the positioning of limbs.

Definition   tactile   Tactile perception describes the perception based on sensory receptors located in the human skin. Compared to kinaesthetic receptors, they exhibit much larger dynamics and are primarily involved in the perception primitives of haptic interaction.

While the terms tactile and kinaesthetic were originally strictly defined by the location and the functions of the sensory receptors, they have recently been used in a more general way. While the root of the word kinesthesia is linked to the description of movement, the term kinaesthetic is nowadays also used to describe static conditions [40]. Sometimes, kinaesthetic is only used for the perception of properties of limbs, while the term proprioception is used for properties regarding the whole body [41]. This differentiation is neglected in the remainder of this book because of its minor technical importance. The term tactile often describes any kind of sensor or actuator with a spatial resolution, regardless of whether it is used in an application addressing tactile perception as defined above. While these examples are only of minor importance for the design of haptic systems, the following usage of the terms is an important adaptation of the definitions: primarily based on the dynamic properties of tactile and kinaesthetic perception, the definitions are nowadays extended to haptic interactions in general. The reader may note that the following description is not accurate in terms of the temporal sequence of the cited works, but focuses on the works with relevant contributions to the present use of the terms kinaesthetic and tactile.

Based on the works of Shimoga, the dynamics of kinaesthetic perception are set equal to the motion capabilities of the locomotor system [42]. The dynamics of tactile perception are bounded at about \(1\,\ldots \,2 \,\textrm{kHz}\) for practical reasons. Higher frequencies can be perceived [43, 44], but it is questionable whether they contribute significantly to perception [45, p. 3]. As further explained in Sect. 2.4.3, this limitation is technically reasonable and necessary for the design of the electromechanical parts of haptic systems. Figure 1.9 shows this dynamic view of haptic interaction based on characteristic values from [44, 46, 47].

Fig. 1.9  Kinaesthetic and tactile haptic interaction. Figure based on data from [44, 46, 47]

To extend this dynamic model of perception to a more general definition of interactions, Daniel and McAree propose a bidirectional, asymmetric model with a low-frequency (<30 Hz) channel for the exchange of energy and a high-frequency channel for the exchange of information [48], with general implications for the design of haptic interfaces. This mapping based on dynamic properties is meaningful to a greater extent, since users can be considered as mechanically passive systems for frequencies above the dynamics of the active movement capabilities of the locomotor system [49]. This will be explained in more detail in Chap. 3. Altogether, these aspects (dynamics of perception and movement capabilities, exchange paths of energy and information, and the modelling of the user as an active and passive load on a system) lead to the nowadays widely accepted model partitioning haptic interaction into low-frequency kinaesthetic interaction and high-frequency tactile perception.
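To make this frequency partition tangible, the following minimal sketch splits a sampled interaction-force signal into a low-frequency and a high-frequency channel at the roughly 30 Hz crossover of the model cited above. The sampling rate, the test signal and the filter order are invented for illustration and are not design recommendations.

# Minimal sketch: split a force signal into a low-frequency ("kinaesthetic",
# energy-carrying) and a high-frequency ("tactile", information-carrying) channel.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 2000.0                          # assumed sampling rate in Hz
t = np.arange(0.0, 1.0, 1.0 / fs)

# Invented test signal: slow grasp-force variation plus a 250 Hz texture vibration.
force = 2.0 * np.sin(2 * np.pi * 1.5 * t) + 0.05 * np.sin(2 * np.pi * 250.0 * t)

# Fourth-order Butterworth low-pass at the ~30 Hz crossover; the complement
# approximates the high-frequency channel.
b, a = butter(4, 30.0 / (fs / 2.0), btype="low")
kinaesthetic_channel = filtfilt(b, a, force)      # low-frequency energy exchange
tactile_channel = force - kinaesthetic_channel    # high-frequency information exchange

print("peak kinaesthetic component:", np.max(np.abs(kinaesthetic_channel)))
print("peak tactile component:", np.max(np.abs(tactile_channel)))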

Both taxonomies, that of haptic interaction in Fig. 1.7 and that of haptic perception in Fig. 1.8, extended in Fig. 1.9, are relevant sources of standard vocabulary in haptic system design. Such a vocabulary simplifies and standardizes descriptions of haptic interactions, which are necessary to describe the intended functions of a task-specific haptic system and will be covered in more detail in Sect. 5.2. Further definitions and concepts of haptic interaction and perception are given in Chap. 2. In the next part of this chapter, possible applications for haptic systems that become part of the human haptic interaction with systems and environments are presented.

5 Application Areas of Haptic Systems

Haptic systems can be found in a multitude of applications. In this section, four general application areas are identified, and the benefits and technical challenges of haptic systems in these areas are given. In Sect. 2.3, these application areas are combined with a general model of human-system-environment interaction, leading to an interaction-based definition of basic system structures.

5.1 Telepresence, Teleaction and Assistive Systems

Have you ever thought about touching a lion in a zoo’s cage?

With a \(\hookrightarrow \) telepresence and teleaction (TPTA)-system you could do just that without exposing yourself to risks, since such systems provide the possibility of interacting mechanically with remote environments (we neglect the case of the lion feeling disturbed by the fondling...).

In a strict definition of TPTA-systems there is no direct mechanical coupling between operator and manipulated environment, but only a coupling via the TPTA-system. This is what makes the transmission of haptic signals possible in the first place, since the mechanical interaction is converted into other domains (mainly electrical) and can thus be transmitted more easily. TPTA-systems are often equipped with additional multimodal features, mainly a one-directional visual channel displaying the environment to the operator.

Examples include systems for underwater assembly, where visual cues are useless because of dispersed particles in the water [50], scaled support of micro- and nano-positioning [51, 52], and surgical applications [53, 54]. The use of TPTA-systems shortens task completion times and reduces errors and handling forces compared to systems without haptic feedback [55]. In surgical applications, new combinations of previously incompatible techniques become possible, for example palpation in minimally invasive surgery. Studies also show an increase in safety for patients [56]. In recent years, especially the strong increase in bandwidth of networked applications has been driving the imagination of what could be done; Antonakoglou et al. [57] give a very good overview in the context of the availability of 5G. But apart from aerial or space applications, the input device remains the focus for efficient operation [58].

Most known TPTA-systems are used for research applications. Figure 1.10 shows an approach by Quanser, supplying a haptic interface and a robot manipulator arm. Based on this combination, versatile bilateral teleoperation scenarios can be designed, for example neuroArm, a teleoperation system for neurological interventions [59]. Example interventions include the removal of brain tumors, which requires high positioning accuracy and real-time integration of \(\hookrightarrow \) Magnetic Resonance Imaging (MRI) images.

Fig. 1.10  Versatile teleoperation by Quanser: HD\(^2\) haptic interface with 7 DoF of haptic feedback and Denso Open Architecture robot with 6 DoF. Image courtesy of Quanser, Markham, Ontario, Canada, used with permission

The development of TPTA-systems is among the technically most challenging. This is caused by the unknown properties of the environment, which influence system stability, by the high accuracy of sensors and actuators required to present artifact-free haptic impressions, and by the data transmission over long distances with the additional aspects of packetized transmission, packet losses and latency.

A special type of TPTA-system are the so-called \(\hookrightarrow \) comanipulators, which are mainly used in medical applications [53]. Besides the mechanical interaction via the TPTA-system, additional manipulation of the environment (and feedback) can be exerted via parts of the system (a detailed definition based on the description of the interaction can be found in Sect. 2.3). Examples of such comanipulators are INKOMAN and HapCath, developed at the Institute for Electromechanical Design.

The HapCath system, which adds haptic feedback to cardiovascular interventions, is presented in detail as an example in Sect. 14.2. Figure 1.11 displays the INKOMAN instrument, which is the result of the joint research project SOMIT-FUSION funded by the German Ministry of Education and Research. It is an extension of a laparoscopic instrument with a parallel kinematic structure [60] that provides additional \(\hookrightarrow \) degrees of freedom (DOF) of a universal tool platform [61]. This allows minimally invasive interventions in previously unreachable regions of the liver. By integrating a multi-component force sensor in the tool platform [62], interaction forces between instrument and liver can be displayed to the user [63]. This permits techniques like palpation to identify vessels or cancerous tissue. Since the device retains the general form of a laparoscopic instrument, additional interaction forces can be exerted by the surgeon by moving the complete instrument; it is therefore classified as a comanipulation system.

Fig. 1.11  INKOMAN, an intracorporal manipulator for minimally invasive abdominal interventions with increased flexibility. The figure shows the handheld instrument with a haptic display based on a delta kinematic structure; the parallel kinematic structure used to move the tool platform is driven by ultrasonic travelling wave motors. Figure adapted from [63]

TPTA-systems are mainly the focus of research activities, probably because there are only small markets with high potential for this kind of system. An exception are medical applications, where non-directly coupled instruments promise higher safety and more efficient usage, for example by avoiding collisions between different instruments or by lowering contact and grip forces [56, 64]. Automated procedures like knot tying can also be accelerated and conducted more reliably [65]. However, the distinction between a haptic TPTA-system and a robotic system for medical use is quite a thin line: the aforementioned functions do not require haptic feedback. This explains the large number of existing medical robotic systems in research and industry [66, 67], dominated by the well-known Da Vinci by Intuitive Surgical Operations, Inc. This system was developed for urological and gynecological interventions and incorporates a handling console with a three-dimensional view of the operation area and a considerable number of instruments, which are directed by the surgeon at the console and actuated with cable drives [68]. No haptic feedback is preinstalled in this system, although there are promising extensions available, as discussed in Sect. 2.4.4. Just recently, the system was extended to single-port entry, which further reduces the lesions caused by the intervention and allows a quick exchange of the tools used during the procedure (Fig. 1.12).

Fig. 1.12  Da Vinci SP surgical system for single-port access. ©2022 Intuitive Surgical Operations, Inc., used with permission

For consumer applications, Holland Haptics sold a product called Frebble, intended to convey the feeling of holding someone’s hand over the internet. It was an interesting hardware concept as well as a low-cost teleoperation device.

Practical magnetic resonance imaging studies of the neural control of the hand have also revealed significant progress, but harsh MRI environments are a challenge for devices capable of delivering a large variety of stimuli. One such work presented an fMRI-compatible haptic interface to investigate the neural mechanisms of precision grasp control. The interface is placed at the scanner bore and is controlled through a shielded electromagnetic actuation system located at the end of the scanner bed, using a high-stiffness cable. Performance evaluation showed renderable forces of up to 94 N, a structural stiffness of 3.3 N/mm and a position control bandwidth of at least 19 Hz.

In this system, two closed-loop cable transmissions actuate the two DOF of each finger. The transmission consists of aluminum profiles that hold redirection modules, and the cables pass through a length and tension adjustment mechanism. The guiding pulleys are combined with low-friction polymer/glass ball bearings and are fixed on an aluminum bar rigidly attached to the side of the scanner bed. Fixing the cables to the capstan prevents slippage. Due to the transmission friction, cable wear is significant, so the cable should be easily exchangeable in case of a breakdown during an fMRI study.

5.2 Virtual Environments

The second main application area for haptic systems is interaction with virtual environments. Since this is quite a large field of applications, we will take a closer look at different areas where interaction with generated situations is used to a wider extent.

Medical Training:

A large number of systems is designed to provide medical training without jeopardizing a real patient [69]. In addition to haptic feedback, these systems generally also provide visual and acoustic feedback to generate a realistic impression of the simulated procedure. There are systems to train the diagnosis of joint lesions [70] and simulators for endoscopic, laparoscopic and intravascular interventions [31]. Figure 1.13 shows an example of such a surgical simulator. Surgeons trained on simulators show better task performance in several studies [71, 72]. In addition, simulators can be used very early in medical training, since they do not put patients at risk and have higher availability.

Industrial Design:

In industrial design applications, virtual environments are used to simulate assembly operations and for the subjective evaluation of prototypes. Although there are far fewer applications than in medical training, this area pushes technology development: some requirements can only be met with new concepts such as admittance systems and form displays. One of these is the Haptic Strip, which consists of a bendable and twistable surface that can additionally be positioned in 6 DoF in space [73]. It is shown in Fig. 1.14 and can be used to display large-scale forms of new designs without having to manufacture a prototype.

Multimodal Information Displays:

Since the haptic sense evolved to analyze objects and the environment, similar applications with a high demand for intuitive access to information can be found in the literature. Haptic systems are used to display large amounts of information in biology and chemistry [74, Chap. 9] and also serve as a means for the synthesis of complex molecules [75]. In this application, the human ability to detect patterns (in visual representations) is used for coarse positioning of the synthesis partners, whereas micro-positioning is supported by a haptic representation of the intermolecular forces.

Another example of the multimodal display of information was recently presented by Microsoft Research [76]. The TouchMover is an actuated screen with haptic feedback that can be used to display object and material properties or to intuitively access volumetric data such as \(\hookrightarrow \) MRI scans. Figure 1.15 shows this application of the system. Annotations are marked visually and haptically with a detent, allowing for intuitive access and collaboration.

Consumer Electronics:

For the integration of haptic feedback in computer games, Novint Technologies, Inc. presented the Falcon haptic interface in 2006. It is based on a delta parallel kinematic structure and distinguishes itself through a very competitive price tag of around $500. This device is also used in several research projects, for example [77], because of its low price and its support by several \(\hookrightarrow \) application programming interfaces (APIs). Looking back from the perspective of the 2020s, complex haptically enhanced input devices did not perform well in consumer electronics. The main area where they still persist is gamepad and game-controller applications, reduced to pure vibrotactile feedback; Sony’s DualSense technology recently increased the complexity again and combined a vibration actuator with a motor-actuated, adaptable trigger. The future will show whether this is a revival of kinaesthetic feedback in consumer electronics.

But there are other areas as well. To provide a more intense gaming experience, haptic systems conveying low-frequency acoustic signals exist, such as the ButtKicker by The Guitammer Company (Fig. 1.16). The system delivers low-frequency signals that increase immersion. To allow the touch of fabric over the internet, the Haptex project developed rendering algorithms as well as interface hardware [78].

 

Fig. 1.13  Laparoscopic simulator LAP Mentor III. The system was designed to simulate interventions in the abdomen. Picture courtesy of Simbionix USA, Cleveland, OH, USA, used with permission

Fig. 1.14  The Haptic Strip system. The strip is mounted on two HapticMaster admittance-type interfaces; capacitive sensors on the strip surface sense the user’s touch. Figure based on [73] ©Springer Nature, all rights reserved

Fig. 1.15  TouchMover with a user exploring MRI data. Picture courtesy of Microsoft Research, Redmond, WA, USA, used with permission

Fig. 1.16  Electrodynamic actuator ButtKicker for generating low-frequency oscillations on a gaming seat. ©2022 The Guitammer Company, used with permission

Compared to the design of TPTA-systems, the development of haptic interfaces for interaction with virtual environments seems slightly less complex, since more knowledge about the interaction environment is available during the design process. However, new aspects such as the derivation and allocation of the environment data arise with these applications. Because of the wider distribution of such systems, cost efficiency also has to be taken into account.

5.3 Non-invasive Medical Applications

Based on specific values of haptic perception, certain illnesses and dysfunctions can be diagnosed. Certain types of eating disorders [79, 80] and diabetic neuropathy [38] are accompanied by diminished haptic perception capabilities. They can therefore be diagnosed by measuring perception or motor exertion parameters and comparing them with the population mean. Besides diagnosis, haptic perception parameters can also be used as a progress indicator in stroke [81] and limb [82] rehabilitation.

For these purposes, cost-efficient systems with robust and efficient measurement protocols are needed. Because feedback from the user can be obtained by any means, development is easier than that of TPTA- or VR-systems. Such systems are the focus of several research groups, but up to now there is no system for comprehensive use on the market.

5.4 Communication

The fourth and, by numbers, largest application area of haptic systems is basic communication. The most prominent example is probably on your desk or in your pocket: the vibration function of your phone. Compared to communication based on visual and acoustic signals, haptics offers the opportunity to convey information discreetly and with spatial resolution. Communication via the haptic sense tends to be very intuitive, since feedback arises at the point the user is interacting with. A simple example is a switch that gives haptic feedback when pressed.

Therefore, haptics is an attractive communication channel in demanding environments, for example when driving a car. Several studies show that haptic communication tends to distract users less from critical operations than the use of other channels like vision or audition [83, 84]. Applications include assistive systems for navigation in military contexts [85]; a practical example of an adaptive haptic user interface for automotive use is given in Sect. 14.1. With the increasing number of steer-by-wire applications and the vision of autonomously driving vehicles, the haptic channel has been identified as a possibility to raise the driver’s awareness in potentially dangerous situations, as investigated in [86].

More recently, the increasing use of consumer electronics with touch screens has triggered a demand for technologies to add haptic feedback, intended to facilitate use without recurring visual status inspection. Solutions for these applications include quite a number of different actuation principles, which are the focus of Chap. 9.

Another application area are tactile interfaces for the blind and visually impaired [87, 88]. Besides displaying Braille characters, tactile interfaces offer navigation support (see, for example, the HaptiMap project providing toolkits for standard mobile terminals [89], or tactile You-Are-Here maps) or interaction with graphical interfaces [90, 91]. Newer studies even show advantages of vibrotactile actuation for the finger rehabilitation of stroke patients [92]. Figure 1.17 gives some examples of haptic systems used for communication applications.

Fig. 1.17  Components and systems for communication via the haptic sense. a Exciter for touchpads and mobile devices (Grewus Exciter EXR4403L-01A). b Hyperbraille system for displaying graphic information to visually impaired users, image courtesy of metec AG, Stuttgart, Germany. c Lormer system as a machine-human interface conveying text information using the Lorm alphabet on the palm and hand of the user, image courtesy of Thomas Rupp. d Tactile Torso Display, a vest for displaying flight information on the pilot's torso, image courtesy of TNO, Soesterberg, The Netherlands. All images used with permission

Another type of haptic interface is the shape-changing interface, which communicates information by altering its form. One use of this haptic interface is navigation assistance: by changing its shape, it guides the user towards a target point (Fig. 1.18). This change is felt via the fingers by visually or hearing impaired, deafblind, and sighted pedestrians.

The shape-changing device implements navigation guidance via a bi-directional expanding mechanism, in which two similar parts move away from the central section of the device. This shape change generates a sensation of variable volume. Inside the system, a single motor provides the rotational movement, and a rack and pinion converts it into the translational movement. The top and bottom faces are designed so that the device rests easily on the palm without pinching the user’s skin.

Fig. 1.18  Different shapes of the haptic interface for sending different commands [93]. Figures by Ad Spiers, used with permission

Besides the analysis of energy-efficient actuation principles for mobile usage, scientific research in this area addresses the design of haptic icons for information transfer. For these icons, sometimes also called tactons, hapticons or tactile icons, the influence of rhythm, signal form, frequency and localization is investigated [94, 95]. Up to now, information transfer rates of \(2 \ldots 12\) bit per second have been reported [96, 97], although the latter require a special haptic interface called the Tactuator, designed for communication applications [98]. The exact bandwidth is still unclear; one application-related study by Seo and Choi [99] reported 3.7 bit.
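Such transfer rates are typically estimated from a stimulus-response confusion matrix of an identification experiment. The following minimal sketch computes the common maximum-likelihood estimate of the transmitted information; the confusion matrix is invented and not taken from the cited studies, and a rate in bit per second follows only after multiplying by the presentation rate of the icons.

# Minimal sketch: estimate of the information transfer from a confusion matrix.
import numpy as np

# Hypothetical confusion matrix for four haptic icons:
# rows = presented icon, columns = identified icon, entries = trial counts.
confusion = np.array([
    [18, 2, 0, 0],
    [3, 15, 2, 0],
    [0, 2, 16, 2],
    [0, 0, 3, 17],
])

n = confusion.sum()
row = confusion.sum(axis=1, keepdims=True)   # trials per presented icon
col = confusion.sum(axis=0, keepdims=True)   # trials per identified icon

# Maximum-likelihood estimate of the transmitted information in bit per icon.
p = confusion / n
with np.errstate(divide="ignore", invalid="ignore"):
    terms = p * np.log2(p * n * n / (row * col))
it_estimate = np.nansum(terms)               # empty cells contribute nothing

print(f"estimated information transfer: {it_estimate:.2f} bit per icon")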

5.5 Completing the Picture

For completeness, passive systems like computer keyboards, trackballs and mice are also part of this application area, since they convey information given in the form of a motion control operation to a (computer) system. Although some kind of haptic feedback exists, it does not depend on the interaction, but solely on the physical characteristics of the system, such as inertia, damping or friction.

Another area inspired by haptic research, and sometimes even used in haptic telepresence and telemanipulation scenarios, is that of robotic hands or limbs equipped with perception-inspired sensors. The whole area of tactile sensors was and is part of haptic research and is referred to in Chap. 10. Its main and most fascinating application domain, however, is robotics, especially when it comes to bionically inspired systems [100]. A preliminary summit has been reached with the micromechanical design of a fully dexterous robotic hand combined with high-end capacitive pressure sensors (Fig. 1.19). But there is more to come, and it is not even limited to humanoid shapes.

Fig. 1.19  Fully actuated robotic hand Shadow Dexterous Hand by Shadow Robot Company with integrated BioTacs by SynTouch, allowing manipulation with direct contact force and direction measurement for each fingertip. ©2022 Shadow Robot Company, used with permission

5.6 Why Use a Haptic System?

The reasons one might want to use a haptic system are quite numerous: perhaps you want to improve task performance or lower the error rate in a manipulation scenario, address a previously unused sensory channel to convey additional information, or gain an advantage over a competitor in an innovation-driven market. This book will not answer the question of whether haptics is able to fulfil the wishes and intentions connected with these reasons, but will focus on the design of a specific haptic system for the intended application.

Although there are many guidelines on how to implement haptic and multimodal feedback for optimal task performance (they will be addressed in Sect. 5.1.2), there are only limited sources on how to decide whether haptic feedback is usable for an application. Acker provides some criteria for telepresence technologies in industrial applications [51]; Jones gives guidelines on the usage of tactile systems [101].

6 Conclusions

Technical systems addressing the haptic sense cover a wide range of applications. Since this book focuses on the design process of task-specific haptic interfaces, the following chapters will first address the deeper analysis of haptic interaction in Chap. 2 and the role of the user in a haptic system in Chap. 3, before a detailed analysis of the development and the structure of haptic systems is presented in Chaps. 4 and 6. This provides the basis for the second part of the book, which deals with the actual design of a task-specific haptic system.

Recommended Background Reading

  • [102]    Papetti, S. & Saitis, C.: Musical Haptics. Springer Nature, 2018

    An inspirational work between art and technology on the relevance of haptics for musical instruments.

  • [23]       Carbon, C.-C. & Jakesch, M.: A model for haptic aesthetic processing and its implications for design. Proceedings of the IEEE, 101(9), pp. 2123–2133, 2013.

    General model about the development of haptic aesthetics and the implications for the design of products.

  • [4]        Grunwald, M.: Human Haptic Perception: Basics and Applications.

    Birkhäuser, Basel, CH, 2008.

    General collection about the haptic sense with chapters about theory and history of haptics, neuro-physiological basics and psychological aspects of haptics.