1 Introduction

In recent years, low-cost hand-tracking devices have attracted the attention of many researchers seeking to make interaction with applications that emulate real procedures more natural, for example in surgical training and in the design of custom-fit products and clothing [1, 15]. The main idea is to find alternative ways of recreating professional virtual environments, so that end-users can do their work without particular IT skills. In fact, hand-tracking devices become very interesting when we try to emulate traditional workflows that are mainly carried out with the hands/fingers or with objects held in hand.

Among the commercial hand-tracking devices, we consider the Leap Motion device [14] the best choice for our research activity. The underlying idea is to include this interaction style within a knowledge-based CAD system, named Socket Modelling Assistant (SMA) [3], specifically developed to design lower-limb sockets. SMA creates the virtual model of a prosthetic socket according to the patient's anthropometric measurements and his/her digital model, and emulates the traditional manufacturing process used to create the socket shape, which is the most critical part of the whole prosthesis. During the traditional process, the technician continuously uses his/her hands to shape the socket. Initially, s/he evaluates the amputee and creates a negative cast by manipulating plaster patches directly on the patient's residual limb. Then, s/he produces the positive model, which is manually modified by adding and removing chalk in specific zones according to the stump's measurements and morphology (Fig. 1).

Fig. 1. Some traditional steps of the socket manufacturing process.

Our aim is to emulate these manual operations directly, using both hand-tracking and haptic devices. In this paper, we focus on the development of the software interface that provides a Natural User Interface (NUI) for hand-tracking devices. To this end, we have developed a NUI, mainly based on the Leap Motion device, to interact with the 3D models of the residual limb and the socket. Much research has been done to study and improve the quality of this type of NUI; in particular, we paid attention to ergonomics and ease of use, which are the most important features for a good user experience with augmented interaction.

2 Scientific Background

Virtual reality is the emulation of an environment in which a human being can perform a series of actions to interact with 3D objects. We focus our attention on virtual reality that recreates sensory experience through the emulation of the tactile sense [12]. This emulation is possible using computer devices that are now available at low cost; we can therefore exploit them in different research works to understand their real potential in real contexts, such as lower-limb prosthesis design and clothing [10]. In this section, we describe different IT devices used to recreate virtual reality, as well as issues related to the software development of natural user interfaces for hand-tracking devices.

2.1 Haptic Devices and Hand-Tracking Devices

Emulation of the tactile sense involves two types of devices: haptic and hand-tracking devices. Haptic devices provide force feedback to emulate the sense of touch according to the shapes and materials associated with the 3D objects in a virtual environment. Several haptic devices exist, such as single contact-point haptic devices with different degrees of freedom and haptic gloves. Among the available solutions, we have exploited both a low-cost haptic device (i.e., the Novint Falcon) and an in-house developed haptic mouse to emulate some operations during socket design with SMA [9].

Hand-tracking devices detect the motion of hands/fingers and of objects held in hand [11, 17, 21]. Several hand-tracking devices are on the market, among which the Leap Motion device, the Intel Gesture Camera, the Duo3D and the Microsoft Kinect 2.0 (Fig. 2).

Fig. 2. Low-cost hand-tracking devices: Leap Motion device (a), Intel Gesture Camera (b), Duo3D device (c) and Microsoft Kinect v2.

The Leap Motion device exploits two IR cameras to determine the position and orientation of hands/fingers with high precision. Tracking is very accurate and can be calibrated to map fingertip positions onto the screen. The Leap Motion device has been used in several research works exploring new types of interaction with applications [4, 11]. Furthermore, it can be combined with the Oculus Rift to increase the quality of the user experience in a virtual system. Figure 3 shows how the Leap Motion device is used with the Oculus Rift; this solution tracks hands/fingers following the natural motion of the head.

Fig. 3. Leap Motion device and Oculus Rift.

The Intel Gesture Camera includes an RGB camera, a depth camera and microphones. It detects simple gestures, such as waving, swiping and circling with the hand, and also supports face analysis and speech recognition.

The Duo3D is very small and ultra-compact, an ideal solution for mobile projects. It offers several technologies on its board: an accelerometer, a gyroscope, a temperature sensor and two IR cameras.

The Microsoft Kinect v2 offers an HD RGB camera and a powerful IR sensor. A large field of view allows this sensor to track objects and people close to the cameras. However, it can only track simple gestures, such as open and closed hands.

All these devices come with software development kits (SDKs), which permit the creation of software interfaces with other applications. In this research work, we focus on the Leap Motion device and its SDK because it allows us to simply define a set of gestures in addition to basic hand/finger tracking.

2.2 Natural User Interface

NUI is an emerging concept in Human-Computer Interaction that refers to an interface that becomes invisible to its user through successive learned interactions based on natural human behavior. As stated in [16], "The word natural is used because most computer interfaces use artificial control devices whose operation has to be learned. A NUI relies on a user being able to carry out relatively natural motions, movements or gestures that they quickly discover control the computer application or manipulate the digital content". In other words, a well-designed NUI has to make the user experience simple and comfortable. Preliminary research works have been carried out to understand, and consequently solve, issues related to NUIs for hand-tracking devices [7, 8, 13, 22].

The developed software interface has to implement a set of features that make the user interface as comfortable as possible for interacting with the chosen hand-tracking device. The aim of this research work is the design of a NUI between the Leap Motion device and a virtual system for designing prosthetic sockets. In the next section, we describe the features of the application for prosthetic modelling, as well as the software architecture to which the developed NUI is linked.

3 Prosthetic Socket Modelling

The Socket Modelling Assistant is a CAD application for designing the socket of a lower-limb prosthesis starting from the digital model of the patient (Fig. 4.a). It is based on a knowledge-guided approach and provides a virtual environment where the user can emulate the actions performed by orthopedic technicians during the traditional manufacturing process. Starting from MRI images, SMA automatically reconstructs the 3D model of the residual limb by exploiting NURBS surfaces [6]. Then, a set of virtual tools allows the user to mold the virtual socket shape according to the patient's data (Fig. 4.b).

Fig. 4. (a) SMA main modules and (b) modelling virtual tools.

Virtual tools are subdivided into three main groups according to the type of modeling task:

  1. Preliminary modeling. Generation of a preliminary geometric model of the socket, onto which specific modifications will be applied to achieve the functional shape. The main operations of this phase are carried out almost completely automatically, according to the patient's characteristics and the traditional process.

  2. Customized modeling. Customization of the socket model according to the residual limb morphology. The user can proceed in two different ways: automatic or interactive shape manipulation. In addition, an ad hoc modeling tool permits modifying the shape according to the stump's tonicity.

  3. Completing the socket geometric model. The designer shapes the upper edge in an automatic or semi-automatic way; the system then automatically trims the model along the defined upper edge and applies the socket thickness to obtain the final socket model.

As described above, the virtual tools of SMA emulate the operations usually performed by technicians, who continuously use their hands to modify the chalk of the positive model of the residuum, thereby defining the shape used to create the thermoformed socket. This has been made possible by a software interface that permits interaction with the 3D environment using either traditional interaction devices (e.g., mouse and keyboard) or hand-tracking and haptic devices.

Finally, an external module exploits a commercial FEA system (i.e., Abaqus) to study the residuum-socket interaction [2]. The designed socket can be exported in different formats (e.g., STL and IGES), which may be used to create a real model with a 3D printer or to emulate gait analysis using virtual biomechanical models.

3.1 Software Architecture

The whole system has been developed in C++ using open-source SDKs, such as Qt [18], VTK [20] and OpenCascade [19]. Qt allowed us to create the user interface in a very simple way using a graphic editor. The Visualization Toolkit (VTK) manages advanced modelling techniques, such as implicit modelling, mesh reduction and Delaunay triangulation. VTK exploits OpenGL or DirectX automatically, so the developer can build his/her application very simply. OpenCascade is a software library for developing complex CAD, CAM and CAE systems; we used it mainly for its exporting modules, which allow us to save the socket model in either STL or IGES format. Furthermore, an in-house developed software library, named SimplyNURBS, has been used to manage NURBS surfaces. SimplyNURBS recreates the 3D model of the residuum from an MRI volume through modules that hide the complex aspects of NURBS models, such as the underlying mathematical models and the operations applied to them [5].
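As an example of the exporting modules mentioned above, a socket shape can be written to STL with OpenCascade's StlAPI_Writer. The following is a minimal sketch, not SMA's actual export code; exportSocketToStl is an illustrative name and the shape is assumed to already carry a triangulation:

  #include <StlAPI_Writer.hxx>
  #include <TopoDS_Shape.hxx>

  // Export the socket shape to a binary STL file; the shape is
  // expected to have been meshed (triangulated) beforehand.
  void exportSocketToStl(const TopoDS_Shape& socketShape, const char* path)
  {
      StlAPI_Writer writer;
      writer.ASCIIMode() = Standard_False;  // binary STL keeps files small
      writer.Write(socketShape, path);
  }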

The modularity of the software architecture allowed us to add a new module that exploits the Leap Motion device to interact with SMA using hands/fingers. In the next section, we describe the steps followed to design the NUI that achieves this aim.

4 NUI Design for Hand-Tracking Devices in SMA

NUI design for hand-tracking devices has to be based on rules for designing an intuitive interaction for a generic application. Leap Motion developers subdivide these rules as follows [14]:

  • Ergonomics, Posture and Environment. We have to consider how much fatigue the interaction causes in the arms and shoulders.

  • Virtual Space Mapping and Interaction Resolution. Every gesture has to be mapped according to the 2D screen resolution and the 3D virtual environment of the application (a minimal mapping sketch follows this list).

  • Dynamic Feedback and Gestalt Intermediates. The interface has to give the user the best possible feedback to understand what happened as a result of a particular gesture.

  • Interaction Space: Separation, States and Transitions. The user has to be able to understand which type of action is activated during the interaction.
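As an illustration of the Virtual Space Mapping rule, the following minimal C++ sketch maps a fingertip position to 2D screen coordinates through the InteractionBox exposed by the Leap SDK. The helper name fingertipToScreen and the screen-size parameters are illustrative, not part of SMA:

  #include <Leap.h>

  // Map the frontmost fingertip to 2D screen coordinates using the
  // Leap InteractionBox.
  void fingertipToScreen(const Leap::Frame& frame,
                         int screenWidth, int screenHeight,
                         float& x, float& y)
  {
      const Leap::Finger finger = frame.fingers().frontmost();
      const Leap::InteractionBox box = frame.interactionBox();
      // normalizePoint() clamps the tip position into the unit box [0,1]^3.
      const Leap::Vector n = box.normalizePoint(finger.tipPosition(), true);
      x = n.x * screenWidth;             // left to right
      y = (1.0f - n.y) * screenHeight;   // Leap's y grows upward, screen's downward
  }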

These rules should be implemented in every software interface that exploits hand-tracking devices. Over the last year, we have aimed at making interaction with SMA more natural using the Leap Motion device. Preliminary experimentation was carried out to design a good NUI, but many problems came to light with regard to ease of use and ergonomics during long-lasting interaction. Testers were not able to learn how to interact with SMA correctly, because they did not understand which gesture was active and thus could not execute the correct action to model the 3D socket shape. Furthermore, long-lasting interaction with the Leap Motion device fatigued the arms and shoulders, making the user experience very tiring and ultimately useless.

4.1 Gestures Definition

Before starting to develop the software interface, we defined a set of gestures matching the actions executed to design the socket with SMA:

  • Horizontal motion of a single finger to move from one modeling tool to another.

  • Free motion of a single finger to move the selection cursor.

  • Palm rotation of the right hand to rotate the stump and socket models during the use of the modeling tools.

  • Vertical motion of the palms to zoom in and out of the 3D models.

  • Movement of a thin object held in hand (e.g., a pencil) to modify the 3D geometric model. This can be used, for example, to mark critical zones with pencils of different colors, as done traditionally by the prosthetist (Fig. 8.a).

  • Pinch action to select a point on the surface in order to modify the socket shape according to the residuum virtual model (Fig. 8.b and c) and to create the socket upper edge (a minimal pinch-detection sketch follows this list).
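The pinch action can be recognized through the pinch strength reported by the Leap SDK. In the following minimal sketch, tryPinchSelect and the 0.9 threshold are assumptions, not values taken from SMA:

  #include <Leap.h>

  // Return true when a pinch is detected and report a stable 3D position
  // to be projected onto the socket surface for point picking.
  bool tryPinchSelect(const Leap::Hand& hand, Leap::Vector& pickPosition)
  {
      // pinchStrength() ranges from 0 (open hand) to 1 (thumb and index touching).
      if (hand.pinchStrength() < 0.9f)   // 0.9 threshold is an assumption
          return false;
      pickPosition = hand.stabilizedPalmPosition();
      return true;
  }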

Other gestures have been added to give better feedback to the user during interaction with SMA. In particular, two interaction styles are available: Camera and Modification. The Camera style rotates the 3D model following the orientation of the palm; in this modality, all other gestures are disabled. The Modification style permits executing the socket-design actions associated with the gestures described above.

To switch between the two modalities, the user puts a hand above the sensor with thumb and index finger extended and then traces a circle in the air (Fig. 5.a).
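The Leap SDK provides a built-in circle-gesture recognizer that can support this mode switch. The following sketch is illustrative: modeSwitchRequested is a hypothetical helper, and the simplified extended-finger count stands in for a full thumb-and-index check:

  #include <Leap.h>

  // Check whether the mode-switch gesture (a circle traced with thumb
  // and index finger extended) occurred in the current frame.
  // Gesture recognition must be enabled once at start-up:
  //   controller.enableGesture(Leap::Gesture::TYPE_CIRCLE);
  bool modeSwitchRequested(const Leap::Controller& controller)
  {
      const Leap::Frame frame = controller.frame();
      const Leap::GestureList gestures = frame.gestures();
      for (int i = 0; i < gestures.count(); ++i)
      {
          if (gestures[i].type() != Leap::Gesture::TYPE_CIRCLE)
              continue;
          // Require exactly two extended fingers; checking that they are
          // specifically the thumb and index is omitted for brevity.
          if (frame.hands().frontmost().fingers().extended().count() == 2)
              return true;
      }
      return false;
  }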

Fig. 5. Gestures used to switch between the Camera and Modification modalities.

To move the hand without interacting, the user faces the palm towards the screen. In this way s/he can find a more comfortable position or rest the hand without compromising the current state of the application (Fig. 5.b).

The user can always check the confidence of the data coming from the Leap Motion device: this percentage is displayed at the bottom of the software interface and indicates the state of the detected action. If the confidence is under 20%, the detected gesture is considered NOT_DETECTED; if it is between 20% and 65%, a warning icon appears in the lower-left corner of the screen; if it is over 65%, the user can interact with high precision.
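The following minimal sketch shows how such a classification can be written on top of the per-hand confidence value exposed by the Leap SDK; the thresholds come from the rules above, while the enum and function names are illustrative:

  #include <Leap.h>

  enum class DetectionState { NOT_DETECTED, WARNING, RELIABLE };

  // Classify the per-hand tracking confidence reported by the Leap SDK
  // (a float in [0,1]) into the three interface states described above.
  DetectionState classifyConfidence(const Leap::Hand& hand)
  {
      const float c = hand.confidence();
      if (c < 0.20f) return DetectionState::NOT_DETECTED; // gesture discarded
      if (c < 0.65f) return DetectionState::WARNING;      // warning icon shown
      return DetectionState::RELIABLE;                    // high-precision interaction
  }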

To provide additional feedback, an icon at the bottom-right corner of the screen shows which hand gesture has been detected (Fig. 6).

Fig. 6. Icon and progress bar in SMA showing the state of the interaction with the Leap Motion device.

The gestures are deliberately few, so that the user can learn to interact with SMA in a very simple way. The differences among the gestures allow the user to distinguish each action from the others, making augmented interaction simple and intuitive.

4.2 Software Development of NUI

SMA communicates with the Leap Motion device through its SDK and applies the appropriate action according to the retrieved data. To use the Leap Motion device within SMA, we exploited a set of VTK classes that provide methods to interact with the 3D scene of the application. The interaction handler builds on the existing VTK class vtkInteractorStyleTrackballCamera (a subclass of vtkInteractorStyle that implements interaction by moving the camera): SensorInteractorStyle is an abstract class for augmented interaction derived from it, in which some methods are overridden and some abstract methods are introduced (Fig. 7).

Fig. 7. UML class diagram of the software interface for exploiting the Leap Motion device within SMA.

The most important fields and methods are the following (a minimal class sketch follows this list):

  • enableSensor() and disableSensor(): abstract methods that enable/disable the interaction when the user is using a hand-tracking device.

  • transformVector(double* vector): method that transforms a vector from the sensor coordinate system to the VTK scene coordinate system.

  • OnTimer(): virtual method redeclared in this class and implemented in the specialized subclasses. It is executed every 50 ms to query the hand-tracking device and execute the correct action according to the detected gesture.
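Under these definitions, the abstract class can be sketched in C++ as follows. This is a sketch, not the actual SMA source; in particular, the body of transformVector() is a placeholder, since the real mapping depends on the device calibration:

  #include <vtkInteractorStyleTrackballCamera.h>
  #include <utility>  // std::swap

  // Abstract style for augmented interaction; method names follow the paper.
  class SensorInteractorStyle : public vtkInteractorStyleTrackballCamera
  {
  public:
      // Enable/disable the interaction driven by the hand-tracking device.
      virtual void enableSensor() = 0;
      virtual void disableSensor() = 0;

      // Fired every 50 ms by a VTK timer; implemented by the specialized
      // class, which queries the device and dispatches the detected gesture.
      void OnTimer() override = 0;

  protected:
      // Transform a vector from the sensor coordinate system to the VTK
      // scene coordinate system (placeholder: swap the up axes).
      virtual void transformVector(double* v) { std::swap(v[1], v[2]); }
  };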

This class has been specialized to use the Leap Motion device: LeapInteractorStyle contains the methods and instructions that exploit the Leap Motion SDK. Its most important fields and methods are the following:

  • State: its value may be either CAMERA or MODIFICATION, depending on the current interaction mode.

  • NextMode(): method that switches between the interaction modes.

  • leapController: the Leap Motion Controller instance that communicates with the device. From this object we get all information about hands and movements through the Leap Motion SDK.

These methods contain the instructions that inform the user about what is happening during the interaction, such as the quality of the interaction, the interaction mode and the state of each action (a sketch of the specialized class follows).
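A corresponding sketch of the specialized class is given below. It is again illustrative: State, NextMode() and leapController follow the paper, while the two private helpers are assumed names for the camera-rotation and modelling logic described above:

  #include <Leap.h>

  // Concrete Leap-specific interaction style.
  class LeapInteractorStyle : public SensorInteractorStyle
  {
  public:
      enum Mode { CAMERA, MODIFICATION };

      void enableSensor() override  { enabled = true; }
      void disableSensor() override { enabled = false; }

      // Switch between the Camera and Modification interaction modes.
      void NextMode() { State = (State == CAMERA) ? MODIFICATION : CAMERA; }

      void OnTimer() override
      {
          if (!enabled) return;
          const Leap::Frame frame = leapController.frame();
          if (frame.hands().isEmpty()) return;
          const Leap::Hand hand = frame.hands().frontmost();
          if (State == CAMERA)
              rotateCameraFromPalm(hand);     // follow the palm orientation
          else
              dispatchModelingGesture(hand);  // pinch, marking, ...
      }

  private:
      Mode State = CAMERA;
      bool enabled = false;
      Leap::Controller leapController;        // talks to the device

      void rotateCameraFromPalm(const Leap::Hand&) { /* ... */ }
      void dispatchModelingGesture(const Leap::Hand&) { /* ... */ }
  };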

5 Preliminary Tests and Results

The system has been tested with ten volunteers to evaluate its performance, especially with regard to ergonomics, ease of use and precision. The testers were six males and four females with different levels of experience with hand-tracking devices (i.e., beginners and experts). First, a demo was given to show how to execute modeling operations by means of the Leap Motion device. Then, each tester performed the three main basic operations until s/he was able to interact in a natural way without the help of technical staff. The modeling operations were: basic interaction with the 3D environment (e.g., zooming in and out), marking critical zones (Fig. 8.a) and trim-line definition (Fig. 8.b–c).

Fig. 8. Marking critical zones of the residual limb (a), pinch action to model the socket shape (b and c).

Testers with a low level of experience found the icons showing the detected gestures very useful, but gestures had to be repeated 5 to 8 times before they were properly executed by the user and detected by the system, especially for tasks involving 3D geometry modifications. Experts were able to correctly perform the modeling tasks within 2–3 trials.

Regarding ergonomics, the new version of the NUI and the related gestures allow an adequate and long-lasting interaction without arm fatigue.

The icons make it simple to learn how to interact with SMA and were considered helpful mainly by beginners. On the other hand, experts appreciated the progress bar, while beginners were more focused on gesture execution and detection.

6 Conclusion

This paper has presented the use of the Leap Motion device to design the socket of a lower-limb prosthesis. We implemented a new NUI starting from a set of rules related to ergonomics and ease of use; the NUI has been tested by volunteers, who provided positive feedback. We plan to introduce other technologies to make the interaction with SMA more natural and realistic, such as the Oculus Rift and the latest version of the Microsoft Kinect.

Finally, tests have been planned with a group of orthopedic technicians, who will use our system to design a wearable socket, and to experiment with this new development approach in other contexts, such as consumer product design and clothing.