1 Introduction

In recent years, touch-based interfaces have become increasingly pervasive, and the introduction of multi-touch capable touch screens has enabled a new level of interaction. However, the resulting interactions were mainly based on using the X and Y coordinates of a user's touch. To further improve touch interaction and to create new forms of it, considerable research effort has gone into enriching touch-based interaction by using more data related to the user's touch (e.g., [5, 6]). Harrison et al. coined the term "Rich Touch" for this research stream, as these touch interactions go beyond traditional ones. To further study the design space of rich touch, Harrison et al. [3] introduced the concept of using shear (the force tangential to a screen's surface) as a supplemental 2-dimensional input channel. Here, a user exerts a force on the surface of an object (e.g., the touchscreen of a mobile phone) in a tangential direction, which in turn moves the object physically. Shear-based interactions on surfaces consist of two force components: the tangential force that moves the object and the friction force created by pressing down on it. Previous concepts of incorporating shear force into touch-based interactions, however, only use shear for input purposes. Moreover, how to provide feedback for shear force interactions is still an open issue [3]. The goal of our research was to address this problem of giving feedback in shear-based touch interactions by utilizing actuated shear and to explore the resulting design space for touch interaction.

Therefore, we built a prototype that extends a touch interface with actuated shear for physically limiting the shear-based input, i.e., locking the device's corresponding X and Y axes. Actuated shear allows us to create a new form of physical output: changing the shear-based input affordance of the device itself in response to the digital context created via the user's touch (i.e., the touched area changes the shear input behaviour). This approach provides an opportunity to give feedback during shear-force-based touch interaction and to create a coupling between the digital context created via the user's touch and the device's shear input affordance.

Our concept of actuated shear extends prior work (see [3]) in that we are able to actively change the possible axes of shear-based input and use this for feedback purposes. In this paper we present our approach of actuated shear for touch-based interaction and our exploratory prototype. We describe several use cases of how the concept of actuated shear can be utilized as a supplemental feedback and output channel. We finish with a discussion of the resulting design space and point out future work in this area.

2 Related Work

Rich touch approaches do not only use the X and Y coordinates of a user's touch: new interactions become possible by using, e.g., different areas of the fingers [6] or thumb gestures [9], thereby enabling an additional layer of input. Harrison et al. [3] investigated how shear force (force applied tangentially to the screen's surface) could be applied as touch input on a tablet by linking common touch gestures (e.g., double tap) to this new layer of input. Shear force input was further applied to smartwatch interaction by creating a device that can be physically tilted, panned, clicked, and twisted for interaction purposes [11]. However, rich touch interactions still lack proper feedback [3]. There is much research effort on how feedback on touch interfaces can be improved, e.g., near-surface feedback [2] or tactile feedback using air [10]. These approaches are mostly concerned with guiding the user's touch input. Research on actuated displays shows how dynamic, on-demand control of the physical properties of interfaces can improve interaction [7]. Previous work on actuated displays includes shape-changing displays [4] and displays that are actuated to dynamically present different physical forms as feedback to the user [1, 8].

Nevertheless, as shear-based touch interfaces still lack proper feedback, we want to address this issue by extending the concept of shear force interaction with actuated shear, enabling dynamic changes of the physical input affordance presented by a touch device. In that sense, we use shear not only as input but also for actively locking the X and Y axes as a new form of output and supplemental feedback modality. We utilize the digital context created via the user's touch to change the device's shear-based input behaviour. This creates a coupling between the digital context and the device's shear behaviour. With our work, we contribute to the existing research efforts towards contextually aware rich touch interfaces.

3 Prototype Device

To explore the design space of using actuated shear, we built a prototype of a touch device that follows the concept of actively constraining the possible shear input axes (see Fig. 1). The prototype consists of a 7″ tablet and a MakerBeam frame that holds the device. The frame holds buttons on each side of the tablet that are activated when shear force is applied via the user's touch. Each side of the device is actuated via Arduino-controlled solenoids that can actively lock the different axes of the shear input. For instance, activating the solenoids on the left and right locks the device's shear input in the corresponding axis. We used solenoids because they can change their state rapidly, which enables us to quickly adapt the possible shear input based on the user's input and results in a more seamless interaction. The communication between the tablet and the Arduino hardware was done via Spacebrew, a websocket-based prototyping framework that allowed us to quickly connect the prototype's hardware and software parts.

Fig. 1. Picture of prototype device. The device captures 2D touch location and 2D shear input. Each solenoid on the side of the tablet can be activated independently to lock a specific axis of the shear input (e.g., based on the touch input of the user).
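To make the actuation loop concrete, the following is a minimal, illustrative Arduino-style sketch of the hardware side. Pin assignments, the side ordering, and the single-character serial protocol are assumptions made for this sketch; the actual prototype exchanged messages with the tablet via Spacebrew rather than raw serial commands.

```cpp
// Illustrative sketch only: pin numbers and the serial protocol are assumed,
// and raw serial stands in for the Spacebrew link used in the prototype.

const int SOLENOID_PINS[4] = {2, 3, 4, 5};   // top, bottom, left, right (assumed wiring)
const int BUTTON_PINS[4]   = {8, 9, 10, 11}; // shear-detection buttons per side (assumed wiring)

void setAxis(int a, int b, bool locked) {
  digitalWrite(SOLENOID_PINS[a], locked ? HIGH : LOW);
  digitalWrite(SOLENOID_PINS[b], locked ? HIGH : LOW);
}

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < 4; i++) {
    pinMode(SOLENOID_PINS[i], OUTPUT);
    pinMode(BUTTON_PINS[i], INPUT_PULLUP); // buttons pull the pin to ground when pressed
  }
}

void loop() {
  // Commands from the tablet: 'X'/'x' lock/unlock the X axis (left/right solenoids),
  // 'Y'/'y' lock/unlock the Y axis (top/bottom solenoids), '0' unlocks everything.
  if (Serial.available()) {
    char cmd = Serial.read();
    if (cmd == 'X') setAxis(2, 3, true);
    if (cmd == 'x') setAxis(2, 3, false);
    if (cmd == 'Y') setAxis(0, 1, true);
    if (cmd == 'y') setAxis(0, 1, false);
    if (cmd == '0') { setAxis(0, 1, false); setAxis(2, 3, false); }
  }

  // Report which side currently registers shear force (stepwise on/off detection,
  // as described above).
  for (int i = 0; i < 4; i++) {
    if (digitalRead(BUTTON_PINS[i]) == LOW) {
      Serial.print("shear:");
      Serial.println(i); // 0 = top, 1 = bottom, 2 = left, 3 = right
    }
  }
  delay(20);
}
```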

4 Utilizing Actuated Shear

In the following, we describe different aspects of how our approach of actuated shear enables new interaction concepts. These concepts are the result of exploring the design space with our prototype and applying it to different contexts and domains by building a range of application prototypes. The current prototype allows us to explore the following types of interactions: actuated shear as haptic and directional feedback, embodied interaction by coupling digital context with matching actuated shear, and multimodal interaction based on constraining shear input to specific areas of the screen.

4.1 Actuated Shear as Haptic and Directional Feedback

Since the lack of feedback on touch screens is still an important issue, we see actuated shear as a further step towards improving feedback on such devices. When designing shear-based touch interaction, how to design visual feedback also remains an open question [3]. We address this lack of feedback on shear-based devices with haptic feedback created through actuated shear, which can support the visual feedback presented on screen or set tangible "boundaries". For instance, when the maximum of a slider is reached, the corresponding shear input axis can be locked as a physical indicator. Further, our approach can be utilized to provide directional feedback. By actuating a specific axis of the touch device (e.g., using the solenoids), the device can be moved a short distance within the boundaries of the prototype's frame. Thus, we can accompany the visual "snapping" of digital objects, e.g., during a drag and drop task, with corresponding directional feedback (slightly pushing the device in the target direction) to create the perception of two items "snapping" to each other (see Fig. 2).

Fig. 2. Utilizing actuated shear as directional feedback in a drag and drop task. The user drags a digital object while actuated shear feedback is applied to create a tangible "snapping" effect.
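As a sketch of how an application could drive this feedback, the snippet below shows plain C++ decision logic for the slider boundary and the "snapping" pulse. The sendToArduino() helper, the command characters, and the snapping threshold are assumptions for illustration; this is not the prototype's actual tablet code.

```cpp
#include <cstdio>
#include <cmath>

// Hypothetical helper standing in for the Spacebrew message sent to the hardware.
void sendToArduino(char cmd) { std::printf("send %c\n", cmd); }

// A horizontal slider: keep the Y axis locked while dragging, and lock the X axis
// as a tangible "end stop" once the slider reaches its maximum.
void onSliderDrag(float value, float maxValue) {
  sendToArduino('Y');                           // slider moves horizontally, so lock Y
  sendToArduino(value >= maxValue ? 'X' : 'x'); // lock X only at the maximum
}

// Drag and drop: when the dragged object gets close to its target, trigger a brief
// pulse of the solenoid on the target side ('<' and '>' are assumed pulse commands).
void onObjectDrag(float objX, float targetX) {
  if (std::fabs(objX - targetX) < 10.0f) {
    sendToArduino(targetX > objX ? '>' : '<');
  }
}

int main() {
  onSliderDrag(0.5f, 1.0f);    // mid-range: Y locked, X free
  onSliderDrag(1.0f, 1.0f);    // at maximum: X locked as physical boundary
  onObjectDrag(95.0f, 100.0f); // close to target: pulse toward the right
}
```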

4.2 Changing Shear Input Affordance

By constraining the device's shear input capabilities, we create a channel of physical output that can be used to change the device's input affordance. Hence, the digital context (e.g., the location of the user's touch) and the device's "shear affordance" can be coupled. For instance, this can be used to create an affordance coupling between the physical movement attributes of UI elements (e.g., a slider moving horizontally) and the device's affordance itself (i.e., shear force constrained to the X axis). The touch input sets the digital context (e.g., the selection of a number value), whereas the shear input changes the value itself. In this example, adjusting a number value corresponds to a slider element; thus, the device locks the corresponding Y axis so that it can be moved only along the X axis. Hence, the properties of a slider, i.e., being able to move left and right, are applied to the shear input affordance of the device (see Fig. 3). In contrast, touching areas related to boolean values (i.e., switching something on or off) conveys the affordance of a toggle, i.e., the ability to move up and down, by locking the corresponding shear input axis of the device.

Fig. 3. Translation of the physical movement attributes of the UI elements into changes of the device's possible shear input. The touch input, i.e., area of screen, sets the digital context to actuate/lock the corresponding axes.
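A minimal sketch of this context-to-affordance mapping is shown below (plain C++, illustrative only). The widget type found under the touch point selects which axis to lock; the Widget enum, the sendToArduino() helper, and the command characters are hypothetical.

```cpp
#include <cstdio>

// Hypothetical helper standing in for the message sent to the actuation hardware.
void sendToArduino(char cmd) { std::printf("send %c\n", cmd); }

enum class Widget { Slider, Toggle, None };

// Map the widget under the user's touch to the device's shear affordance:
// a horizontal slider affords left/right movement, so the Y axis is locked;
// a toggle affords up/down movement, so the X axis is locked.
void onTouchDown(Widget widgetUnderTouch) {
  switch (widgetUnderTouch) {
    case Widget::Slider: sendToArduino('Y'); break; // moveable only along X
    case Widget::Toggle: sendToArduino('X'); break; // moveable only along Y
    case Widget::None:   sendToArduino('0'); break; // no widget: unlock both axes
  }
}

int main() {
  onTouchDown(Widget::Slider);
  onTouchDown(Widget::Toggle);
  onTouchDown(Widget::None);
}
```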

This mapping between shear input affordance and the digital context created via touch input can be used to support (semi-)blind interaction scenarios in safety-critical application domains such as the car.

In-car scenarios are often based on peripheral interaction with a touch screen in the center console, while the driver visually focuses on the main driving task (interaction remains the secondary task). Touch screens are increasingly present in today's cars; however, interaction with such interfaces lacks the haptic nature of physical interface elements, making peripheral interaction potentially harder. Actuated shear can integrate the haptic guidance that would otherwise be conveyed by a physical slider or toggle into touch interaction (e.g., the touch screen incorporates the movement affordance of a slider).

4.3 Constraining Shear Input to a Specific Screen Area

By actively locking a specific side of the device, we are able to constrain the possible shear force input to specific screen areas (see Fig. 4a and b). Locking a single side of the device enables "rotational" shear input (i.e., the device moves in a rotational manner around the locked axis, which defines a rotation point). One exemplary use case is actuated in situ manipulation of objects, e.g., selecting an object via touch and rotating it via shear-based input (see Fig. 4a). The digital context (i.e., the position of the user's touch and the type of object to be manipulated) defines the shear input behaviour and which area of the screen is locked correspondingly. For instance, when the user touches the corner of a digital object (e.g., via long-press), the device locks its axis at the corresponding side by activating its solenoids. This creates a rotation point in the center of this axis, enabling a rotate interaction via shear input.

Fig. 4. (a) Actuated shear-based in situ manipulation of objects. (b) Game controller overlay on touch screen. Left side with shear-based "joystick" overlay; shear movement on right side is locked.

Further object manipulation interactions are possible, such as flipping an object horizontally or vertically, or rotating it by 90 degrees. Such interactions can be utilized for touch interactions where exact positioning and adjustment are necessary (e.g., touch-based image manipulation). Further, actuated shear could be used for game-based interactions on touch devices. For instance, Fig. 4b illustrates another concept of limiting the possible shear force to a specific screen area: by using a typical overlay of a physical game controller and limiting the shear movement to one side of the device, we create two different parallel interaction modalities on the same touch device.
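To illustrate the single-side locking behind the rotation case (Fig. 4a), the sketch below picks the device side closest to the long-pressed corner and locks only that side, turning it into a pivot for rotate-via-shear. The side encoding, the sendToArduino() helper, and the command characters ('1' to '4') are assumptions for this sketch.

```cpp
#include <cstdio>

// Hypothetical helper standing in for the message sent to the actuation hardware;
// '1' = lock top side, '2' = bottom, '3' = left, '4' = right (assumed commands).
void sendToArduino(char cmd) { std::printf("send %c\n", cmd); }

// On a long-press near an object's corner, lock only the device side closest to
// the touch point; that side becomes the rotation point for shear input.
void onCornerLongPress(float touchX, float touchY, float screenW, float screenH) {
  float dTop = touchY, dBottom = screenH - touchY;
  float dLeft = touchX, dRight = screenW - touchX;

  char side = '1';
  float best = dTop;
  if (dBottom < best) { best = dBottom; side = '2'; }
  if (dLeft   < best) { best = dLeft;   side = '3'; }
  if (dRight  < best) { best = dRight;  side = '4'; }

  sendToArduino(side);
}

int main() {
  onCornerLongPress(30.0f, 500.0f, 600.0f, 1024.0f); // touch near the left edge: lock left side
}
```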

5 Design Space

Based on the exploratory prototype and the above-mentioned aspects of how to utilize actuated shear, we describe three levels of information that can be conveyed by actuated shear. We do this by applying a framing introduced by Antolini et al. [1], which also applies to actuated shear as a feedback modality.

Information Redundancy:

Actuated shear can be used to create feedback that is redundant to visual on-screen information (i.e., a coupling of possible shear movement and digital context). This additional channel of information can be used in (semi-)blind/peripheral interaction scenarios or to provide feedback in in situ touch manipulation tasks.

Information Transposition:

Actuated shear provides a way of giving feedback beyond what is felt via touch. The sensory substitution provided by our approach can be used to provide, e.g., directional cues or physically perceivable virtual "boundaries" for touch-based games.

Information Balance:

Actuated shear can be used to reduce the amount of visual on-screen information needed, by providing information via this new layer of output. We address the lack of visual feedback in other shear-based approaches with haptic feedback created by actuated shear. This haptic feedback channel can therefore be used instead of visual feedback in order to save screen real estate.

6 Findings and Limitations

Initial feedback from our explorations showed that the shear "state changes" our approach provides should be synchronous with the touch input in order to enable seamless interaction between the user's touch input and the feedback conveyed by actuated shear. Otherwise, the potential lag in feedback can easily be misinterpreted by the user. Further, an open issue to consider is that providing feedback via actuation during shear-based touch interaction also influences the user's touch itself. This in turn may lead to unintended inputs or to losing the initial focus of the touch. The presented prototype acts as a proof of concept and provided a basis for our design space explorations of using actuated shear for rich touch interactions. As we focused on actively locking specific axes of the possible shear input, the button-based detection of the actual shear force was simplified in our prototype (only stepwise, on/off). Likewise, the locking of the device's axis movement can thus far only be switched on or off. In future iterations of the hardware we aim to continuously vary the strength of the actuated shear feedback (i.e., the pressure applied against the user's shear force vector) to enable more diverse feedback possibilities. We also aim to make the device smaller, so that it is truly handheld, in order to explore the resulting design space of utilizing actuated shear on small touch displays (e.g., smartwatches).

7 Conclusion

With actuated shear we presented a supplemental 2D feedback channel that changes the shear-based input affordance of a touch device. By sketching a range of use cases that extend the possible interactions and feedback on a shear-based touch device, we showed how this approach can be applied to different shear-based interactions. The concept of actuation extends and complements the existing concept of using shear force for rich touch interaction. This new modality allows a coupling between the digital context created by the user's touch and the device's movement affordance, which results in richer touch-based interactions. Existing shear-based rich touch approaches (e.g., [3, 5]) enable a wide range of new input possibilities; our approach extends them by enabling new feedback and interaction mechanisms with the opportunity to actively and independently lock the device's X and Y shear input axes. Combining our approach of actuating the possible shear input with existing approaches of dynamically sensing shear force (in more detail than our prototype does) would substantially extend what is currently possible on shear-based touch interfaces. We will further conduct a series of user studies to gain insights into the feedback provided by actuated shear and how it affects the user's experience. At the moment, only the digital context created via touch input is used to influence the shear actuation behaviour. In our future research we aim to incorporate the physical context (e.g., physical surroundings influencing the shear input behaviour) to enable couplings between the physical context and potential physically related interactions (e.g., collaborating with other co-located humans). The fact that the affordance changes conveyed by actuated shear are not visually perceivable but have to be sensed via touch bears potential for interesting touch-based interactions that use "touch" beyond what is currently the case (i.e., not only as input but as a means of sensing), as well as for embodied interaction design to further enrich touch-based input modalities.