Actuated Shear: Enabling Haptic Feedback on Rich Touch Interfaces
We present an approach to a shear-force-based touch interface that actively changes the possible shear-based input (force tangential to a screen's surface) by physically locking the corresponding axis of the device. This approach of actuated shear aims at using shear not only as input, but to create a new form of output modality that changes the input affordance of the device itself. It enables a new channel for incorporating physical information and constraints into touch-based interaction (i.e., by changing the input affordance of the device and using shear as a feedback mechanism). With this actuated shear approach, we create a coupling between the digital context created via touch and the actual physical input affordance of the device. Based on the implementation of a prototype, we discuss the design space of actively changing the input affordance of a shear-based touch device, and sketch interaction ideas as well as future application scenarios and domains.
Keywords: Rich touch · Shear force · Actuated displays · Haptic feedback
1 Introduction

In recent years, touch-based interfaces have become increasingly pervasive, and the introduction of multi-touch capable touch screens enabled a new level of interaction. However, the resulting interactions were mainly based on the X and Y coordinates of a user's touch. To further improve touch interaction and to create new forms of it, considerable research effort has gone into enriching touch-based interaction with more data related to the user's touch (e.g., [5, 6]). This research stream was coined "Rich Touch" by Harrison et al., as these touch interactions go beyond traditional ones. To further explore the design space of rich touch, Harrison et al. [3] introduced the concept of using shear (the force tangential to a screen's surface) as a supplemental two-dimensional input channel. Here, a user exerts a force on the surface of an object (e.g., the touchscreen of a mobile phone) in a tangential direction, which in turn moves the object physically. Shear-based interactions on surfaces consist of two forms of force: the tangential force that moves the object and the friction force created by applying pressure to it. Previous concepts of incorporating shear force into touch-based interactions, however, use shear for input purposes only. Moreover, how to provide feedback for shear force interactions is still an open issue. The goal of our research was to address this problem of giving feedback in shear-based touch interactions by utilizing actuated shear, and to explore the resulting design space for touch interaction.
Therefore, we built a prototype that extends a touch interface with actuated shear for physically limiting the shear-based input, i.e., locking the device's corresponding X- and Y-axes. Actuated shear allows us to create a new form of physical output: changing the shear-based input affordance of the device itself with regard to the digital context created via the user's touch (i.e., the touched area changes the shear input behaviour). This approach provides an opportunity to give feedback during shear-force-based touch interaction, and to create a coupling between the digital context created via the user's touch and the device's shear input affordance.
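The core coupling can be sketched in software as follows. This is an illustrative sketch, not our actual prototype firmware: the region names and the `AxisLocks` structure are assumptions made for the example; the idea is only that the digital context under the finger selects which shear axes are locked.

```python
# Illustrative sketch: the digital context under the user's touch
# determines which shear axes the device locks. All names are
# hypothetical; the real prototype drives physical locking hardware.
from dataclasses import dataclass


@dataclass
class AxisLocks:
    x_locked: bool
    y_locked: bool


# Digital context -> shear input affordance.
CONTEXT_TO_LOCKS = {
    "free_canvas":  AxisLocks(x_locked=False, y_locked=False),  # full 2D shear
    "h_slider":     AxisLocks(x_locked=False, y_locked=True),   # horizontal only
    "v_slider":     AxisLocks(x_locked=True,  y_locked=False),  # vertical only
    "static_label": AxisLocks(x_locked=True,  y_locked=True),   # no shear input
}


def on_touch(context_under_finger: str) -> AxisLocks:
    """Look up the lock state for the widget under the finger.

    Unknown contexts default to fully locked (no shear input).
    """
    return CONTEXT_TO_LOCKS.get(context_under_finger,
                                AxisLocks(x_locked=True, y_locked=True))
```

The lookup runs on every touch-down event, so the physical input affordance follows the digital content as the user's finger moves between widgets.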
Our concept of actuated shear extends prior work (see [3]) in that we can actively change the possible axes of shear-based input and use this for feedback purposes. In this paper we present our approach of actuated shear for touch-based interaction and our exploratory prototype. We describe several use cases of how the concept of actuated shear can be utilized as a supplemental feedback and output channel. We close with a discussion of the resulting design space and point out future work in this area.
2 Related Work
Rich touch approaches do not only use the X and Y coordinates of a user's touch: new interactions become possible by using, e.g., different areas of the fingers [6] or thumb gestures [9], thereby enabling an additional layer of input. Harrison et al. [3] investigated how shear force (force applied tangential to the screen's surface) could be applied as touch input on a tablet by linking common touch gestures (e.g., double tap) to this new layer of input. Shear force input was further applied to smart watch interaction by creating a device that can be physically tilted, panned, clicked, and twisted for interaction purposes [11]. However, rich touch interactions still lack proper feedback. There is considerable research on how feedback on touch interfaces can be improved, e.g., near-surface feedback or tactile feedback using air; these approaches are mostly concerned with guiding the user's touch input. Research on actuated displays shows how dynamic, on-demand control of the physical properties of interfaces can improve interaction [7]. Previous work on actuated displays investigated shape-changing displays or created displays that are actuated to dynamically present different physical forms as feedback to the user [1, 8].
Nevertheless, as shear-based touch interfaces still lack proper feedback, we address this issue by extending the concept of shear force interaction with actuated shear, enabling dynamic changes of the physical input affordance presented by a touch device. In that sense, we use shear not only as input but also actively lock the X- and Y-axes as a new form of output and supplemental feedback modality. We utilize the digital context created via the user's touch to change the device's shear-based input behaviour. This creates a coupling between the digital context and the device's shear behaviour. With our work, we contribute to existing research efforts towards contextually aware rich touch interfaces.
3 Prototype Device
4 Utilizing Actuated Shear
In the following, we describe different aspects of how our approach of actuated shear enables new interaction concepts. These concepts are the result of exploring the design space with our prototype and applying it to different contexts and domains by building a range of application prototypes. The current prototype allows us to explore the following types of interactions: actuated shear as haptic and directional feedback, embodied interaction by coupling digital context with matching actuated shear, and multimodal interaction based on constraining shear input to specific areas of the screen.
4.1 Actuated Shear as Haptic and Directional Feedback
4.2 Changing Shear Input Affordance
This mapping between shear input affordance and the digital context created via touch input can be used to support (semi-)blind interaction scenarios in safety-critical application domains such as the car.
In-car scenarios are often based on peripheral interaction with a touch screen in the centre console while the driver visually focuses on the primary driving task (interaction remains the secondary task). Touch screens are increasingly common in today's cars; however, interaction with such interfaces lacks the haptic nature of physical interface elements, making peripheral interaction potentially harder. Actuated shear can integrate the haptic guidance that would be conveyed by a physical slider or toggle into touch interaction (e.g., the touch screen incorporates the movement affordance of a slider).
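As an illustration of such a peripheral slider, consider mapping the remaining shear axis to discrete detent steps the driver can feel without looking. The following sketch assumes the Y-axis is locked so the screen moves horizontally only; the function name, detent spacing, and step count are made-up example values, not measurements from our prototype.

```python
# Hypothetical sketch: quantize horizontal shear displacement into
# slider detents for eyes-free, in-car adjustment (e.g., volume).
def shear_to_detent(displacement_mm: float,
                    detent_spacing_mm: float = 2.0,
                    num_detents: int = 5) -> int:
    """Map horizontal shear displacement to a discrete slider step.

    Displacement is clamped so over-shearing past either end of the
    slider's travel simply stays at the first or last detent.
    """
    step = round(displacement_mm / detent_spacing_mm)
    return max(0, min(num_detents - 1, step))
```

Each detent boundary is a natural point at which the actuation could briefly resist movement, emulating the clicks of a physical slider.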
4.3 Constrain Shear Input to a Specific Screen Area
Further object manipulation interactions are possible, such as flipping an object horizontally or vertically or rotating it by 90 degrees. Such interactions can be utilized for touch interactions where exact positioning and adjustment are necessary (e.g., touch-based image manipulation). Furthermore, actuated shear could be used for game-based interactions on touch devices. For instance, Fig. 4b illustrates another concept of limiting the possible shear force to a specific screen area. By using a typical overlay of a physical game controller and limiting the shear movement to one side of the device, we create two different parallel interaction modalities on the same touch device.
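Constraining shear to a screen area, as in the game-controller overlay above, can be sketched as a simple spatial test: shear is unlocked only when the touch lands inside the controller region, while the rest of the screen stays a plain touch surface. The coordinates and region bounds below are illustrative assumptions, not our prototype's actual layout.

```python
# Hypothetical sketch: enable shear input only inside a designated
# screen region (here, the left half acts as the game controller).
def shear_enabled(touch_x: float, touch_y: float,
                  region=(0.0, 0.0, 0.5, 1.0)) -> bool:
    """Return True if the touch is inside the shear-enabled region.

    `region` is (x_min, y_min, x_max, y_max) in normalized [0, 1]
    screen coordinates.
    """
    x_min, y_min, x_max, y_max = region
    return x_min <= touch_x <= x_max and y_min <= touch_y <= y_max
```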
5 Design Space
Based on the exploratory prototype and the above-mentioned aspects of how to utilize actuated shear, we describe three levels of information that can be conveyed by actuated shear. We do this by applying a framing introduced by Antolini et al. [2] that also applies to actuated shear as a feedback modality.
Actuated shear can be used to create feedback that is redundant to visual on-screen information (i.e., a coupling of possible shear movement and digital context). This additional channel of information can be used in (semi-)blind or peripheral interaction scenarios, or to provide feedback during in situ touch manipulation tasks.

Actuated shear provides a way of giving feedback beyond what is felt via touch. The sensory substitution provided by our approach can be used to provide, e.g., directional cues or physically perceivable virtual "boundaries" for touch-based games.

Actuated shear can be used to reduce the amount of visual on-screen information needed by providing information via this new layer of output. We address the lack of visual feedback in other shear-based approaches with the haptic feedback created by actuated shear. This haptic feedback channel can therefore be used instead of visual feedback in order to save screen real estate.
6 Findings and Limitations
First feedback from our explorations showed that the shear "state changes" our approach provides should be synchronous with the touch input in order to enable seamless interaction between the user's touch input and the feedback conveyed by actuated shear; otherwise, the potential lag in feedback can easily be misinterpreted by the user. A further open issue is that providing feedback via actuation during shear-based touch interaction also influences the user's touch itself, which may lead to unintended inputs or to losing the initial focus of the touch. The presented prototype acts as a proof of concept and provided a base for our design space explorations of using actuated shear for rich touch interactions. As we focused on actively locking specific axes of the possible shear input, the button-based detection of the actual shear force was simplified in our prototype (stepwise only, on/off). Likewise, the locking of the device's axis movement can thus far only be switched on or off. In future iterations of the hardware, we aim to vary the strength of the actuated shear feedback continuously (i.e., the pressure applied against the user's shear force vector) to enable more diverse feedback possibilities. We also aim to make the device smaller, so that it is truly handheld, in order to explore the resulting design space of utilizing actuated shear on small touch displays (e.g., smart watches).
7 Conclusion

With actuated shear we presented a supplemental 2D feedback channel that changes the shear-based input affordance of a touch device. By sketching a range of use cases that extend the possible interactions and feedback on a shear-based touch device, we showed how this approach can be applied to different shear-based interactions. The concept of actuation extends and complements the existing concept of using shear force for rich touch interaction. This new modality allows a coupling between the digital context created by the user's touch and the device's movement affordance, which results in richer touch-based interactions. Existing shear-based rich touch approaches (e.g., [3, 5]) enable a wide range of new input possibilities; our approach extends them by enabling new feedback and interaction mechanisms with the opportunity to lock the device's X and Y shear input axes actively and independently. Combining our approach of actuating the possible shear input with existing approaches of processing shear force dynamically (and in more detail than our prototype does) would substantially extend what is currently possible on shear-based touch interfaces. We will conduct a series of user studies to gain insights into the feedback provided by actuated shear and how it affects the user's experience. At the moment, only the digital context - created via touch input - is used to influence the shear actuation behaviour. In our future research we aim to incorporate the physical context (e.g., physical surroundings influencing shear input behaviour) to enable couplings between the physical context and potential physically related interactions (e.g., collaborating with other co-located humans).
The fact that the affordance changes conveyed by actuated shear are not visually perceivable, but have to be sensed via touch, bears potential for interesting touch-based interactions that use "touch" beyond what is currently the case (i.e., not only as input but as a means of sensing), as well as for embodied interaction design that further enriches touch-based input modalities.
The financial support by the Austrian Federal Ministry of Science, Research and Economy and the National Foundation for Research, Technology and Development is gratefully acknowledged (Christian Doppler Laboratory for Contextual Interfaces).
- 1. Alexander, J., Lucero, A., Subramanian, S.: Tilt displays: designing display surfaces with multi-axis tilting and actuation. In: MobileHCI 2012, pp. 161–170. ACM, New York (2012)
- 2. Antolini, M., Bordegoni, M., Cugini, U.: A haptic direction indicator using the gyro effect. In: IEEE World Haptics Conference 2011 (2011)
- 3. Harrison, C., Hudson, S.: Using shear as a supplemental two-dimensional input channel for rich touchscreen interaction. In: CHI 2012, pp. 3149–3152. ACM, New York (2012)
- 4. Hemmert, F., Joost, G., Knörig, A., Wettach, R.: Dynamic knobs: shape change as a means of interaction on a mobile phone. In: CHI 2008 Extended Abstracts on Human Factors in Computing Systems, pp. 2309–2314. ACM, New York (2008)
- 5. Heo, S., Lee, G.: ForceDrag: using pressure as a touch input modifier. In: Proceedings of the 24th Australian Computer-Human Interaction Conference. ACM, New York (2012)
- 6. Huang, D.Y., Tsai, M.C., Tung, Y.C., Tsai, M.L., Yeh, Y.T., Chan, L., Hung, Y.P., Chen, M.Y.: TouchSense: expanding touchscreen input vocabulary using different areas of users' finger pads. In: CHI 2014, pp. 189–192. ACM, New York (2014)
- 7. Poupyrev, I., Nashida, T., Okabe, M.: Actuation and tangible user interfaces: the Vaucanson duck, robots, and shape displays. In: TEI 2007, pp. 205–212. ACM, New York (2007)
- 8. Roudaut, A., Karnik, A., Löchtefeld, M., Subramanian, S.: Morphees: toward high "shape resolution" in self-actuated flexible mobile devices. In: CHI 2013, pp. 593–602. ACM, New York (2013)
- 9. Roudaut, A., Lecolinet, E., Guiard, Y.: MicroRolls: expanding touch-screen input vocabulary by distinguishing rolls vs. slides of the thumb. In: CHI 2009, pp. 927–936. ACM, New York (2009)
- 11. Xiao, R., Laput, G., Harrison, C.: Expanding the input expressivity of smartwatches with mechanical pan, twist, tilt and click. In: CHI 2014, pp. 193–196. ACM, New York (2014)