A Taxonomy of Microinteractions: Defining Microgestures Based on Ergonomic and Scenario-Dependent Requirements

  • Katrin Wolf
  • Anja Naumann
  • Michael Rohs
  • Jörg Müller
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6946)

Abstract

This paper explores how microgestures can allow us to execute a secondary task, for example controlling mobile applications, without interrupting a manual primary task, for instance, driving a car. In order to design microgestures iteratively, we interviewed sports therapists and physiotherapists while asking them to use task-related props, such as a steering wheel, a cash card, and a pen, to simulate driving a car, an ATM scenario, and a drawing task. The primary objective here is to define microgestures that are easily performable without interrupting or interfering with the primary task. Using expert interviews, we developed a taxonomy that classifies these gestures according to their task context. We also assessed the ergonomic and attentional attributes that influence the feasibility and task suitability of microinteractions, and evaluated the level of resources they require. Accordingly, we defined 21 microgestures that allow users to perform microinteractions within a manual, dual-task context. Our taxonomy provides a basis for designing microinteraction techniques.

Keywords

gestures · microinteractions · dual-task · multitask · interruption

References

  1. Ashbrook, D.: Enabling Mobile Microinteractions. Doctoral Thesis, Georgia Institute of Technology (2010)
  2. Chewar, C.M., McCrickard, D.S., Ndiwalana, A., North, C., Pryor, J., Tessendorf, D.: Secondary task display attributes: optimizing visualizations for cognitive task suitability and interference avoidance. In: Proc. Data Visualisation, pp. 165–171 (2002)
  3. Czerwinski, M., Horvitz, E., Wilhite, S.: A Diary Study of Task Switching and Interruptions. In: Proc. Conference on Human Factors in Computing Systems, pp. 175–182 (2004)
  4. Feix, T., et al.: Grasp Taxonomy Comparison Sheet, http://web.student.tuwien.ac.at/~e0227312/documents/taxonomy_comparison.pdf
  5. Harrison, C., et al.: Skinput: Appropriating the Body as an Input Surface. In: Proc. CHI 2010 (2010)
  6. Howard, B., Howard, S.: Lightglove: Wrist-Worn Virtual Typing and Pointing. In: Proc. ISWC 2001 (2001)
  7. Karam, M.: A Study on the Use of Semaphoric Gestures to Support Secondary Task Interactions. In: Proc. UIST 2001 (2003)
  8. Loclair, C., Gustafson, S., Baudisch, P.: PinchWatch: A Wearable Device for One-Handed Microinteractions. In: Proc. MobileHCI 2010 (2010)
  9. McCrickard, D.S., Chewar, C.M., Somervell, J.P., Ndiwalana, A.: A model for notification systems evaluation—assessing user goals for multitasking activity. ACM Transactions on Computer-Human Interaction (TOCHI) 10(4), 312–338 (2003)
  10. Norman, D.A.: Natural user interfaces are not natural. Interactions 17(3) (May–June 2010)
  11. Norman, D.A.: Gestural Interfaces: A Step Backwards in Usability. Interactions 17(5) (September–October 2010)
  12. Oulasvirta, A., Tamminen, S., Roto, V., Kuorelahti, J.: Interaction in 4-Second Bursts: The Fragmented Nature of Attentional Resources in Mobile HCI. In: Proc. CHI 2005 (2005)
  13. Quek, F., McNeill, D., Bryll, R., Duncan, S., Ma, X.-F., Kirbas, C., McCullough, K.E., Ansari, R.: Multimodal human discourse: gesture and speech. ACM Transactions on Computer-Human Interaction (TOCHI) 9(3), 171–193 (2002)
  14. Rekimoto, J., et al.: GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices. In: Proc. ISWC 2001, pp. 21–27 (2001)
  15. Saponas, T., et al.: Enabling Always-Available Input with Muscle-Computer Interfaces. In: Proc. UIST 2009 (2009)
  16. Spalteholz, W., Spanner, R.: Handatlas der Anatomie des Menschen – Erster Teil: Bewegungsapparat, Amsterdam, p. 284 (1960)
  17. Tan, D., Morris, D., Saponas, T.S.: Interfaces on the Go. In: XRDS: Crossroads, The ACM Magazine for Students, 30, doi:10.1145/1764848.1764856
  18. Vardy, A., et al.: The WristCam as Input Device. In: Proc. ISWC 1999, pp. 199–202 (1999)
  19. Wexelblat, A.: Research Challenges in Gestures: Open Issues and Unsolved Problems. In: Proc. International Gesture Workshop on Gesture and Sign Language in Human-Computer Interaction 1997, pp. 1–11 (1997)
  20. Wickens, C.D.: Processing resources in attention. In: Parasuraman, R., Davies, D.R. (eds.) Varieties of Attention, pp. 63–102. Academic Press, New York (1984)
  21. Wolf, K., Dicke, C., Grasset, R.: Touching the Void: Gestures for Auditory Interfaces. In: Proc. TEI 2010 (2010)

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Katrin Wolf (1)
  • Anja Naumann (1)
  • Michael Rohs (2)
  • Jörg Müller (1)
  1. Deutsche Telekom Laboratories, TU Berlin, Berlin, Germany
  2. LMU Munich, Munich, Germany
