
Tactile Interaction for Novice User

Uncolocated Gestures
  • Denis Chêne
  • Vincent Pillot
  • Marc-Éric Bobillier Chaumon
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9754)

Abstract

This paper introduces the concept of tactile interaction for novice elderly users. Cognitive difficulties, motor constraints, visual overload and lack of feedback make tactile smartphones hard to use for elderly users. An optimized tactile interface was produced, offering continuous and secure gestures and introducing “uncolocated gestures”. Comparative tests against a classic tactile interface show that these gestures solve interaction problems but generate other difficulties. Uncolocation is a solution of interest, but it has to be learned and progressively acquired through activity. A final enhanced profile for elderly users resolves this situation: it enables uncolocated manipulation for the Back and Up/Down commands and prevents it for the Validation command until the gesture is fully acquired by the user.
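
The staged activation described above can be read as a simple per-command policy. The following Kotlin sketch is a hypothetical illustration only, not the authors' implementation: the command names (Back, Up/Down, Validation) follow the abstract, while the profile class, the success counter and its mastery threshold are assumptions introduced here.

    // Hypothetical sketch of an enhanced novice profile: uncolocated gestures
    // are enabled per command, and Validation only unlocks once the gesture
    // appears to be acquired by the user. Threshold value is illustrative.
    enum class Command { BACK, UP_DOWN, VALIDATION }

    class NoviceProfile {
        // Commands accepting uncolocated gestures from the start (per the abstract).
        private val uncolocatedEnabled = mutableSetOf(Command.BACK, Command.UP_DOWN)

        // Assumed mastery criterion: number of successful Validation gestures.
        private var validationSuccesses = 0
        private val masteryThreshold = 20  // assumption, not taken from the paper

        fun allowsUncolocated(command: Command): Boolean =
            command in uncolocatedEnabled

        // Called after each successful (colocated) Validation; once mastery is
        // reached, Validation may also be performed as an uncolocated gesture.
        fun recordValidationSuccess() {
            validationSuccesses++
            if (validationSuccesses >= masteryThreshold) {
                uncolocatedEnabled += Command.VALIDATION
            }
        }
    }

    fun main() {
        val profile = NoviceProfile()
        println(profile.allowsUncolocated(Command.VALIDATION))  // false at first
        repeat(20) { profile.recordValidationSuccess() }
        println(profile.allowsUncolocated(Command.VALIDATION))  // true once acquired
    }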

Keywords

Handheld devices for the elderly · Use and design of smart phones for the elderly · Novice profile · MenuDfA · Tactile interaction


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Denis Chêne (1)
  • Vincent Pillot (2)
  • Marc-Éric Bobillier Chaumon (2)
  1. Orange Labs, Meylan, France
  2. GRePS (EA 4163), Bron, France
