Mobile accessibility: natural user interface for motion-impaired users

  • Cristina Manresa-Yee
  • Maria Francesca Roig-Maimó
  • Javier Varona
Long Paper


We designed a natural user interface for accessing mobile devices, aimed at motion-impaired people who cannot use the standard multi-touch input of tablets and smartphones. The system detects the user’s head motion with the device’s front camera and uses the head position to interact with the mobile device. The purpose of this work is to evaluate the performance of the system. We conducted two laboratory studies with 12 participants without disabilities and a field study with four participants with multiple sclerosis (MS). The first laboratory study tested the robustness of the system and provided a baseline against which to compare the results of the participants with MS. After analyzing the results of the participants with disabilities, we conducted a second laboratory study in which participants without disabilities simulated the limitations of the users with MS, in order to tune the system. All participants completed a set of defined tasks: pointing and pointing-selecting. We logged usage data and administered post-experiment questionnaires. Our results showed positive outcomes for the system as an input device, although apps should follow a set of recommendations on target size and position to facilitate interaction for motion-impaired users. The work demonstrates the interface’s potential for mobile accessibility for motion-impaired users who need alternative access devices to interact with mobile devices.
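The abstract describes mapping the head position reported by the front camera to an on-screen pointer. A minimal sketch of such a mapping is given below, assuming a per-frame normalized head position, a resting (calibration) position, and a gain factor; the function name, parameters, and values are illustrative, not the authors’ implementation.

```python
def head_to_pointer(head_x, head_y, rest_x, rest_y,
                    gain=3.0, screen_w=1024, screen_h=768):
    """Map a normalized head position (0..1 within the camera frame)
    to absolute pointer coordinates on the device screen."""
    # Displacement from the user's resting head position.
    dx = head_x - rest_x
    dy = head_y - rest_y
    # Amplify small head motions and centre the pointer on the screen.
    px = screen_w / 2 + gain * dx * screen_w
    py = screen_h / 2 + gain * dy * screen_h
    # Clamp so the pointer never leaves the screen.
    px = min(max(px, 0.0), screen_w - 1)
    py = min(max(py, 0.0), screen_h - 1)
    return px, py

# A small head shift to the right moves the pointer right of centre.
print(head_to_pointer(0.55, 0.50, 0.50, 0.50))
```

The gain parameter is the kind of setting the tuning study could adjust: a higher gain lets users with a reduced range of head motion reach the screen edges, at the cost of pointing precision.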


Keywords: Natural user interface · Vision-based interface · Head-tracker · Accessibility · Assistive technology · Motor impairments · Mobile device · Evaluation



We acknowledge the Agencia Estatal de Investigación (AEI) and the European Regional Development Fund (ERDF) for their support of the projects TIN2012-35427 (AEI/ERDF, EU) and TIN2016-81143-R (AEI/FEDER, UE) and of the grant FPI BES-2013-064652. We thank all the volunteers who participated in this study and the ABDEM staff for their support.



Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Department of Mathematics and Computer Science, Universitat de les Illes Balears, Palma, Spain
