INTERACT 2017: Human-Computer Interaction - INTERACT 2017, pp. 155-172
Blind FLM: An Enhanced Keystroke-Level Model for Visually Impaired Smartphone Interaction
Abstract
The Keystroke-Level Model (KLM) is a predictive model that numerically estimates how long an expert user takes to accomplish a task. KLM has been used successfully to model conventional interactions; however, it does not adequately capture smartphone touch interactions or accessible interfaces (e.g., screen readers). The Fingerstroke-Level Model (FLM), on the other hand, extends KLM to describe and assess mobile game applications, which makes it a candidate model for predicting smartphone touch interactions.
This paper further extends FLM for visually impaired smartphone users. An initial user study identified the basic elements of blind users’ interactions, which were used to extend FLM; the new model is called “Blind FLM”. A further user study was then conducted to determine whether the new model can describe blind users’ touch interactions with a smartphone, and to compute its accuracy. The evaluation showed that Blind FLM predicts blind users’ performance with an average error of 2.36%.
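As background for the model described above, a KLM prediction decomposes a task into primitive operators and sums a per-operator time estimate. The sketch below illustrates this with the classic operator values from Card, Moran and Newell (1980); the task sequence itself is a made-up example, not one from this paper.

```python
# Illustrative Keystroke-Level Model (KLM) sketch: predicted expert task
# time is the sum of per-operator time estimates (values in seconds,
# from Card, Moran and Newell, 1980).
OPERATOR_TIMES = {
    "K": 0.28,  # keystroke (average non-secretarial typist)
    "P": 1.10,  # point at a target with a pointing device
    "H": 0.40,  # home hands between keyboard and pointing device
    "M": 1.35,  # mental preparation
}

def klm_predict(operators):
    """Sum the operator time estimates to predict task time in seconds."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical task: mentally prepare, point at a field, home to the
# keyboard, then type four characters.
task = ["M", "P", "H", "K", "K", "K", "K"]
print(round(klm_predict(task), 2))  # 3.97
```

Extensions such as FLM and the Blind FLM proposed here keep this additive structure but replace or augment the operator set (and its time estimates) to match the target interaction style.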
Keywords
Keystroke-Level Model (KLM) · Fingerstroke-Level Model (FLM) · Mobile phone · Smartphone · Mobile KLM · Touch interaction · Visually impaired users · Blind users