Remote Usability Testing of Data Input Methods for Web Applications

  • Krzysztof Osada
  • Patient Zihisire Muke
  • Mateusz Piwowarczyk
  • Zbigniew Telec
  • Bogdan Trawiński
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12033)


The main purpose of the paper was to conduct a comparative analysis and examine the usability of selected methods and patterns of data entry in web systems and websites. A dedicated web application was developed as an experimental tool for conducting remote unmoderated usability tests. The data entry design patterns implemented and tested with real users included various alignments of form labels and the entry of logical values, small numbers, dates, and times. The metrics collected during the tests comprised time to complete a task, number of mouse clicks, number of errors committed, the Single Ease Question survey, and closed and open questions on the subjective assessment of the tested patterns. Based on the collected results, recommendations for the best patterns and methods of data entry in the specific context of use were formulated.


Keywords: User experience · Remote usability testing · Data input · Design patterns · Web applications



Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Applied Informatics, Wrocław University of Science and Technology, Wrocław, Poland
