A Systematic Map of Mobile Software Usability Evaluation

  • Karima Moumane
  • Ali Idri
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 746)


Usability evaluation is currently considered critical to the success of mobile interactive applications. This paper presents a Systematic Mapping Study (SMS) conducted to investigate the literature on Mobile Usability Evaluation (MUE) techniques. The mapping study classifies the selected studies according to the following criteria: research approach, research type, research domain, usability evaluation method, data collection tool, type of questionnaire used in the empirical evaluations, and software quality (SQ) model. Publication channels and trends were also identified, and 81 MUE papers were selected.


Keywords: Systematic Mapping Study · Mobile Usability · Software quality



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Software Project Management Research Team, ENSIAS, Mohammed V University, Rabat, Morocco
