Exploring Mobile User Experience Through Code Quality Metrics

  • Gerardo Canfora
  • Andrea Di Sorbo
  • Francesco Mercaldo
  • Corrado Aaron Visaggio
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10027)


Smartphones have been absorbed into everyday life at an astounding rate, and continue to become more and more widely used. Much of the success of the mobile paradigm can be attributed to the discovery of a huge market. Users may pick from a large collection of software, in domains ranging from games to productivity. Each platform makes installing and removing apps very simple, further encouraging users to try new software. Smartphone users may download applications from the official Google Play market, but those applications do not pass any review process and can be downloaded very shortly after submission. Google Play does not offer any mechanism to assure the user of the quality of an installed app, and this is particularly true for user experience: the user simply downloads and runs the application. In this paper we propose a feature set for evaluating the code quality of Android applications, in order to understand how user experience varies in the mobile ecosystem. Our findings show that developers need to focus on software quality in order to make their applications usable from the user's point of view.


Keywords: Software quality · Mobile applications · User experience



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Gerardo Canfora (2)
  • Andrea Di Sorbo (2)
  • Francesco Mercaldo (1)
  • Corrado Aaron Visaggio (2)

  1. Institute for Informatics and Telematics, National Research Council of Italy (CNR), Pisa, Italy
  2. Department of Engineering, University of Sannio, Benevento, Italy
