Smartphone Applications Usability Evaluation: A Hybrid Model and Its Implementation

  • Artur H. Kronbauer
  • Celso A. S. Santos
  • Vaninha Vieira
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7623)


Evaluating the usability of smartphone applications is crucial to their success, since it lets developers adapt them to the dynamics of mobile scenarios. The HCI community recommends considering several requirements when evaluating such applications: quantitative data (metrics), subjective evaluation (users’ impressions), and context data (e.g., environment and device conditions). We observed a lack of approaches in the literature that combine these three requirements in a single experiment; generally only one or two of them are used. Moreover, usability evaluation in real mobile scenarios is hard to achieve, and most proposals rely on laboratory-controlled experiments. In this paper, we present a proposal for hybrid usability evaluation of smartphone applications, composed of a model and an infrastructure that implements it. The model describes how to automatically monitor and collect context data and usability metrics, how those data can be processed to support analysis, and how users’ impressions can be collected. The infrastructure implements the model and can be plugged into any Android-based smartphone application. To evaluate the proposal, we performed a field experiment in which 21 users used three mobile applications in their day-to-day scenarios over a 6-month period.
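The kind of automatic monitoring the abstract describes can be pictured as a lightweight logger that records each interaction event together with a snapshot of the context, and later derives quantitative metrics from the log. The sketch below is illustrative only, assuming hypothetical names (`UsageLogger`, `Event`, `taskDurationMs`) that do not come from the paper, and it omits the Android hooking mechanism (e.g., aspect-oriented instrumentation) that the actual infrastructure would use.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: logs interaction events with context data and
// computes a simple usability metric (task completion time).
// All names here are hypothetical, not taken from the paper.
public class UsageLogger {
    // One captured interaction event plus its context snapshot.
    static class Event {
        final String widget;     // UI component the user interacted with
        final long timestampMs;  // when the interaction happened
        final String context;    // e.g., network type, battery level
        Event(String widget, long timestampMs, String context) {
            this.widget = widget;
            this.timestampMs = timestampMs;
            this.context = context;
        }
    }

    private final List<Event> events = new ArrayList<>();

    public void log(String widget, long timestampMs, String context) {
        events.add(new Event(widget, timestampMs, context));
    }

    // Quantitative metric: elapsed time between the first and last
    // logged event of a task.
    public long taskDurationMs() {
        if (events.size() < 2) return 0;
        return events.get(events.size() - 1).timestampMs
             - events.get(0).timestampMs;
    }

    public int eventCount() {
        return events.size();
    }

    public static void main(String[] args) {
        UsageLogger logger = new UsageLogger();
        logger.log("btnSearch", 1000, "wifi,battery=80%");
        logger.log("listResult", 4500, "wifi,battery=80%");
        System.out.println(logger.taskDurationMs()); // prints 3500
    }
}
```

In a real deployment, the `log` calls would be injected into the application rather than written by hand, so that metrics and context are captured transparently during everyday use, as the field experiment in the paper requires.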


Keywords: Usability Evaluation · Smartphone Application · Remote Usability Evaluation · Usability Testing



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Artur H. Kronbauer (1)
  • Celso A. S. Santos (2)
  • Vaninha Vieira (3)
  1. PMCC – UFBA, Salvador, Brazil
  2. DI – CT – UFES, Vitória, Brazil
  3. DCC – UFBA, Salvador, Brazil
