
Human-Computer Interaction – INTERACT 2015, pp. 203–210

On Applying Experience Sampling Method to A/B Testing of Mobile Applications: A Case Study

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9297)

Abstract

With the advent of mobile devices, the experience sampling method (ESM) is increasingly used as a convenient and effective way to capture the behavior of users of mobile, context-dependent applications and to evaluate such applications. Like any field-based in situ testing method, ESM is prone to bias from unreliable and unbalanced data, especially in A/B testing situations. Mitigating these effects can in turn incur significant costs in terms of the number of participants and sessions and prolonged experimental time. In fact, ESM has rarely been applied to A/B testing, nor does the existing literature reveal its operational details and difficulties. In this paper, as a step toward establishing concrete guidelines, we describe a case study of applying ESM to the evaluation of two competing interfaces for a mobile application. Based on the gathered data and direct interviews with the participants, we highlight the difficulties experienced and the lessons learned. In addition, to overcome these difficulties, we propose a new form of ESM in which the experimental parameters are dynamically reconfigured based on intermediate experimental results.
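
The proposed adaptive ESM is described here only at a high level. The sketch below is a hypothetical illustration, not the authors' implementation, of one way such dynamic reconfiguration could work: an in-app scheduler that decides which interface variant each prompt probes based on how many valid responses each variant has collected so far. The class name, the per-variant target, and the rebalancing rule are all assumptions made for illustration.

    import random

    # Hypothetical sketch of an adaptive ESM scheduler for A/B testing.
    # Experimental parameters (which variant a prompt probes, and when to
    # stop prompting) are adjusted from intermediate results; names and
    # logic are illustrative assumptions, not the paper's implementation.

    class AdaptiveESMScheduler:
        def __init__(self, target_per_variant=30):
            self.target = target_per_variant            # valid responses wanted per variant
            self.valid_responses = {"A": 0, "B": 0}     # intermediate results

        def next_variant(self):
            """Choose which interface variant the next ESM prompt probes,
            favoring the variant that is lagging in valid responses."""
            a, b = self.valid_responses["A"], self.valid_responses["B"]
            if a == b:
                return random.choice(["A", "B"])
            return "A" if a < b else "B"

        def record_response(self, variant, is_valid):
            """Count only valid (complete, in-context) responses, since
            unreliable answers are a main source of bias in field ESM data."""
            if is_valid:
                self.valid_responses[variant] += 1

        def finished(self):
            """Stop prompting once both variants reach the target sample size."""
            return all(n >= self.target for n in self.valid_responses.values())

In this sketch, rebalancing prompts toward the lagging variant counteracts unbalanced data without recruiting additional participants or extending the study solely to compensate for unusable responses.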

Keywords

Experience sampling method (ESM) · A/B testing · Usability


Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  1. Digital Experience Laboratory, Korea University, Seoul, Korea
