
Human-Computer Interaction – INTERACT 2015, pp. 1–19

Assisted Interaction Data Analysis of Web-Based User Studies

  • Xabier Valencia
  • J. Eduardo Pérez
  • Unai Muñoz
  • Myriam Arrue
  • Julio Abascal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9296)

Abstract

User behaviour analysis requires defining experimental sessions with numerous participants. In this context, specifying an experiment is a demanding task, as several issues have to be considered, such as the type of experiment, the type and number of tasks, the definition of questionnaires, and the user interaction data to be gathered. Analysing the collected data is also complex and often requires repeatedly examining recorded interaction videos. In order to deal with these tasks, we present a platform called RemoTest, which assists researchers in specifying and conducting experimental sessions as well as in gathering and analysing interaction data. This platform has been applied to define several formal user studies on the web and has assisted researchers in detecting the main interaction characteristics of different user profiles and settings.
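The kind of interaction data described above (e.g. cursor traces logged during web tasks) is typically reduced to summary measures before analysis. The sketch below is a minimal, hypothetical illustration of that step, assuming logged pointer samples as `(timestamp_ms, x, y)` tuples; it does not reflect RemoTest's actual data format or API.

```python
from math import hypot

def cursor_metrics(events):
    """Compute simple pointer metrics from a list of (timestamp_ms, x, y) samples.

    Returns total path length (pixels), task duration (ms), and mean speed
    (pixels/ms) -- typical low-level measures in web-based user studies.
    """
    if len(events) < 2:
        return {"path_length": 0.0, "duration_ms": 0, "mean_speed": 0.0}
    # Sum Euclidean distances between consecutive cursor samples.
    path = sum(hypot(x2 - x1, y2 - y1)
               for (_, x1, y1), (_, x2, y2) in zip(events, events[1:]))
    duration = events[-1][0] - events[0][0]
    return {"path_length": path,
            "duration_ms": duration,
            "mean_speed": path / duration if duration else 0.0}

# Example: a short hypothetical cursor trace
trace = [(0, 0, 0), (100, 30, 40), (250, 60, 80)]
print(cursor_metrics(trace))
```

Aggregating such per-task metrics across participants is one way the interaction characteristics of different user profiles can be compared.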

Keywords

Web accessibility, User testing, User behaviour, Accessibility in use


Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Xabier Valencia
  • J. Eduardo Pérez
  • Unai Muñoz
  • Myriam Arrue
  • Julio Abascal

All authors: EGOKITUZ: Laboratory of HCI for Special Needs, University of the Basque Country (UPV/EHU), Informatika Fakultatea, Donostia, Spain
