User Studies in Visualization: A Reflection on Methods

Chapter

Abstract

In this chapter I reflect on many years of running user studies in visualization, examining how effectively different methodological approaches worked for different goals. I first introduce my own categorization of user studies based on their major goals (understanding versus evaluation, each with specific subcategories) and on common methodological approaches (quantitative experiments, qualitative observational studies, and inspection and usability studies), providing examples of each combination. I then draw on examples from my own experience to reflect on the strengths and weaknesses of each methodological approach.

Keywords

User Study · Visualization Tool · Visualization Technique · Interaction Technique · Quantitative Experiment

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

1. Department of Computer Science, University of Victoria, Victoria, Canada