Evaluating Visualization Environments: Cognitive, Social, and Cultural Perspectives

Chapter in: Handbook of Human Centric Visualization

Abstract

Computer-based visualization environments enable their users to create, manipulate, and explore visual representations of data, information, processes, and phenomena. They play a prominent role in the practices and education of many science, technology, engineering, and mathematics (STEM) communities. There is a growing need to evaluate such environments empirically, not only to ensure that they are effective, but also to better understand how and why they are effective. How does one empirically evaluate the effectiveness of a visualization environment? I argue that choosing an approach is a matter of finding the right perspective for viewing human use of the visualization environment. This chapter presents three alternative perspectives—Cognitive, Social, and Cultural—each of which is distinguished by its own intellectual tradition and guiding theory. In so doing, the chapter has three broad goals: (a) to illustrate that different research traditions and perspectives lead to different definitions of effectiveness; (b) to show that, depending upon the research questions of interest and the situations in which a visualization environment is being used, each perspective can prove more or less useful in approaching the empirical evaluation of the environment; and (c) to provide visualization researchers with a repertoire of evaluation methods to draw from, and guidelines for matching research methods to research questions of interest.


Notes

  1.

    Throughout this chapter, I use the term visualization to refer to an external representation of a phenomenon, process, idea, or data set. I use the term visualization environment to refer to a computer-based software tool that enables one to view, interact with, and explore such an external representation.

  2.

    Throughout this chapter, I will capitalize the three perspectives in order to emphasize that I am not using them as general terms, but rather in the specific senses defined in this chapter.

  3.

    Hence, the use of the term informant (as opposed to, say, subject) is deliberate; it underscores the fact that participants in a Consensus Study are informing the researcher of their culture, rather than the researcher subjecting them to a test.


Author information

Correspondence to Christopher D. Hundhausen.


Copyright information

© 2014 Springer Science+Business Media New York

Cite this chapter

Hundhausen, C.D. (2014). Evaluating Visualization Environments: Cognitive, Social, and Cultural Perspectives. In: Huang, W. (ed.) Handbook of Human Centric Visualization. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-7485-2_5

  • Publisher: Springer, New York, NY

  • Print ISBN: 978-1-4614-7484-5

  • Online ISBN: 978-1-4614-7485-2
