Higher Education, Volume 49, Issue 4, pp 413–430

Windows into the mind

  • Richard J. Shavelson
  • Maria Araceli Ruiz-Primo
  • Edward W. Wiley


As faculty, our goals for students are often tacit, hidden not only from students but from ourselves as well. We present a conceptual framework for considering teaching goals – what we want our students to achieve – that encourages us to think more broadly about what we mean by achieving in our knowledge domains. This framework includes declarative knowledge (“knowing that”), procedural knowledge (“knowing how”), schematic knowledge (“knowing why”) and strategic knowledge (“knowing when, where and how our knowledge applies”). We link the framework to a variety of assessment methods and focus on assessing the structure of declarative knowledge – knowledge structure. From prior research, we know that experts and knowledgeable students have extensive, well-structured declarative knowledge; novices do not. We then present two different techniques for assessing knowledge structure – cognitive and concept maps, and a combination of the two – and provide evidence on their technical quality. We show that these maps provide a window into the structure of students’ declarative knowledge not otherwise tapped by typical pencil-and-paper tests. These maps provide us with new teaching goals and new evidence on student learning.
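One common way to quantify the structure of declarative knowledge with concept maps is a convergence-style score: the proportion of an expert (criterion) map's propositions that a student's map reproduces. The sketch below is illustrative only; the map contents, function names, and scoring rule are hypothetical examples for this summary, not the article's actual instruments.

```python
# Illustrative sketch: scoring a student's concept map against a criterion
# (expert) map by proposition overlap -- a convergence-style score.
# All map contents below are hypothetical, not taken from the article.

def propositions(edges):
    """Normalize labeled edges (concept, link, concept) into an
    order-independent set of propositions."""
    return {(frozenset((a, b)), label.lower()) for a, label, b in edges}

def convergence_score(student_edges, criterion_edges):
    """Proportion of criterion propositions present in the student map."""
    student = propositions(student_edges)
    criterion = propositions(criterion_edges)
    return len(student & criterion) / len(criterion)

# Hypothetical criterion map for a small chemistry topic.
criterion = [
    ("atom", "is part of", "molecule"),
    ("molecule", "is held together by", "bond"),
    ("bond", "stores", "energy"),
    ("atom", "contains", "electron"),
]

# Hypothetical student map: reproduces three of the four propositions.
student = [
    ("molecule", "is part of", "atom"),  # same concept pair, order-free
    ("bond", "stores", "energy"),
    ("atom", "contains", "electron"),
]

print(convergence_score(student, criterion))  # 0.75
```

A score like this captures only proposition overlap; richer scoring schemes also weight proposition quality or map hierarchy.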


Keywords: Conceptual framework · Assessment method · Student learning · Knowledge domain · Knowledge structure





Copyright information

© Springer 2005

Authors and Affiliations

  • Richard J. Shavelson (1)
  • Maria Araceli Ruiz-Primo (1)
  • Edward W. Wiley (2)
  1. School of Education, Stanford University, Stanford, USA
  2. McKinsey & Company, Inc., USA
