Abstract
The assessment of competence has emerged as a critical issue as nations seek to develop their education systems, their workforces, and their citizens' capacity for lifelong learning. While the assessment of competence may be critical to national well-being in the 21st century, as Hartig, Klieme and Leutner (2008, p. v) pointed out, "the theoretical modeling of competencies, their assessment, and the usage of assessment results in practice present new challenges for psychological and educational research." The purpose of this paper is to move the field forward by providing one possible model for assessing competence. The model presented here underlies the Collegiate Learning Assessment (CLA). Drawing on lessons from the CLA, the model was used to develop a concrete prototype for assessing business-planning competence.
References
Case, R. (1996). Changing views of knowledge and their impact on educational research and practice. In: D. R. Olson & N. Torrance (Eds.): Handbook of education and human development: New models of learning, teaching, and schooling. Oxford: Blackwell
Ericsson, K. A. & Simon, H. A. (1993). Protocol analysis: Verbal reports as data. Cambridge, MA: MIT Press
Hartig, J.; Klieme, E. & Leutner, D. (Eds.) (2008). Assessment of Competencies in Educational Contexts: State of the Art and Future Prospects. Göttingen: Hogrefe & Huber
Klein, S. (2007). Characteristics of hand and machine-assigned scores to college students’ answers to open-ended tasks. In: D. Nolan & T. Speed (Eds.): Probability and Statistics: Essays in Honor of David A. Freedman. IMS Collections, Vol. 2. Beachwood, OH: Institute for Mathematical Statistics
Klein, S.; Benjamin, R.; Shavelson, R. et al. (2007). The collegiate learning assessment: Facts and fantasies. Evaluation Review, 31(5), 415–439
Klein, S.; Freedman, D.; Shavelson, R. et al. (2008). Assessing school effectiveness. Evaluation Review, 32, 511–525
Klein, S. P.; Kuh, G. D.; Chun, M. et al. (2005). An approach to measuring cognitive outcomes across higher-education institutions. Journal of Higher Education, 46(3), 251–276
Li, M.; Ruiz-Primo, M. A. & Shavelson, R. J. (2006). Towards a science achievement framework: The case of TIMSS 1999. In: S. Howie & T. Plomp (Eds.): Contexts of learning mathematics and science: Lessons learned from TIMSS. London: Routledge
McClelland, D. C. (1973). Testing for competence rather than testing for "intelligence". American Psychologist, 28(1), 1–14
Shavelson, R. J. (2007a). Assessing student learning responsibly: From history to an audacious proposal. Change (January/February), 26–33
Shavelson, R. J. (2007b). A Brief History of Student Learning: How We Got Where We Are and a Proposal for Where to Go Next. Washington, DC: Association of American Colleges and Universities
Shavelson, R. J. (2008a). Reflections on quantitative reasoning: An assessment perspective. In: B. L. Madison & L. A. Steen (Eds.): Calculation vs. context: Quantitative literacy and its implications for teacher education. Washington, DC: Mathematical Association of America.
Shavelson, R. J. (2008b). The Collegiate Learning Assessment. Forum for the Future of Higher Education: Ford Policy Forum. Cambridge, MA
Shavelson, R. J. (2010a). Measuring college learning responsibly: Accountability in a new era. Stanford, CA: Stanford University Press
Shavelson, R. J. (2010b). On the measurement of competency. Empirical Research in Vocational Education and Training, 2(1), 43–65
Shavelson, R. J. (in press). An approach to testing and modeling competence. In: O. Troitschanskaia & S. Bloemeke (Eds.): Modeling and Measurement of Competencies in Higher Education. Rotterdam: Sense
Shavelson, R. J.; Baxter, G. P. & Gao, X. (1993). Sampling variability of performance assessments. Journal of Educational Measurement, 30(3), 215–232
Shavelson, R. J. & Huang, L. (2003). Responding responsibly to the frenzy to assess learning in higher education. Change, 35(1), 10–19
Shavelson, R. J.; Ruiz-Primo, M. A. & Wiley, E. W. (1999). Note on sources of sampling variability in science performance assessments. Journal of Educational Measurement, 36 (1), 61–71
Shavelson, R. J. & Webb, N. M. (1981). Generalizability theory: 1973–1980. British Journal of Mathematical and Statistical Psychology, 34, 133–166
Shavelson, R. J. & Webb, N. M. (1991). Generalizability theory: A primer. Newbury Park, CA: Sage
Webb, N. M.; Shavelson, R. J. & Haertel, E. H. (2007). Reliability coefficients and generalizability theory. Handbook of Statistics, 26, 81–124
Weinert, F. E. (2001). Concept of Competence: A Conceptual Clarification. In: D. S. Rychen & L. H. Salganik (Eds.): Defining and Selecting Key Competencies. Göttingen: Hogrefe & Huber
Wigdor, A. K. & Green, B. F. Jr. (Eds.) (1991). Performance assessment for the workplace (Vol. I). Washington, DC: National Academy Press
Additional information
This paper was written while I was a Humboldt Fellow at the Institute for Human Resource Education and Management, Munich School of Management, Ludwig-Maximilians-Universität. I wish to thank both the Humboldt Foundation and the Institute, especially Prof. Dr. Susanne Weber, for their support.
Cite this article
Shavelson, R.J. Assessing business-planning competence using the Collegiate Learning Assessment as a prototype. Empirical Res Voc Ed Train 4, 77–90 (2012). https://doi.org/10.1007/BF03546509