Abstract
This chapter outlines the procedures for calibrating and establishing the properties of the collaborative problem solving tasks in the ATC21S™ project (the acronym ATC21S™ has been globally trademarked; for simplicity it is presented throughout the chapter as ATC21S). The chapter deals with the interpretation of these tasks and provides an outline of how they were used, discussing the data they yielded, the interpretation of the CPS construct and the calculation of the student skill levels measured. Using item response theory, the tasks were calibrated separately and jointly. One- and two-parameter item response models were used to explore the data and to determine dimensionality. The data were analysed on one, two and five dimensions, corresponding to the theoretical components of the collaborative problem solving construct. Tasks were calibrated in sets of three, and these sets were used to establish that there were no significant differences between countries in the difficulty of the items. The difference in mean latent ability between Student A and Student B was also analysed, and it was concluded that students were neither advantaged nor disadvantaged by adopting either role. The task calibrations were used to determine the hierarchy of the indicators and to describe student competency levels as measured by the tasks. Skills progressions were created for one, two and five possible dimensions as interpretations of the collaborative problem solving continuum. In this chapter we describe the methods used to develop the progressions from novice to expert, which provide a framework for teachers to use in interpreting their observations of student behaviour regarding collaborative problem solving.
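The one- and two-parameter item response models mentioned in the abstract can be illustrated with a minimal sketch. This is not the project's calibration code (the chapter's analyses relied on specialised IRT software); the function below merely shows the logistic response form that both models share, where setting the discrimination to 1.0 reduces the two-parameter model to the Rasch (one-parameter) model:

```python
import math

def irt_probability(theta: float, difficulty: float,
                    discrimination: float = 1.0) -> float:
    """Probability of a correct response under a 1PL/2PL IRT model.

    theta          -- the student's latent ability (logits)
    difficulty     -- the item's difficulty parameter (logits)
    discrimination -- item slope; 1.0 gives the Rasch model
    """
    return 1.0 / (1.0 + math.exp(-discrimination * (theta - difficulty)))

# When ability equals item difficulty, the success probability is 0.5;
# higher ability relative to difficulty raises the probability.
p_equal = irt_probability(theta=0.0, difficulty=0.0)   # 0.5
p_above = irt_probability(theta=2.0, difficulty=0.0)   # > 0.5
```

Calibration, in these terms, means estimating the difficulty (and, for the two-parameter model, discrimination) values from observed response data; the estimated difficulties then order the indicators into the hierarchy the chapter describes.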
Copyright information
© 2015 Springer Science+Business Media Dordrecht
Cite this chapter
Griffin, P., Care, E., Harding, SM. (2015). Task Characteristics and Calibration. In: Griffin, P., Care, E. (eds) Assessment and Teaching of 21st Century Skills. Educational Assessment in an Information Age. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-9395-7_7
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-017-9394-0
Online ISBN: 978-94-017-9395-7
eBook Packages: Humanities, Social Sciences and Law; Education (R0)