Abstract
This chapter presents a quantitative, educational-measurement analysis of a multiple-choice test used in an introductory economics course at a Hong Kong university. The test was administered in an undergraduate course on elementary economics with an enrolment of over 300 first-year students from various engineering disciplines. A Rasch analysis showed how the assessment provided information for evaluating students’ understanding of economics concepts at the end of the course. The quality of the test for assessing student mastery of those economic concepts was investigated and reported. The chapter discusses how university instructors can use such assessment evidence to support more targeted and effective teaching and to gain a better understanding of student mastery, and closes with recommendations and implications for future use of the assessment information.
Copyright information
© 2020 Springer Nature Singapore Pte Ltd.
Cite this chapter
Chow, J., Shiu, A. (2020). Examining an Economics Test to Inform University Student Learning Using the Rasch Model. In: Khine, M. (eds) Rasch Measurement. Springer, Singapore. https://doi.org/10.1007/978-981-15-1800-3_7
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-1799-0
Online ISBN: 978-981-15-1800-3
eBook Packages: Education (R0)