
A Community-Approach to Item Calibration for Testing Math-Skills in Engineering


Part of the Lecture Notes in Networks and Systems book series (LNNS, volume 389)

Abstract

Developing mathematical skills requires the opportunity to practice and to receive immediate, individualized feedback on the misconceptions or mistakes made during problem solving. Substantial progress has been made in recent years in the design of feedback systems for fundamental math education. Applied mathematics education for engineering disciplines, however, lacks a large body of examples with pre-worked solution paths and known difficulty, both of which are necessary for providing learners with (semi-)automated feedback. This is mostly due to the need for domain-specific, situated tasks, which are not as widely deployable as generic items. The effort required to design appropriate items, validate their appropriateness for specific learning outcomes, and calibrate their difficulty cannot be borne by individual teachers and is also hardly justifiable for commercial providers of item pools. In this paper, we show how these challenges can be addressed via a community approach to item design and calibration, supported by methods from the field of computerized adaptive testing.
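The item-difficulty calibration referred to in the abstract typically rests on item response theory. As a minimal, illustrative sketch (not taken from the paper itself; all names and data are hypothetical), the following Python snippet estimates one item's difficulty under the Rasch (1PL) model from the responses of learners whose ability estimates are already known:

```python
import math
import random

def p_correct(theta, b):
    """Rasch (1PL) model: probability that a learner with ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def calibrate_difficulty(abilities, responses, iters=25):
    """Newton-Raphson maximum-likelihood estimate of item difficulty b,
    given learners' ability estimates and their 0/1 responses."""
    b = 0.0
    for _ in range(iters):
        probs = [p_correct(t, b) for t in abilities]
        # Gradient of the log-likelihood w.r.t. b is sum(x - p); we track
        # its negative and add the Newton step accordingly.
        grad = sum(p - x for p, x in zip(probs, responses))
        info = sum(p * (1.0 - p) for p in probs)  # Fisher information
        if info < 1e-12:
            break
        b += grad / info
    return b

# Simulated calibration run (illustrative data only):
random.seed(42)
true_b = 0.5
abilities = [random.gauss(0.0, 1.0) for _ in range(2000)]
responses = [1 if random.random() < p_correct(t, true_b) else 0
             for t in abilities]
b_hat = calibrate_difficulty(abilities, responses)
print(round(b_hat, 2))  # close to the true difficulty of 0.5
```

In a community setting of the kind the paper describes, such an estimate would be refined continuously as more learners from different institutions attempt the item.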

Keywords

  • Mathematics
  • Computerized adaptive testing
  • Item pool calibration
  • Crowdsourcing
  • Technology-enhanced learning



Acknowledgements

The authors would like to thank Eva-Maria Infanger for providing the mock-ups of items in GeoGebra Classroom.

Author information

Correspondence to Nilay Aral.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Aral, N., Oppl, S. (2022). A Community-Approach to Item Calibration for Testing Math-Skills in Engineering. In: Auer, M.E., Hortsch, H., Michler, O., Köhler, T. (eds) Mobility for Smart Cities and Regional Development - Challenges for Higher Education. ICL 2021. Lecture Notes in Networks and Systems, vol 389. Springer, Cham. https://doi.org/10.1007/978-3-030-93904-5_46
