
Modeling Unproductive Behavior in Online Homework in Terms of Latent Student Traits: An Approach Based on Item Response Theory

Published in: Journal of Science Education and Technology

Abstract

Homework is an important component of most physics courses. One of the functions it serves is to provide meaningful formative assessment in preparation for examinations. However, correlations between homework and examination scores tend to be low, likely due to unproductive student behavior such as copying and random guessing of answers. In this study, we attempt to model these two counterproductive learner behaviors within the framework of Item Response Theory in order to provide an ability measurement that strongly correlates with examination scores. We find that introducing additional item parameters leads to worse predictions of examination grades, while introducing additional learner traits is a more promising approach.
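For readers unfamiliar with the framework named in the abstract, the following sketch illustrates, in plain Python, the standard two-parameter logistic (2PL) item response function and one hypothetical way an additional learner trait — here a per-student guessing propensity `gamma` — could enter the response probability. This parameterization is illustrative only and is not the model developed in the paper; the functions and parameter names are assumptions for the sketch.

```python
import math

def p_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a student
    with ability theta answers correctly an item with discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def p_with_guessing_trait(theta, gamma, a, b):
    """Hypothetical extension: gamma in [0, 1] is a per-student guessing
    propensity (an additional learner trait, not an item parameter),
    which lifts the lower asymptote of the response curve for that
    student across all items."""
    return gamma + (1.0 - gamma) * p_2pl(theta, a, b)

# At theta == b, the 2PL probability is exactly 0.5.
print(p_2pl(0.0, a=1.2, b=0.0))  # 0.5

# A guessing-prone student (gamma = 0.25) never falls below 0.25,
# even far below the item's difficulty.
print(p_with_guessing_trait(-3.0, gamma=0.25, a=1.2, b=0.0))
```

Contrast this with the classical 3PL model, where the lower asymptote is an item parameter shared by all students; attaching it to the student instead is one way to express "additional learner traits" rather than "additional item parameters," the distinction the abstract's finding turns on.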



Author information

Correspondence to Gerd Kortemeyer.


Cite this article

Gönülateş, E., Kortemeyer, G. Modeling Unproductive Behavior in Online Homework in Terms of Latent Student Traits: An Approach Based on Item Response Theory. J Sci Educ Technol 26, 139–150 (2017). https://doi.org/10.1007/s10956-016-9659-8
