Mode Comparability Studies for a High-Stakes Testing Program

  • Conference paper
Quantitative Psychology (IMPS 2016)

Part of the book series: Springer Proceedings in Mathematics & Statistics (PROMS, volume 196)


Abstract

Mode comparability between paper and online versions of a test cannot simply be assumed. This paper presents the research designs, statistical analyses, and major findings from a series of special studies intended to ensure score comparability for a high-stakes testing program, including an online timing study and two mode comparability studies, as well as a general framework that guided the design of these studies. The framework views score comparability as a matter of degree and the evaluation of score comparability as a matter of score validation. The high-stakes uses of the test scores required stringent score comparability, which was achieved by applying test-equating methodologies under a randomly equivalent groups design. In addition, score equivalency and construct equivalency were examined through statistical analyses of test results and of responses to survey questions. The comparability framework and the results of these studies may provide guidance for other testing programs transitioning from paper to online, or for evaluating score comparability in general.
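The abstract mentions equating under a randomly equivalent groups design but does not detail the method. As a generic illustration only (not the paper's procedure, and with entirely hypothetical scores), a linear-equating sketch under that design matches the mean and standard deviation of the new form to those of the reference form:

```python
# Illustrative sketch (not from the paper): linear equating under a
# randomly equivalent groups design. Raw scores on the new form X are
# placed on the scale of the reference form Y via
#     eq(x) = (sd_Y / sd_X) * (x - mean_X) + mean_Y
from statistics import mean, pstdev

def linear_equate(x_scores, y_scores):
    """Return a function mapping form-X raw scores onto the form-Y scale."""
    mx, my = mean(x_scores), mean(y_scores)
    sx, sy = pstdev(x_scores), pstdev(y_scores)
    slope = sy / sx
    return lambda x: slope * (x - mx) + my

# Hypothetical raw scores from two randomly equivalent examinee groups,
# one taking the online form and one taking the paper form.
online_group = [12, 15, 18, 20, 22, 25, 27, 30]
paper_group = [14, 16, 19, 22, 24, 26, 29, 32]

eq = linear_equate(online_group, paper_group)
# By construction, a raw score at the online-group mean maps to the
# paper-group mean.
```

Under random assignment the two groups are assumed equivalent in ability, so any difference in score distributions is attributed to the forms (or modes) rather than the examinees; operational programs would typically use equipercentile or IRT-based methods rather than this minimal linear form.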

Author information

Corresponding author

Correspondence to Dongmei Li.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Li, D., Yi, Q., Harris, D.J. (2017). Mode Comparability Studies for a High-Stakes Testing Program. In: van der Ark, L.A., Wiberg, M., Culpepper, S.A., Douglas, J.A., Wang, WC. (eds) Quantitative Psychology. IMPS 2016. Springer Proceedings in Mathematics & Statistics, vol 196. Springer, Cham. https://doi.org/10.1007/978-3-319-56294-0_31