Examinee Issues

  • Cynthia G. Parshall
  • Judith A. Spray
  • John C. Kalohn
  • Tim Davey
Part of the Statistics for Social and Behavioral Sciences book series (SSBS)

Abstract

A computer-based testing program has numerous psychometric needs, and it is easy to become so absorbed in these psychometric details that other essential points of view are lost. It is important to remember that every time an exam is given, there is a person on the other end of the computer screen, and to consider the reality of the testing experience for the examinee. Various steps can be taken to reduce examinee stress and make the experience more pleasant (or at least less unpleasant). Ideally, these steps will reduce confounding variance and help produce a test that is as fair as possible. Testing programs that are elective (i.e., that examinees are not required to take) have a particular need to ensure that the computerized testing experience is not onerous; there is little benefit in developing a wonderful test that no one chooses to take. This chapter briefly addresses some computerized testing issues from the examinees’ point of view.

Keywords

Mental model; Item type; Examinee issues; Computerized adaptive test; Adaptive test


Copyright information

© Springer Science+Business Media New York 2002

Authors and Affiliations

  • Cynthia G. Parshall, University of South Florida, Tampa, USA
  • Judith A. Spray, ACT, Inc., Iowa City, USA
  • John C. Kalohn, ACT, Inc., Iowa City, USA
  • Tim Davey, Educational Testing Service, Princeton, USA
