
Performance and duration differences between online and paper–pencil tests

Published in: Asia Pacific Education Review

Abstract

Digital technologies are increasingly used for measurement, and whether the test medium influences the test taker is an important question. The aim of this study is to investigate differences in students’ performance and test duration between online and paper–pencil tests. An online testing tool was developed and administered to determine these differences. The tool allows questions to be added to an online database in several formats: multiple choice (with four or five options), true–false, matching, fill in the blanks, multiple answer, short answer, and long answer; it also supports assembling tests and converting them to paper–pencil mode. A performance test was administered in both online and paper–pencil modes to junior students at a university in Turkey. In addition, the online testing tool developed within the study was evaluated by instructors with respect to usability, fitness for purpose, and design. Instructor and student questionnaires were developed to gather opinions on the online testing tool and online tests. Results showed no significant difference in performance between the online and paper–pencil tests. However, students spent more time on the online test than on the paper–pencil test. Students found the online testing tool easy to use and stated that the online medium is more comfortable than paper–pencil testing, but they complained about external noise, tiredness, and difficulty focusing in the online examination setting. Instructors generally appreciated the tool’s design and agreed that it serves its purpose.
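The abstract lists the item formats the tool supports but gives no implementation details. As a purely illustrative sketch (the class and field names below are assumptions, not taken from the paper), the listed item types and the conversion of a test into paper–pencil mode could be modeled as:

```python
from dataclasses import dataclass, field
from enum import Enum

class QuestionType(Enum):
    """Item formats named in the abstract."""
    MULTIPLE_CHOICE = "multiple_choice"   # four or five options, one correct
    TRUE_FALSE = "true_false"
    MATCHING = "matching"
    FILL_IN_BLANK = "fill_in_blank"
    MULTIPLE_ANSWER = "multiple_answer"   # more than one option correct
    SHORT_ANSWER = "short_answer"
    LONG_ANSWER = "long_answer"

@dataclass
class Question:
    qtype: QuestionType
    prompt: str
    options: list[str] = field(default_factory=list)
    answer_key: list[str] = field(default_factory=list)

@dataclass
class Test:
    title: str
    questions: list[Question] = field(default_factory=list)

    def to_paper_form(self) -> str:
        """Render the same item bank as a printable paper-pencil test."""
        lines = [self.title, ""]
        for i, q in enumerate(self.questions, 1):
            lines.append(f"{i}. {q.prompt}")
            for letter, opt in zip("abcde", q.options):
                lines.append(f"   {letter}) {opt}")
        return "\n".join(lines)

# Hypothetical two-item test, rendered for paper administration.
exam = Test("Sample Performance Test", [
    Question(QuestionType.MULTIPLE_CHOICE,
             "Which layer of the OSI model handles routing?",
             ["Physical", "Data link", "Network", "Transport"], ["Network"]),
    Question(QuestionType.TRUE_FALSE, "HTTP is a stateless protocol.",
             ["True", "False"], ["True"]),
])
print(exam.to_paper_form())
```

The point of the sketch is the single item bank: because every question lives in one database with a declared type, the same test can be delivered online or printed, which is the property the study's online/paper comparison depends on.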



Author information

Correspondence to Alper Bayazit.


About this article

Cite this article

Bayazit, A., Aşkar, P. Performance and duration differences between online and paper–pencil tests. Asia Pacific Educ. Rev. 13, 219–226 (2012). https://doi.org/10.1007/s12564-011-9190-9

