
Computer-Based Assessment: From Objective Tests to Automated Essay Grading. Now for Automated Essay Writing?

  • Conference paper

Part of the book series: Lecture Notes in Business Information Processing ((LNBIP,volume 20))

Abstract

Assessment of student learning is an important task undertaken by educators. However, it can be time-consuming and costly for humans to grade student work. Technology has been available to assist teachers in grading objective tests for several decades; however, these true-false and multiple-choice tests do not capture the deeper aspects of student learning. Essay writing can be used to assess this deeper learning, which includes a student's ability to synthesize his or her thoughts and to argue for propositions. Automated essay grading systems are now starting to be used in the educational sector with some success. They can reduce the cost of grading, and they also eliminate the inconsistencies found amongst human graders marking the same essay. The next development in essay processing technology is automated essay writing, which will present a new set of challenges for educators. Automatically generated essays may be difficult to detect, and students may be given credit for writing that does not reflect their true ability. An understanding of how these systems would work, and of the characteristics of the essays they generate, is therefore needed in order to detect them. This paper describes the components we believe an automated essay generator would need, and the results of building a prototype of the first of these components, the Gatherer.
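The abstract does not describe the Gatherer's design in detail. As a rough illustration only, the Python sketch below shows one way such a component might work: it downloads a set of web pages and keeps the paragraphs that mention the essay topic's keywords. The gather function, the ParagraphExtractor class, the example URLs, and the keyword-matching heuristic are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of a "Gatherer"-style component. All names and the
# keyword heuristic are assumptions made for illustration; the paper's
# prototype may work quite differently.
from html.parser import HTMLParser
from urllib.request import urlopen


class ParagraphExtractor(HTMLParser):
    """Collects the text of <p> elements from an HTML page."""

    def __init__(self):
        super().__init__()
        self.paragraphs = []
        self._in_p = False
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self._in_p = True
            self._buffer = []

    def handle_endtag(self, tag):
        if tag == "p" and self._in_p:
            self.paragraphs.append(" ".join(self._buffer).strip())
            self._in_p = False

    def handle_data(self, data):
        if self._in_p:
            self._buffer.append(data.strip())


def gather(topic_keywords, urls):
    """Return (url, paragraph) pairs whose text mentions any topic keyword."""
    keywords = [k.lower() for k in topic_keywords]
    relevant = []
    for url in urls:
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that cannot be fetched
        parser = ParagraphExtractor()
        parser.feed(html)
        for para in parser.paragraphs:
            if any(k in para.lower() for k in keywords):
                relevant.append((url, para))
    return relevant


if __name__ == "__main__":
    # Illustrative call; the topic and source list are placeholders.
    hits = gather(["automated essay grading"], ["https://example.org/essay-grading"])
    for source, text in hits[:5]:
        print(source, "->", text[:80])

A fuller generator would then need further components to organize the gathered material into an argument and render it as prose; the sketch covers only the gathering step mentioned in the abstract.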





Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Williams, R., Nash, J. (2009). Computer-Based Assessment: From Objective Tests to Automated Essay Grading. Now for Automated Essay Writing? In: Yang, J., Ginige, A., Mayr, H.C., Kutsche, R.-D. (eds) Information Systems: Modeling, Development, and Integration. UNISCON 2009. Lecture Notes in Business Information Processing, vol 20. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01112-2_22


  • DOI: https://doi.org/10.1007/978-3-642-01112-2_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-01111-5

  • Online ISBN: 978-3-642-01112-2

  • eBook Packages: Computer Science (R0)
