
Generating timely individualized feedback to support student learning of conceptual knowledge in Writing-To-Learn activities

Published in: Journal of Computers in Education

Abstract

As a pedagogical strategy, Writing-to-Learn uses writing to improve students’ understanding of course content, but most existing writing feedback systems focus on improving students’ writing skills rather than their conceptual development. In this article, we propose an automatic approach to generating individualized feedback by comparing knowledge representations extracted from lecture slides with those extracted from individual students’ writing assignments. The novelty of our approach lies in the feedback generation: to help students better assimilate new knowledge into their existing knowledge, their current knowledge is modeled as a set of matching concepts, and suggested concepts and concept relationships for inclusion are generated as feedback by combining two factors: the importance of feedback candidates in the domain knowledge and their relevance to the matching concepts. A total of 88 students were recruited to participate in a repeated-measures study. Results show that most participants felt the feedback they received was relevant (78.4%), easy to understand (82.9%), accurate (76.1%), and useful (79.5%); they also felt that the proposed system made it easier to study course concepts (80.7%) and was useful in learning course concepts (77.3%). Analyses of students’ submitted assignments reveal that more course concepts and concept relationships were included when they used the proposed system.
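The candidate-scoring idea described above can be illustrated with a small sketch. This is not the authors' implementation; all names (`suggest_concepts`, `bfs_distances`, the linear weighting `alpha`, and the use of inverse graph distance as a relevance proxy) are illustrative assumptions layered on the abstract's two stated factors, importance and relevance to the student's matching concepts.

```python
from collections import deque

def bfs_distances(graph, sources):
    """Hop-count distance from any of the student's matching concepts
    to every reachable concept in the domain concept graph."""
    dist = {s: 0 for s in sources}
    q = deque(sources)
    while q:
        u = q.popleft()
        for v in graph.get(u, []):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def suggest_concepts(domain_graph, importance, student_concepts,
                     alpha=0.5, top_k=3):
    """Rank domain concepts missing from the student's writing by a
    weighted combination of domain importance and relevance (here
    approximated as inverse graph distance -- an assumption)."""
    matching = [c for c in student_concepts if c in importance]
    candidates = [c for c in importance if c not in student_concepts]
    dist = bfs_distances(domain_graph, matching)
    scored = []
    for c in candidates:
        relevance = 1.0 / (1 + dist[c]) if c in dist else 0.0
        score = alpha * importance[c] + (1 - alpha) * relevance
        scored.append((score, c))
    return [c for _, c in sorted(scored, reverse=True)[:top_k]]
```

For example, a student who mentions only "recursion" would be prompted toward directly connected, high-importance neighbors (such as "base case") before more distant concepts. The weight `alpha` controls the trade-off between suggesting globally important concepts and concepts closest to what the student already knows.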



Author information

Corresponding author

Correspondence to Wei Xiong.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Ethical statement

The study was granted ethical approval by the Institutional Review Board (IRB) at the New Jersey Institute of Technology. All subjects gave informed consent to participate in the study. The data were collected and analyzed anonymously.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Liu, Y., Xiong, W., Xiong, Y. et al. Generating timely individualized feedback to support student learning of conceptual knowledge in Writing-To-Learn activities. J. Comput. Educ. 11, 367–399 (2024). https://doi.org/10.1007/s40692-023-00261-3

