Abstract
Students often find it challenging to write cohesive explanatory texts. Prior research has shown that concept map feedback which visualizes cohesion deficits in students' explanations helps them generate more cohesive texts. We conducted an experiment to investigate whether the accuracy of the information provided in such concept map feedback affected students' improvements in cohesion. Accordingly, we varied the accuracy of the information represented in the concept maps: students received either accurate concept map feedback, which depicted the real relations between concepts as well as the authentic cohesion gaps in their explanations, or inaccurate concept map feedback, which depicted randomly drawn relations and random cohesion gaps. In a baseline condition, students received no feedback. Students in the accurate feedback condition generated more cohesive explanations than students in the no-feedback condition, whereas students in the inaccurate feedback condition fell in between. Evidently, providing feedback is generally beneficial for enhancing students' writing; however, the accuracy of the feedback further determines the effectiveness of computer-generated concept maps.
References
Ainsworth, S. (2006). DeFT: a conceptual framework for considering learning with multiple representations. Learning and Instruction, 16(3), 183–198. https://doi.org/10.1016/j.learninstruc.2006.03.001.
Berlanga, A. J., Van Rosmalen, P., Boshuizen, H. P., & Sloep, P. B. (2012). Exploring formative feedback on textual assignments with the help of automatically created visual representations. Journal of Computer Assisted Learning, 28(2), 146–160. https://doi.org/10.1111/j.1365-2729.2011.00425.x.
Cho, K., & MacArthur, C. (2010). Student revision with peer and expert reviewing. Learning and Instruction, 20(4), 328–338. https://doi.org/10.1016/j.learninstruc.2009.08.006.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Erlbaum.
Concha, S., & Paratore, J. R. (2011). Local coherence in persuasive writing: An exploration of Chilean students’ metalinguistic knowledge, writing process, and writing products. Written Communication, 28(1), 34–69. https://doi.org/10.1177/0741088310383383.
Furr, R. M., & Rosenthal, R. (2003). Evaluating theories efficiently: The nuts and bolts of contrast analysis. Understanding Statistics: Statistical Issues in Psychology, Education, and the Social Sciences, 2(1), 33–67. https://doi.org/10.1207/S15328031US0201_03.
Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction, 20(4), 304–315. https://doi.org/10.1016/j.learninstruc.2009.08.007.
Graesser, A. C., Millis, K. K., & Zwaan, R. A. (1997). Discourse comprehension. Annual Review of Psychology, 48(1), 163–189. https://doi.org/10.1146/annurev.psych.48.1.163.
Graham, S., Harris, K., & Hebert, M. A. (2011). Informing writing: The benefits of formative assessment. A Carnegie Corporation Time to Act report. Washington, DC: Alliance for Excellent Education.
Hall, S. S., Kowalski, R., Paterson, K. B., Basran, J., Filik, R., & Maltby, J. (2014). Local text cohesion, reading ability and individual science aspirations: Key factors influencing comprehension in science classes. British Educational Research Journal, 41, 122–142. https://doi.org/10.1002/berj.3134.
Halliday, M. A. K., & Hasan, R. (2014). Cohesion in English. London: Longman.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112. https://doi.org/10.3102/003465430298487.
Hirst, J. M., Reed, F. D. D., & Reed, D. D. (2013). Effects of varying feedback accuracy on task acquisition: A computerized translational study. Journal of Behavioral Education, 22(1), 1–15. https://doi.org/10.1007/s10864-012-9162-0.
Kellogg, R. T., & Whiteford, A. P. (2009). Training advanced writing skills: The case for deliberate practice. Educational Psychologist, 44(4), 250–266. https://doi.org/10.1080/00461520903213600.
Kim, M. (2013). Concept map engineering: methods and tools based on the semantic relation approach. Educational Technology Research and Development, 61(6), 951–978. https://doi.org/10.1007/s11423-013-9316-3.
Lachner, A., Burkhart, C., & Nückles, M. (2017a). Formative computer-based feedback in the university classroom: Specific concept maps scaffold students’ writing. Computers in Human Behavior, 72(4), 459–469. https://doi.org/10.1016/j.chb.2017.03.008.
Lachner, A., Burkhart, C., & Nückles, M. (2017b). Mind the gap! Automated concept map feedback supports students in writing cohesive explanations. Journal of Experimental Psychology: Applied, 23(1), 29–46. https://doi.org/10.1037/xap0000111.
Lachner, A., & Nückles, M. (2015). Bothered by abstractness or engaged by cohesion? Experts’ explanations enhance novices’ deep-learning. Journal of Experimental Psychology: Applied, 21(1), 101–115. https://doi.org/10.1037/xap0000038.
Larkin, J. H., & Simon, H. A. (1987). Why a diagram is (sometimes) worth ten thousand words. Cognitive Science, 11(1), 65–100. https://doi.org/10.1111/j.1551-6708.1987.tb00863.x.
McNamara, D. S. (2013). The epistemic stance between the author and reader: A driving force in the cohesion of text and writing. Discourse Studies, 15(5), 579–595. https://doi.org/10.1177/1461445613501446.
McNamara, D. S., Crossley, S. A., Roscoe, R., Allen, L., & Dai, J. (2015). A hierarchical classification approach to automated essay scoring. Assessing Writing, 23(1), 35–59. https://doi.org/10.1016/j.asw.2014.09.002.
McNamara, D. S., & Kintsch, W. (1996). Learning from text: Effects of prior knowledge and text coherence. Discourse Processes, 22(3), 247–287. https://doi.org/10.1080/01638539609544975.
McNamara, D. S., Louwerse, M. M., McCarthy, P. M., & Graesser, A. C. (2010). Coh-metrix: Capturing linguistic features of cohesion. Discourse Processes, 47(4), 292–330. https://doi.org/10.1080/01638530902959943.
Molloy, E., & Boud, D. (2013). Feedback models for learning, teaching and performance. In J. M. Spector, D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (4th ed., pp. 413–424). New York: Springer.
Narciss, S. (2008). Feedback strategies for interactive learning tasks. In J. M. Spector, M. D. Merrill, J. J. G. van Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 125–144). Mahwah, NJ: Lawrence Erlbaum Associates.
Nathan, M. J., Rummel, N., & Hay, K. E. (2016). Growing the learning sciences: brand or big tent? Implications for graduate education. In M. A. Evans, M. J. Packer, & R. K. Sawyer (Eds.), Reflections on the learning sciences (pp. 191–209). Cambridge: Cambridge University Press.
National Commission on Writing. (2004). Writing: A ticket to work… or a ticket out. Retrieved from www.collegeboard.com.
Ozuru, Y., Briner, S., Best, R., & McNamara, D. S. (2010). Contributions of self-explanation to comprehension of high- and low-cohesion texts. Discourse Processes, 47(8), 641–667. https://doi.org/10.1080/01638531003628809.
Patchan, M. M., Schunn, C. D., & Correnti, R. J. (2016). The nature of feedback: How peer feedback features affect students’ implementation rate and quality of revisions. Journal of Educational Psychology. https://doi.org/10.1037/edu0000103.
Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2010). Highly integrated model assessment technology and tools. Educational Technology Research and Development, 58(1), 3–18. https://doi.org/10.1007/s11423-009-9119-8.
Rau, M. A., Michaelis, J. E., & Fay, N. (2015). Connection making between multiple graphical representations: A multi-methods approach for domain-specific grounding of an intelligent tutoring system for chemistry. Computers & Education, 82, 460–485. https://doi.org/10.1016/j.compedu.2014.12.009.
Reilly, E. D., Stafford, R. E., Williams, K. M., & Corliss, S. B. (2014). Evaluating the validity and applicability of automated essay scoring in two massive open online courses. The International Review of Research in Open and Distributed Learning. https://doi.org/10.19173/irrodl.v15i5.1857.
Roscoe, R., & McNamara, D. (2013). Writing Pal: Feasibility of an intelligent writing strategy tutor in the high school classroom. Journal of Educational Psychology, 105, 1010–1025. https://doi.org/10.1037/a0032340.
Rowan, K. E. (1988). A contemporary theory of explanatory writing. Written Communication, 5(1), 23–56. https://doi.org/10.1177/0741088388005001002.
Schmid, H., & Laws, F. (2008). Estimation of conditional probabilities with decision trees and an application to fine-grained POS tagging. In Proceedings of the 22nd international conference on computational linguistics.
Schnotz, W., & Bannert, M. (2003). Construction and interference in learning from multiple representation. Learning and Instruction, 13(2), 141–156. https://doi.org/10.1016/S0959-4752(02)00017-8.
Shute, V. J. (2008). Focus on formative feedback. Review of Educational Research, 78(1), 153–189. https://doi.org/10.3102/0034654307313795.
Sung, Y. T., Liao, C. N., Chang, T. H., Chen, C. L., & Chang, K. E. (2015). The effect of online summary assessment and feedback system on the summary writing of 6th graders: The LSA-based technique. Computers & Education. https://doi.org/10.1016/j.compedu.2015.12.003.
Van Valin, R. D. (2001). An introduction to syntax. New York: Cambridge University Press.
Wäschle, K., Lachner, A., Stucke, B., Rey, S., Frömmel, C., & Nückles, M. (2014). Effects of visual feedback on medical students’ procrastination within web-based planning and reflection protocols. Computers in Human Behavior, 41, 120–136. https://doi.org/10.1016/j.chb.2014.09.022.
Wittwer, J., & Ihme, N. (2014). Reading skill moderates the impact of semantic similarity and causal specificity on the coherence of explanations. Discourse Processes, 51(1–2), 143–166. https://doi.org/10.1080/0163853X.2013.855577.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical statement
All procedures performed in this study were in accordance with the 1964 Helsinki declaration, and the German Psychological Society’s (DGPS) ethical guidelines. According to the DGPS guidelines, experimental studies only need approval from an institutional review board if participants are exposed to risks that are related to high emotional or physical stress or when participants are not informed about the goals and procedures included in the study. As none of these conditions applied to the current study, we did not seek approval from an institutional review board.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Additional information
We would like to thank Christian Burkhart for programming the concept map feedback tool, and Nathanael Kautz and Tim Steininger for helping us collect and code the data.
Cite this article
Lachner, A., Backfisch, I. & Nückles, M. Does the accuracy matter? Accurate concept map feedback helps students improve the cohesion of their explanations. Education Tech Research Dev 66, 1051–1067 (2018). https://doi.org/10.1007/s11423-018-9571-4