Abstract
Students write tens of thousands of open-ended comments in student evaluation questionnaires, which are collected as part of institutional and national surveys. Often as part of a quality enhancement strategy, teachers analyse these comments to gain insights into student perspectives and to guide revisions of modules. While institutions have access to enormous amounts of qualitative data, to date limited efforts have been made to analyse and disseminate these data, which could be used by academics and administrative leaders to identify areas of good practice and areas needing improvement. This chapter examines several innovative uses of qualitative data with automated text analytics (i.e., natural language processing) to assess and enhance the student experience. Using four case studies from the Open University UK, we discuss the affordances and limitations of such methods. We found strong differences in the quality and quantity of student comments based upon individual and disciplinary factors.
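The chapter's case studies rely on dictionary-based text analytics applied to open-ended comments. As a purely illustrative sketch (the word lists below are invented for this example and are not the dictionaries developed at the Open University), a minimal dictionary scorer might count positive and negative vocabulary hits per comment:

```python
# Minimal sketch of a dictionary-based sentiment scorer for student
# comments, illustrating the general kind of automated text analytics
# discussed in the chapter. The word lists are hypothetical examples,
# NOT the dictionaries used in the four case studies.

POSITIVE = {"enjoyed", "helpful", "clear", "excellent", "interesting"}
NEGATIVE = {"confusing", "difficult", "unclear", "boring", "late"}

def sentiment_score(comment: str) -> int:
    """Return (#positive - #negative) dictionary hits in a comment."""
    words = [w.strip(".,!?") for w in comment.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

comments = [
    "The tutorials were clear and the tutor was helpful.",
    "Assessment guidance was confusing and feedback arrived late.",
]
for c in comments:
    print(sentiment_score(c), c)
```

In practice, such dictionaries are built and validated with domain experts, and scores are aggregated across thousands of comments rather than interpreted one comment at a time.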
References
Arbaugh, J. B. (2014). System, scholar, or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning, 30(4), 349–362. https://doi.org/10.1111/jcal.12048
Ashby, A., Richardson, J. T. E., & Woodley, A. (2011). National student feedback surveys in distance education: An investigation at the UK Open University. Open Learning: The Journal of Open, Distance and e-Learning, 26(1), 5–25. https://doi.org/10.1080/02680513.2011.538560
Boring, A., Ottoboni, K., & Stark, P. (2016). Student evaluations of teaching (mostly) do not measure teaching effectiveness. ScienceOpen Research.
Borrego, M., & Newswander, L. K. (2010). Definitions of interdisciplinary research: Toward graduate-level interdisciplinary learning outcomes. The Review of Higher Education, 34(1), 61–84. https://doi.org/10.1353/rhe.2010.0006
Clow, D., Coughlan, T., Cross, S., Edwards, C., Gaved, M., Herodotou, C., … Ullmann, T. (2019). Scholarly insight Winter 2019: A data wrangler perspective. Open University UK.
Coughlan, T., Ullmann, T. D., & Lister, K. (2017). Understanding accessibility as a process through the analysis of feedback from disabled students. Paper presented at the W4A’17 International Web for All Conference, New York. http://oro.open.ac.uk/48991/
Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5), 611–623. https://doi.org/10.1080/02602930410001689171
Gamliel, E., & Davidovitz, L. (2005). Online versus traditional teaching evaluation: Mode can matter. Assessment & Evaluation in Higher Education, 30(6), 581–592. https://doi.org/10.1080/02602930500260647
Grebennikov, L., & Shah, M. (2013). Student voice: Using qualitative feedback from students to enhance their university experience. Teaching in Higher Education, 18(6), 606–618. https://doi.org/10.1080/13562517.2013.774353
HEFCE. (2016). Review of information about learning and teaching and the student experience: Results and analysis for the 2016 pilot of the National Student Survey. HEFCE.
Yi, J., Nasukawa, T., Bunescu, R., & Niblack, W. (2003, November 19–22). Sentiment analyzer: Extracting sentiments about a given topic using natural language processing techniques. Paper presented at the Third IEEE International Conference on Data Mining.
Kember, D., & Ginns, P. (2012). Evaluating teaching and learning. Routledge.
Langan, A. M., & Harris, W. E. (2019). National student survey metrics: Where is the room for improvement? Higher Education, 78(6), 1075–1089. https://doi.org/10.1007/s10734-019-00389-1
Leong, C. K., Lee, Y. H., & Mak, W. K. (2012). Mining sentiments in SMS texts for teaching evaluation. Expert Systems with Applications, 39(3), 2584–2589. https://doi.org/10.1016/j.eswa.2011.08.113
Li, N., Marsh, V., & Rienties, B. (2016). Modeling and managing learner satisfaction: Use of learner feedback to enhance blended and online learning experience. Decision Sciences Journal of Innovative Education, 14(2), 216–242. https://doi.org/10.1111/dsji.12096
Li, N., Marsh, V., Rienties, B., & Whitelock, D. (2017). Online learning experiences of new versus continuing learners: A large scale replication study. Assessment & Evaluation in Higher Education, 42(4), 657–672. https://doi.org/10.1080/02602938.2016.1176989
McDonald, J., Moskal, A. C. M., Goodchild, A., Stein, S., & Terry, S. (2020). Advancing text-analysis to tap into the student voice: A proof-of-concept study. Assessment & Evaluation in Higher Education, 45(1), 154–164. https://doi.org/10.1080/02602938.2019.1614524
Moskal, A. C. M., Stein, S. J., & Golding, C. (2015). Can you increase teacher engagement with evaluation simply by improving the evaluation system? Assessment & Evaluation in Higher Education, 41(2), 286–300. https://doi.org/10.1080/02602938.2015.1007838
Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, F., & Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior, 76, 703–714. https://doi.org/10.1016/j.chb.2017.03.028
Open University UK. (2014). Ethical use of student data for learning analytics policy. Retrieved June 23, 2016, from http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy
Rayson, P. (2008). From key words to key semantic domains. International Journal of Corpus Linguistics, 13(4), 519–549. https://doi.org/10.1075/ijcl.13.4.06ray
Richardson, J. T. E. (2005). Instruments for obtaining student feedback: A review of the literature. Assessment & Evaluation in Higher Education, 30(4), 387–415. https://doi.org/10.1080/02602930500099193
Richardson, J. T. E. (2006). Investigating the relationship between variations in students’ perceptions of their academic environment and variations in study behaviour in distance education. British Journal of Educational Psychology, 76(4), 867–893. https://doi.org/10.1348/000709905X69690
Richardson, J. T. E. (2013). The National Student Survey and its impact on UK higher education. In M. Shah & C. S. Nair (Eds.), Enhancing student feedback and improvement systems in tertiary education (Vol. 5, pp. 76–84). Commission for Academic Accreditation.
Richardson, J. T. E., Mittelmeier, J., & Rienties, B. (2020). The role of gender, social class and ethnicity in participation and academic attainment in UK higher education: An update. Oxford Review of Education, 46(3), 346–362. https://doi.org/10.1080/03054985.2019.1702012
Richardson, J. T. E., Slater, J. B., & Wilson, J. (2007). The National Student Survey: Development, findings and implications. Studies in Higher Education, 32(5), 557–580. https://doi.org/10.1080/03075070701573757
Rienties, B. (2014). Understanding academics’ resistance towards (online) student evaluation. Assessment & Evaluation in Higher Education, 39(8), 987–1001. https://doi.org/10.1080/02602938.2014.880777
Rienties, B., Boroowa, A., Cross, S., Kubiak, C., Mayles, K., & Murphy, S. (2016). Analytics4Action Evaluation Framework: A review of evidence-based learning analytics interventions at Open University UK. Journal of Interactive Media in Education, 1(2), 1–12. https://doi.org/10.5334/jime.394
Rienties, B., & Héliot, Y. (2018). Enhancing (in)formal learning ties in interdisciplinary management courses: A quasi-experimental social network study. Studies in Higher Education, 43(3), 437–451. https://doi.org/10.1080/03075079.2016.1174986
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333–341. https://doi.org/10.1016/j.chb.2016.02.074
Shah, M. (2019). Making the student voice count: Using qualitative student feedback to enhance the student experience. Journal of Applied Research in Higher Education, ahead-of-print (ahead-of-print). https://doi.org/10.1108/JARHE-02-2019-0030
Shah, M., Nair, C. S., & Richardson, J. T. E. (2017). Chapter 8 – Accessing student voice: Using qualitative student feedback. In M. Shah, C. S. Nair, & J. T. E. Richardson (Eds.), Measuring and enhancing the student experience (pp. 91–101). Chandos Publishing.
Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C. D., Ng, A. Y., & Potts, C. (2013). Recursive deep models for semantic compositionality over a sentiment treebank. Paper presented at the 2013 Conference on Empirical Methods in Natural Language Processing, Seattle, Washington.
Ullmann, T. (2015a). Automated detection of reflection in texts: A machine learning based approach (PhD thesis). Open University UK. Retrieved from http://oro.open.ac.uk/45402/
Ullmann, T. (2015b). Keywords of written reflection – A comparison between reflective and descriptive datasets. Paper presented at the Proceedings of the 5th Workshop on Awareness and Reflection in Technology Enhanced Learning, Toledo, Spain. http://ceur-ws.org/Vol-1465/paper8.pdf
Ullmann, T. (2017). Reflective writing analytics – Empirically determined keywords of written reflection. Paper presented at the Seventh International Learning Analytics & Knowledge Conference, Vancouver, Canada. http://oro.open.ac.uk/48840/
Ullmann, T., Lay, S., Cross, S., Edwards, C., Gaved, M., Jones, E., … Rienties, B. (2018). Scholarly insight Spring 2018: A data wrangler perspective. Open University.
Ullmann, T., Wild, F., & Scott, P. (2012). Comparing automatically detected reflective texts with human judgements. Paper presented at the 2nd Workshop on Awareness and Reflection in Technology Enhanced Learning, Saarbrücken, Germany.
Wen, M., Yang, D., & Rosé, C. P. (2014). Sentiment analysis in MOOC discussion forums: What does it tell us? Paper presented at the 7th Educational Data Mining Conference.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Sage.
Zaitseva, E., Milsom, C., & Stewart, M. (2013). Connecting the dots: Using concept maps for interpreting student satisfaction. Quality in Higher Education, 19(2), 225–247. https://doi.org/10.1080/13538322.2013.802576
Acknowledgement
This chapter is in part based upon two Data Wrangler reports (Clow et al., 2019; Ullmann et al., 2018) that were published under a Creative Commons licence at the Open University. We are extremely grateful to all the Open University staff who helped to build the dictionary approaches, as well as for their kind suggestions as to which data could be useful to explore. Many thanks to the Text Analytics of Student Comments (TASC) Initiative in IET for its support of this work.
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Ullmann, T., Rienties, B. (2021). Using Text Analytics to Understand Open-Ended Student Comments at Scale: Insights from Four Case Studies. In: Shah, M., Richardson, J.T.E., Pabel, A., Oliver, B. (eds) Assessing and Enhancing Student Experience in Higher Education. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-80889-1_9
DOI: https://doi.org/10.1007/978-3-030-80889-1_9
Publisher Name: Palgrave Macmillan, Cham
Print ISBN: 978-3-030-80888-4
Online ISBN: 978-3-030-80889-1
eBook Packages: Education (R0)