Abstract
The field of Academic Analytics offers considerable potential to Higher Education institutions (HEIs), the academic staff who work for them and, most importantly, the students they teach. This approach to data-led decision-making is starting to have an influence and impact on what is arguably the core business of Higher Education: student learning. As well as being nascent, Learning Analytics is, potentially at least, a very broad area of inquiry and development; the field, necessarily, therefore has significant gaps. It is also just one of a large number of changes and developments that are affecting the way that Higher Education operates. These changes include the introduction of standards-based assessment and outcomes-based education, and the identification and warranting of core competencies and capabilities of university graduates. This shift is also happening at a time when the affordances of a wide variety of eLearning tools are introducing new possibilities and opportunities to the pedagogy of Higher Education in ways that are demonstrably challenging traditional approaches to teaching and learning, something Sharpe and Oliver famously refer to as the ‘trojan mouse’ (Sharpe and Oliver 2007, p. 49). This chapter considers the role that one such eLearning tool—the e-portfolio—can play in the implementation of a student-facing Learning Analytics strategy in this ambitious new approach to conceptualising, facilitating, structuring, supporting and assuring student learning achievement.
Notes
1. See for instance the 2011 Horizon Report, which identifies that Learning Analytics is ‘still in its early stages’ (Johnson et al. 2011, p. 28). The first conference devoted entirely to Learning Analytics (the Learning Analytics and Knowledge (LAK11) Conference) was held in Banff in the same year (LAK n.d.). As Ferguson points out, however, there is evidence that it has been taking place in some form since the 1970s (Ferguson 2012).
2. These include (but are not limited to): Educational Data Mining (EDM): “concerned with developing methods for exploring the unique types of data that come from educational settings, and using these methods to better understand students, and the settings which they learn in” (Ferguson 2012); Social Network Analysis (SNA): “explicitly situated within the constructivist paradigm that considers knowledge to be constructed through social negotiation […SNA allows] detailed investigations of networks made up of ‘actors’ and the relations between them” (Aviv et al. 2003; De Laat et al. 2006; Ferguson 2012); Content Analytics: “a broad heading for the variety of automated methods that can be used to examine, index and filter online media assets, with the intention of guiding learners through the ocean of potential resources available to them” (Drachsler et al. 2010; Ferguson 2012; Verbert et al. 2011).
3. For example, SNA draws on the social constructivist pedagogical theories of Dewey and Vygotsky. In contrast, Discourse Analytics draws on, as Ferguson notes, “extensive previous work in such areas as exploratory dialogue, latent semantic analysis and computer-supported argumentation” (Dawson and McWilliam 2008; Ferguson 2012).
4. It is worth noting that, within Maton’s research into cumulative knowledge, Assessment Analytics are used as part of the analytical methodology in the form of the ‘analyses of students’ work products’ (Maton 2009, p. 43).
5. For more information on building customised QuickMarks and QuickMark sets within GradeMark, see the user guide available on the iParadigms website (“Home - Guides.turnitin.com,” n.d.).
References
Avis, J. (2000). Policing the subject: Learning outcomes, managerialism and research in PCET. British Journal of Educational Studies, 48(1), 38–57. doi:10.1111/1467-8527.00132
Aviv, R., Erlich, Z., Ravid, G., & Geva, A. (2003). Network analysis of knowledge construction in asynchronous learning networks. Journal of Asynchronous Learning Networks, 7(3), 1–23.
Bach, C. (2010). Learning analytics: Targeting instruction, curricula and student support. Office of the Provost: Drexel University.
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347–364.
Biggs, J. (1999). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 18(1), 57–75.
Boud, D., & Molloy, E. (2013). Feedback in higher and professional education: understanding it and doing it well. London and New York: Routledge.
Buckley, S., Coleman, J., Davison, I., Khan, K. S., Zamora, J., Malick, S., … Sayers, J. (2009). The educational effects of portfolios on undergraduate student learning: A Best Evidence Medical Education (BEME) systematic review. BEME guide no. 11. Medical Teacher, 31(4), 282–298. doi:10.1080/01421590902889897
Campbell, J., & Oblinger, D. (2007). Academic analytics. EDUCAUSE Centre for Applied Research. Retrieved from http://connect.educause.edu/library/abstract/AcademicAnalytics/45275
Clegg, S. (2011). Cultural capital and agency: Connecting critique and curriculum in higher education. British Journal of Sociology of Education, 32(1), 93–108. doi:10.1080/01425692.2011.527723
Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Australian Learning and Teaching Council. Retrieved from http://olt.ubc.ca/learning_tools/research_1/research/
De Laat, M., Lally, V., Lipponen, L., & Simons, R. J. (2006). Analysing student engagement with learning and tutoring activities in networked learning communities: A multi-method approach. International Journal of Web Based Communities, 2(4), 394–412.
Drachsler, H., Bogers, T., Vuorikari, R., Verbert, K., Duval, E., Manouselis, N., et al. (2010). Issues and considerations regarding sharable data sets for recommender systems in technology enhanced learning. Procedia Computer Science, 1(2), 2849–2858.
Ellis, C. (2013). Broadening the scope and increasing the usefulness of learning analytics: The case for assessment analytics. British Journal of Educational Technology, 44(4), 662–664. doi:10.1111/bjet.12028
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges (technical report). UK: Knowledge Media Institute, The Open University.
Home—Guides.turnitin.com. (n.d.). Retrieved April 22, 2016, from https://guides.turnitin.com/
Hughes, J. (2008). Letting in the Trojan mouse: Using an e-portfolio system to re-think pedagogy. Retrieved from https://wlv.openrepository.com/wlv/handle/2436/47434
Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 Horizon report. Austin, Texas: The New Media Consortium.
LAK. (n.d.). First international conference on learning analytics and knowledge 2011 (conference). Retrieved from https://tekri.athabascau.ca/analytics/call-papers
Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2015). The LATUX workflow: Designing and deploying awareness tools in technology-enabled learning settings. In Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (pp. 1–10). ACM. Retrieved from http://dl.acm.org/citation.cfm?id=2723583
Martinez-Maldonado, R., Pardo, A., Mirriahi, N., Yacef, K., Kay, J., & Clayphan, A. (2016). Latux: An iterative workflow for designing, validating and deploying learning analytics visualisations. Journal of Learning Analytics, 2(3), 9–39. doi:10.18608/jla.2015.23.3
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Byers, A. H. (n.d.). Big data: The next frontier for innovation, competition, and productivity. McKinsey Global Institute. Retrieved April 22, 2012, from http://www.mckinsey.com/Insights/MGI/Research/Technology_and_Innovation/Big_data_The_next_frontier_for_innovation
Maton, K. (2009). Cumulative and segmented learning: Exploring the role of curriculum structures in knowledge-building. British Journal of Sociology of Education, 30(1), 43–57. doi:10.1080/01425690802514342
Moon, J. A. (2007). Learning journals (2nd ed.). Taylor & Francis.
Sadler, D. R. (2007). Perils in the meticulous specification of goals and assessment criteria. Assessment in Education, 14(3), 387–392. doi:10.1080/09695940701592097
Sadler, D. R. (2009a). Grade integrity and the representation of academic achievement. Studies in Higher Education, 34(7), 807–826. doi:10.1080/03075070802706553
Sadler, D. R. (2009b). Indeterminacy in the use of preset criteria for assessment and grading. Assessment & Evaluation in Higher Education, 34(2), 159–179. doi:10.1080/02602930801956059
Sadler, D. R. (2010a). Beyond feedback: Developing student capability in complex appraisal. Assessment & Evaluation in Higher Education, 35(5), 535–550. doi:10.1080/02602930903541015
Sadler, D. R. (2010b). Fidelity as a precondition for integrity in grading academic achievement. Assessment & Evaluation in Higher Education, 35(6), 727–743. doi:10.1080/02602930902977756
Sadler, D. R. (2015). Backwards assessment explanations: Implications for teaching and assessment practice. Springer International Publishing. Retrieved from http://link.springer.com/chapter/10.1007/978-3-319-10274-0_2
Sharpe, R., & Oliver, M. (2007). Designing courses for e-learning. In Rethinking pedagogy for a digital age: Designing and delivering e-learning (pp. 41–51).
Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., … Baker, R. S. J. d. (2011). Open learning analytics: An integrated and modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf
Taras, M. (2001). The use of tutor feedback and student self-assessment in summative assessment tasks: Towards transparency for students and for tutors. Assessment & Evaluation in Higher Education, 26(6), 605–614.
Van Tartwijk, J., & Driessen, E. W. (2009). Portfolios for assessment and learning: AMEE guide no. 45. Medical Teacher, 31(9), 790–801. doi:10.1080/01421590903139201
Verbert, K., Drachsler, H., Manouselis, N., Wolpers, M., Vuorikari, R., & Duval, E. (2011). Dataset-driven research for improving recommender systems for learning. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge (LAK 2011). ACM. Retrieved from http://dl.acm.org/ft_gateway.cfm?id=2090122&type=pdf
© 2017 Springer Nature Singapore Pte Ltd.
Ellis, C. (2017). The Importance of E-Portfolios for Effective Student-Facing Learning Analytics. In: Chaudhuri, T., Cabau, B. (eds) E-Portfolios in Higher Education. Springer, Singapore. https://doi.org/10.1007/978-981-10-3803-7_3
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-3802-0
Online ISBN: 978-981-10-3803-7