Educational Technology Research and Development, Volume 64, Issue 5, pp 881–901

Ethical oversight of student data in learning analytics: a typology derived from a cross-continental, cross-institutional perspective

Development Article

Abstract

The growth of learning analytics as a means to improve student learning outcomes has led to student data being collected, analyzed, and applied in previously unforeseen ways. As the use of this data continues to shape academic and support interventions, there is an increasing need for ethical reflection on how learning analytics research is approved and overseen. Although there are clear processes for vetting studies that result in the publication of data gathered from students, there is little comparable oversight of internally generated, student-focused research. Ethical concerns about the collection and harvesting of student data are increasingly being raised, yet there is no clear indication of how these concerns should be addressed or overseen. In addition, staff members who are not researchers by training may be less familiar with approval processes and with requirements such as assessing the potential for harm. If current trends point to a range of individuals harvesting and analyzing student data, mostly without students’ informed consent or knowledge, how can the real danger of unethical behavior be curbed to mitigate the risk of unintended consequences? A systematic appraisal of the policy frameworks and processes of ethical review at three research institutions (namely, the University of South Africa, the Open University in the United Kingdom, and Indiana University in the United States) provides an opportunity to compare practices, values, and priorities. From this cross-institutional review, a working typology of ethical approaches is suggested, with the aim of determining the moral intersection of internal student data usage and application.

Keywords

Ethics · Learning analytics · Application · Ethical typology · Research protections · Review boards

Copyright information

© Association for Educational Communications and Technology 2016

Authors and Affiliations

  • James E. Willis III, Indiana University, Bloomington, USA
  • Sharon Slade, The Open University, Milton Keynes, UK
  • Paul Prinsloo, University of South Africa, Pretoria, South Africa