Open Research and Observational Study for 21st Century Learning

  • Vivekanandan S. Kumar (corresponding author)
  • Shawn Fraser
  • David Boulanger
Conference paper
Part of the Lecture Notes in Educational Technology book series (LNET)

Abstract

Contemporary research practice unreasonably obscures formative research outcomes from public notice. Indeed, this exclusion, often unintentional, holds true even when the research is publicly funded. The public must therefore search scholarly channels, such as academic journals, for research information that is not composed for general comprehension. Essentially, a breach in information transmission separates researchers from society at large. In education, a similar communication gap exists between students and instructors: instructors rely on traditional assessment activities to measure student performance and rarely observe the corresponding study efforts. Consequently, important formative evidence goes largely unnoticed. Today, researchers are exploring smart learning processes that exploit opportunities triggered by environmental affordances, personal needs, and professional expectations, and that mitigate various assessment difficulties.

This presentation introduces Open Research in the context of Smart Learning. First, it discusses the advantages of opening the research process to an authorized public, including fellow students, educators, and policymakers; for example, it argues that greater accessibility can promote research growth and integrity. Second, it uses observational study methods to illustrate how students and educators can conduct their own experiments on continuously arriving data. This second part introduces three matching techniques (Coarsened Exact Matching, Mahalanobis Distance Matching, and Propensity Score Matching) and three metrics (the L1 vector norm, Average Mahalanobis Imbalance, and Difference in Means) for assessing the level of imbalance in matched sample datasets. It also explains key traits of observational studies that are relevant to smart learning environments, comparing them with the corresponding traits of blocked randomized experiments. Ultimately, the presentation promotes Smart Learning Environments that incorporate automated tools for the opportunistic capture, analysis, and remediation of formative study processes. Such environments can enable students to ethically share and receive study data that help them conduct personal observational studies on individual study-related questions. Finally, the presentation proposes a novel connection between Open Research and Persistent Observational Study methods, explores how open research can support adaptive and self-regulated learning, and advocates for innovative research practices that can produce better and smarter learning.
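To make the matching workflow above concrete, the following is a minimal Python sketch of one of the three techniques, Coarsened Exact Matching, paired with the Difference in Means imbalance metric. All data here are synthetic and all variable names (study hours, tool use, grades) are hypothetical illustrations, not the presentation's actual dataset; in practice one would use an established implementation such as the R MatchIt package.

```python
import random
from collections import defaultdict
from statistics import mean

random.seed(7)

# Hypothetical synthetic data: weekly study hours (covariate), use of a
# smart-learning tool (treatment), and final grade (outcome). A +5-point
# treatment effect is built in, and heavier studiers opt in more often,
# so the raw treated/control comparison is confounded.
records = []
for _ in range(400):
    hours = random.uniform(0, 20)
    treated = random.random() < 0.2 + 0.03 * hours
    grade = 50 + 2.0 * hours + (5.0 if treated else 0.0) + random.gauss(0, 5)
    records.append({"hours": hours, "treated": treated, "grade": grade})

def coarsen(hours, width=4.0):
    """CEM step 1: coarsen the continuous covariate into strata."""
    return int(hours // width)

# CEM step 2: exact-match on the coarsened value, keeping only strata
# that contain both treated and control units.
strata = defaultdict(lambda: {"t": [], "c": []})
for r in records:
    strata[coarsen(r["hours"])]["t" if r["treated"] else "c"].append(r)
matched = {k: g for k, g in strata.items() if g["t"] and g["c"]}

def weighted_gap(groups, key):
    """Difference-in-means metric on the matched sample: per-stratum
    treated-minus-control mean, weighted by treated count per stratum."""
    n_t = sum(len(g["t"]) for g in groups.values())
    gaps = [len(g["t"]) * (mean(r[key] for r in g["t"]) -
                           mean(r[key] for r in g["c"]))
            for g in groups.values()]
    return sum(gaps) / n_t

imb_before = (mean(r["hours"] for r in records if r["treated"])
              - mean(r["hours"] for r in records if not r["treated"]))
imb_after = weighted_gap(matched, "hours")
att = weighted_gap(matched, "grade")  # effect estimate on matched data

print(f"hours imbalance before matching: {imb_before:.2f}")
print(f"hours imbalance after matching:  {imb_after:.2f}")
print(f"estimated treatment effect:      {att:.2f}")
```

After matching, the covariate imbalance shrinks and the grade comparison moves toward the built-in +5 effect; the same before/after check generalizes to the L1 norm and Average Mahalanobis Imbalance metrics named above.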

Keywords

Matching in smart learning environments · Propensity score matching · Randomized experiment · Interactive analysis · Observational study · Learning analytics · Data imbalance · Persistent observational study

References

  1. Olmos, A., & Govindasamy, P. (2015). Propensity scores: A practical introduction using R. Journal of MultiDisciplinary Evaluation, 11(25), 68–88.
  2. Iacus, S. M., King, G., & Porro, G. (2012). Causal inference without balance checking: Coarsened exact matching. Political Analysis, 20(1), 1–24.
  3. King, G., Nielsen, R., Coberley, C., & Pope, J. E. (2011). Comparative effectiveness of matching methods for causal inference. Unpublished manuscript.
  4. LaLonde, R. J. (1986). Evaluating the econometric evaluations of training programs with experimental data. The American Economic Review, 76(4), 604–620.
  5. Dehejia, R. H., & Wahba, S. (1999). Causal effects in nonexperimental studies: Reevaluating the evaluation of training programs. Journal of the American Statistical Association, 94(448), 1053–1062.
  6. Dehejia, R. H., & Wahba, S. (2002). Propensity score-matching methods for nonexperimental causal studies. Review of Economics and Statistics, 84(1), 151–161.
  7. King, G., Lucas, C., & Nielsen, R. (2014). The balance-sample size frontier in matching methods for causal inference. American Journal of Political Science.
  8. King, G., & Nielsen, R. (2016). Why propensity scores should not be used for matching. Working paper.
  9. Hannan, E. L. (2008). Randomized clinical trials and observational studies: Guidelines for assessing respective strengths and limitations. JACC: Cardiovascular Interventions, 1(3), 211–217. http://dx.doi.org/10.1016/j.jcin.2008.01.008
  10. Concato, J., Shah, N., & Horwitz, R. I. (2000). Randomized, controlled trials, observational studies, and the hierarchy of research designs. The New England Journal of Medicine, 342(25), 1887–1892.
  11. Kent, W. (2011). The advantages and disadvantages of observational and randomised controlled trials in evaluating new interventions in medicine. Clinical Sciences [Internet], Version 1, 9 June 2011. Available from: https://clinicalsciences.wordpress.com/article/the-advantages-and-disadvantages-of-1blm6ty1i8a7z-8/
  12. Silverman, S. L. (2009). From randomized controlled trials to observational studies. The American Journal of Medicine, 122(2), 114–120. http://dx.doi.org/10.1016/j.amjmed.2008.09.030
  13. At Work, Issue 83, Winter 2016. Institute for Work & Health, Toronto.
  14. Sullivan, G. M. (2011). Getting off the “gold standard”: Randomized controlled trials and education research. Journal of Graduate Medical Education, 3(3), 285–289. http://doi.org/10.4300/JGME-D-11-00147.1
  15. Lindholm, M. (2015). Public commitment to research. VA Barometer 2015/16, VA Report 2015:6, Vetenskap & Allmänhet. http://v-a.se/downloads/varapport2015_6_eng.pdf
  16. Pardo, R., & Calvo, S. (2002). Attitudes toward science among the European public: A methodological analysis. Public Understanding of Science, 11, 155–195. https://www.upf.edu/pcstacademy/_docs/155.pdf
  26. Giannakos, M., Sampson, D. G., & Kidzinski, L. (2016). Introduction to smart learning analytics: Foundations and developments in video-based learning. Smart Learning Environments, 3(12). https://doi.org/10.1186/s40561-016-0034-2
  27. Gros, B. (2016). The design of smart educational environments. Smart Learning Environments, 3(15). https://doi.org/10.1186/s40561-016-0039-x
  28. Kinshuk, Chen, N. S., & Cheng, I. L. (2016). Evolution is not enough: Revolutionizing current learning environments to smart learning environments. International Journal of Artificial Intelligence in Education, 26(2), 561–581.
  30. Kumar, V. S., Fraser, S. N., & Boulanger, D. (2017). Discovering the predictive power of five baseline writing competences. Journal of Writing Analytics, 1(1). https://journals.colostate.edu/analytics/article/view/107
  31. Bartling, S., & Friesike, S. (2017). Opening Science. Springer Open. http://book.openingscience.org

Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  • Vivekanandan S. Kumar (1) (corresponding author)
  • Shawn Fraser (1)
  • David Boulanger (1)
  1. Athabasca University, Athabasca, Canada