Statistical Discourse Analysis of Online Discussions: Informal Cognition, Social Metacognition, and Knowledge Creation

Chapter
Part of the Education Innovation Series (EDIN)

Abstract

To statistically model large data sets of sequences of knowledge processes in asynchronous online forums, we must address analytic difficulties involving the whole data set (missing data, nested data, and the tree structure of online messages), dependent variables (multiple, infrequent, discrete outcomes and similar adjacent messages), and explanatory variables (sequences, indirect effects, false positives, and robustness). Statistical discourse analysis (SDA) addresses all of these issues, as shown in an analysis of 1,330 asynchronous messages written by 17 students during a 13-week online educational technology course. The results showed how attributes at multiple levels (individual and message) affected knowledge creation processes. Men were more likely than women to theorize. Asynchronous messages created a micro-time context; opinions and asking about purpose preceded new information; and anecdotes, opinions, different opinions, elaborating ideas, and asking about purpose or information preceded theorizing. These results show how informal thinking precedes formal thinking and how social metacognition affects knowledge creation.
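
The chapter's analysis is not reproduced here, but a minimal sketch may clarify the lag-based modelling idea behind SDA. This is an assumption-laden illustration, not the authors' implementation: the file name (coded_messages.csv) and column names (message_id, parent_id, female, opinion, anecdote, ask_purpose, elaborate, theorize) are hypothetical, lag-1 predictors are built from each message's parent in the reply tree, and a single-level logit stands in for the full SDA model, which would add multilevel (message-within-person) random effects, corrections for infrequent outcomes, and false-discovery-rate control.

```python
# Minimal sketch of lag-based logistic modelling in the spirit of SDA.
# All file and column names below are hypothetical, for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

# One row per message: id, parent message id (reply-tree structure),
# author attributes, and 0/1 codes for knowledge processes.
df = pd.read_csv("coded_messages.csv")

# Build lag-1 explanatory variables by looking up each message's parent
# in the reply tree, since replies (not row order) define adjacency.
parent = df.set_index("message_id")
for col in ["opinion", "anecdote", "ask_purpose", "elaborate"]:
    df[f"lag_{col}"] = df["parent_id"].map(parent[col])

# Drop root messages, which have no parent and hence no lagged predictors.
df = df.dropna(subset=["lag_opinion", "lag_anecdote",
                       "lag_ask_purpose", "lag_elaborate"])

# Single-level logit predicting theorizing from an author attribute and
# the previous message's processes; full SDA extends this to a multilevel,
# multi-outcome model with robustness checks.
model = smf.logit(
    "theorize ~ female + lag_opinion + lag_anecdote"
    " + lag_ask_purpose + lag_elaborate",
    data=df,
).fit()
print(model.summary())
```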

Copyright information

© Springer Science+Business Media Singapore 2014

Authors and Affiliations

  1. Department of Educational Studies, Purdue University, West Lafayette, USA
  2. Odette School of Business, University of Windsor, Windsor, Canada