
The development of a content analysis model for assessing students’ cognitive learning in asynchronous online discussions

  • Research Article
  • Published in: Educational Technology Research and Development

Abstract

The purpose of this study was to develop and validate a content analysis model for assessing students’ cognitive learning in asynchronous online discussions. It adopted a fully mixed methods design in which qualitative and quantitative methods were employed sequentially for data analysis and interpretation. Specifically, the design was a “sequential exploratory” (QUAL → quan) design, with priority given to the qualitative data and methods. The qualitative data were 800 online postings collected in two online courses; the quantitative data were 803 online postings from the same two courses but from different discussion topics and different weeks. During the qualitative phase, a grounded theory approach was adopted to construct a content analysis model from the qualitative data. During the quantitative phase, χ2 tests and a confirmatory factor analysis (CFA), which used online postings as cases or observations and was the first of its kind, were performed to test whether the new model fit the quantitative data.
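As a rough illustration of the kind of χ2 test described above, the sketch below compares the distribution of coded cognitive-learning categories across two sets of postings. The category counts and variable names are hypothetical assumptions for illustration only, not the study’s data; the CFA itself was run in dedicated SEM software (LISREL is cited in the references) and is not reproduced here.

```python
# Hypothetical sketch: a chi-square test of whether the distribution of coded
# categories differs between two sets of online postings. The counts below are
# invented for illustration and are not the study's data.
from scipy.stats import chi2_contingency

# Rows: two datasets of coded postings; columns: illustrative coding categories.
observed = [
    [120, 95, 60, 25],   # qualitative dataset (illustrative counts)
    [110, 100, 70, 23],  # quantitative dataset (illustrative counts)
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.3f}")
# A non-significant result would suggest the category distributions are
# consistent across the two datasets.
```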


References

  • Allen, E., & Seaman, J. (2008). Staying the course: Online education in the United States, 2008. Retrieved December 29, 2008, from http://sloan-c.org/publications/survey/pdf/staying_the_course.pdf.

  • Anderson, T. (2008). Teaching in an online learning context. In T. Anderson (Ed.), The theory and practice of online learning (2nd ed., pp. 343–365). Athabasca: Athabasca University, AU.

  • Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.

  • Anderson, J. G. (n.d.). Measurement models: Exploratory and confirmatory factor analysis. Retrieved March 28, 2008, from http://web.ics.purdue.edu/~janders1/Soc681/Measurement%20Models.ppt.

  • Armour-Thomas, E. (1986). Toward a qualitative analysis of standardized tests using an information processing model. ERIC Document Reproduction Service No. ED286933.

  • Barbera, E. (2006). Collaborative knowledge construction in highly structured virtual discussions. Quarterly Review of Distance Education, 7(1), 1–12.

  • Bloom, B. S. (1956). Taxonomy of educational objectives. Handbook I: The cognitive domain. New York: David McKay.

  • Brown, J. D. (1996). Testing in language programs. Upper Saddle River, NJ: Prentice Hall Regents.

  • Clariana, R. B., Wallace, P. E., & Godshalk, V. M. (2009). Deriving and measuring group knowledge structure from essays: The effects of anaphoric reference. Educational Technology Research and Development. Retrieved April 12, 2009, from http://www.springerlink.com/content/x4t20q1936521447/fulltext.pdf.

  • Cognitive Definition. (2010). LearningRX.com. Inc. Available from http://www.learningrx.com/cognitive-definition-faq.htm.

  • Corich, S., Kinshuk, & Hunt, L. M. (2006). Measuring critical thinking within discussion forums using a computerised content analysis tool. Retrieved November 12, 2006, from http://www.networkedlearningconference.org.uk/abstracts/pdfs/P07%20Corich.pdf.

  • Council for Higher Education Accreditation (CHEA). (2002). Student learning outcomes workshop. CHEA Chronicle, 5(2), 1–3. Retrieved November 18, 2006, from http://www.chea.org/Chronicle/vol5/no2/Chron-vol5-no2.pdf (Electronic Version).

  • DeCoster, J. (1998). Overview of factor analysis. Retrieved March 31, 2008, from http://www.stat-help.com/factor.pdf.

  • Facione, P. A., Facione, N. C., & Giancarlo, C. A. (2000). The disposition toward critical thinking: Its character, measurement, and relationship to critical thinking skill. Informal Logic, 20(1), 61–84.

  • Facione, P. A., Giancarlo, C. A., Facione, N. C., & Gainen, J. (1995). The disposition toward critical thinking. Journal of General Education, 44, 1–25.

  • Fahy, P. J. (2003). Indicators of support in online interaction. International Review of Research in Open and Distance Learning. Available from http://www.irrodl.org/content/v4.1/fahy.html.

  • Garrison, D. R. (1992). Critical thinking and self-directed learning in adult education: An analysis of responsibility and control issues. Adult Education Quarterly, 42(3), 136–148.

  • Garrison, D. R. (2003). Cognitive presence for effective asynchronous online learning: The role of reflective inquiry, self-direction and metacognition. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Practice and direction (pp. 47–58). Needham, MA: Sloan Center for Online Education.

  • Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2–3), 87–105.

  • Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23.

  • Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine.

  • Golafshani, N. (2003). Understanding reliability and validity in qualitative research. The Qualitative Report, 8(4), 597–606.

  • Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397–431.

  • Hanson, W. E., Creswell, J. W., Plano-Clark, V. L., Petska, K. S., & Creswell, J. D. (2005). Mixed methods research designs in counseling psychology. Journal of Counseling Psychology, 52, 224–235.

  • Henri, F. (1992). Computer conferencing and content analysis. In A. Kaye (Ed.), Collaborative learning through computer conferencing: The Najaden papers (pp. 117–136). London: Springer-Verlag.

  • Hu, L., & Bentler, P. M. (1999). Cut-off criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55.

  • Joreskog, K. G., & Sorbom, D. (2001). LISREL 8: Users’ reference guide. Chicago: Scientific Software International.

  • Kane, M. T. (2006). Validation. In R. L. Brennan (Ed.), Educational measurement (4th ed., pp. 17–64). Westport: American Council on Education and Praeger.

  • Marra, R. (2006). A review of research methods for assessing content of computer-mediated discussion forums. Journal of Interactive Learning Research, 17(3), 243–267.

  • Marra, R. M., Moore, J. L., & Klimczack, A. K. (2004). Content analysis of online discussion forums: A comparative analysis of protocols. Educational Technology Research and Development, 52(2), 23–40.

  • Mazur, J. (2004). Conversation analysis for educational technologists: Theoretical and methodological issues for researching the structures, processes and meaning of on-line talk. In D. H. Jonassen (Ed.), Handbook for research in educational communications and technology (2nd ed., pp. 1073–1099). Mahwah, NJ: Lawrence Erlbaum Associates.

  • McLoughlin, C., & Luca, J. (2000). Cognitive engagement and higher order thinking through computer conferencing: We know why but do we know how? Retrieved October 2, 2006, from http://lsn.curtin.edu.au/tlf/tlf2000/mcloughlin.html.

  • McPeck, J. (1981). Critical thinking and education. New York: St. Martin’s.

  • Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

  • Moallem, M. (2005). Designing and managing student assessment in online learning environment. In P. Comeaux (Ed.), Assessing online learning (pp. 18–34). Bolton, MA: Anker.

  • Moore, M. G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2). Retrieved January 24, 2005, from http://www.ajde.com/Contents/vol3_2.htm#editorial (Electronic version).

  • Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical Assessment, Research and Evaluation, 7(10). Retrieved May 31, 2007, from http://PAREonline.net/getvn.asp?v=7&n=10.

  • Newman, D. R., Webb, B., & Cochrane, C. (1996). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Retrieved October 5, 2006, from http://www.qub.ac.uk/mgt/papers/methods/contpap.html.

  • Pallant, J. (2007). SPSS survival manual (3rd ed.). Maidenhead: McGraw Hill/Open University.

  • Pandit, N. R. (1996). The creation of theory: a recent application of the grounded theory method. Qualitative Report, 2(4). http://www.nova.edu/ssss/QR/QR2-4/pandit.html.

  • Pirnay-Dummer, P., Ifenthaler, D., & Spector, J. M. (2009). Highly integrated model assessment technology and tools. Educational Technology Research and Development. Retrieved April 12, 2009, from http://www.springerlink.com/content/6005268510618175/fulltext.pdf.

  • Reeves, T. C. (2000). Alternative assessment approaches for online learning environments in higher education. Journal of Educational Computing Research, 23(1), 101–111.

  • Riffe, D., Lacy, S., & Fico, F. (1998). Analyzing media messages: Quantitative content analysis. New Jersey: Lawrence Erlbaum.

  • Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Methodological issues in the content analysis of computer conference transcripts. International Journal of Artificial Intelligence in Education, 12, 8–22.

  • Savenye, W. C. (2004). Alternatives for assessing learning in Web-based courses. Distance Learning, 1(1), 29–35.

  • Schumacher, B. K., West, D. V., & Angell, L. R. (1997, November). Assessing knowledge and cognitive skills. Paper presented at the Annual Meeting of the National Communication Association, Chicago, IL. ERIC Document Reproduction Service No. ED417429.

  • Scientific Software International. (2008). Retrieved June 12, 2008, from http://www.ssicentral.com/.

  • Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

  • Swan, K., Schenker, J., Arnold, S., & Kuo, C.-L. (2007). Shaping online discussion: Assessment matters. E-mentor, 1(18). Retrieved December 12, 2006, from http://www.e-mentor.edu.pl/_xml/wydania/18/390.pdf.

  • Swan, K., Shen, J., & Hiltz, S. R. (2006). Assessment and collaboration in online learning. Journal of Asynchronous Learning Networks, 10(1), 45–62. Retrieved November 12, 2006, from http://www.sloan-c-wiki.org/JALN/v10n1/pdf/v10n1_5swan.pdf (Electronic Version).

  • Wijekumar, K., Ferguson, L., & Wagoner, D. (2006). Problems with assessment validity and reliability in Web-based distance learning environments and solutions. Journal of Educational Multimedia and Hypermedia, 15(2), 199–215.

  • Yang, Y. C. (2008). A catalyst for teaching critical thinking in a large university class in Taiwan: Asynchronous online discussions with the facilitation of teaching assistants. Educational Technology Research and Development, 56, 241–264.

Author information

Corresponding author

Correspondence to Dazhi Yang.

Appendices

Appendix 1: Sample discussion questions

Sample discussion questions for course IMCE

1. Discussion question: Technology and learning

What do you think is the role of technology in learning? Looking back on Robyler’s chapter #1 (4th edition), frame your response in terms of method (constructivist, directed, or another method) and learner level, and cite a specific learning theorist from the chapter as a means (or example) of expressing your viewpoint.

2. Discussion question: How technology can facilitate learning

After reading the Johnson & Johnson article, what are your views on the impact of group composition on cooperative learning, and how does it relate to technology (e.g., does it differ when you use technology)? Take some time to think and formulate a position on technology and cooperative learning, tie it into the readings (past and present), and let us know what you think.

3. Discussion question: Technology planning

It appears that planning and teamwork are keys to effectively implementing technology into the classroom. How successful can an educator be at accomplishing technological integration if s/he does not have district or even team support? What are some ways that teachers can be effective at accomplishing this goal if they are ‘flying solo’? For those of you that are administrators, what other considerations do you think should be included at the school level and why? Support your points.

Sample discussion questions for course FDE

1. Discussion question: Learning theories and practical applications in online learning

Thinking ahead to your final project, or at least the general topic at this point, consider a way in which you would integrate both the behaviorist learning theory and the cognitivist learning theory for particular activities (e.g., one activity for each theory). Which strategies would you utilize when integrating them (based on the strategies in your readings associated with each learning theory)? It is also important to consider that you may not have use for both (or either) of these learning theories in your learning module; if not, why not? I’m hoping you will reflect on what each theory has to offer and have an understanding of what each looks like when implemented. Now, after you’ve done your part, look to your peers and see if you can come up with some suggestions for them as well. Since you will be working in small groups, you’ll also need someone willing to serve as the wrapper for each small group and provide the summary to the larger group.

2. Discussion question: Your theory of online learning: what’s important to you?

This week let’s consider several new questions that will help you develop an individualized theory for online learning. Start by considering the Prensky piece on page 50 of Anderson (“how people learn what”), and then the Bransford piece on page 54, Table 2-1 (“how people learn”). Are you taking a learner-centered, community-centered, knowledge-centered, or assessment-centered approach to your module (there is no right or wrong answer)? Now, let’s incorporate interaction and presence. I think that all of us have discussed presence at some point in the class, whether we labeled it as such or not. Thinking about your online learning modules, what can you do to improve presence? Or, is presence an important part of your module? What kind of presence is important to you for this project and/or in general (instructor, peer, other)?

3. Discussion question: Assessment of online discussions

Give your recommendations and offer your ideas to the following scenario. Be sure to provide a justification and rationale based on learning and instructional theories as well as course readings. Please cite appropriately for your justifications and rationale.

Dr. Melinda Smith teaches an online graduate course, Contemporary English Literature. In this online course, the major learning activity is the weekly online discussion and postings on the assigned learning materials. Melinda knows that “if you build it, they will come” doesn’t apply to most online discussions; instead, she believes that “if you don’t grade it, they won’t come”. Nonetheless, she struggles with the different rubrics available for grading the students’ online postings and with assigning the final grades.

Reflective questions: (1) What should Melinda consider when choosing or creating her grading rubrics for the students’ online postings? (2) What are the alternatives Melinda could consider for evaluating students’ learning in this online course? (3) What would you suggest the percentage of online discussion points be in relation to the final grade?

Appendix 2: Initial coding scheme with examples

Table 5

About this article

Cite this article

Yang, D., Richardson, J.C., French, B.F. et al. The development of a content analysis model for assessing students’ cognitive learning in asynchronous online discussions. Education Tech Research Dev 59, 43–70 (2011). https://doi.org/10.1007/s11423-010-9166-1
