Abstract
This study measured preservice teachers’ skills in implementing the pedagogical model of collaborative problem solving (CPS) using a novel human-to-agent computerized assessment instrument. A framework comprising three major skills in implementing CPS and four major individual problem-solving processes was first identified, and an 18-item assessment was constructed under this framework. The performance of preservice teachers in implementing CPS was then explored and discussed with the multidimensional random coefficients multinomial logit model. To establish the reliability and validity of the assessment, the expected a posteriori/plausible values (EAP/PV) reliability coefficient, model-data fit, and differential item functioning (DIF) were examined. The findings validated the hypothesized three-dimensional structure, and no DIF across gender groups was detected. Overall, the results showed that the assessment was feasible, with satisfactory reliability and validity.
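The DIF screening mentioned above can be illustrated with the standard logistic-regression likelihood-ratio approach: item correctness is regressed on a matching variable alone and then on the matching variable plus group membership, and twice the log-likelihood difference is compared against a chi-square criterion. The sketch below is a minimal, self-contained illustration on simulated data; the sample size, injected effect size, and the use of a latent-ability proxy as the matching variable are assumptions for demonstration, not the paper's actual analysis.

```python
import numpy as np

def fit_logistic(X, y, iters=50):
    """Fit a logistic regression by Newton-Raphson; return (beta, log-likelihood)."""
    X = np.column_stack([np.ones(len(y)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)                      # diagonal of the Hessian weights
        H = X.T @ (X * W[:, None])
        beta += np.linalg.solve(H, X.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    ll = np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))
    return beta, ll

rng = np.random.default_rng(0)
n = 1000
theta = rng.normal(size=n)           # latent ability, used here as the matching proxy
group = rng.integers(0, 2, size=n)   # two examinee groups (e.g., gender; illustrative)
dif = 0.8                            # uniform DIF deliberately injected for the demo
prob = 1.0 / (1.0 + np.exp(-(theta - 0.2 + dif * group)))
y = (rng.random(n) < prob).astype(float)

# Nested models: matching variable only vs. matching variable + group.
_, ll_base = fit_logistic(theta[:, None], y)
_, ll_full = fit_logistic(np.column_stack([theta, group]), y)
lr_stat = 2.0 * (ll_full - ll_base)  # ~ chi-square with 1 df under no DIF
print(f"LR statistic: {lr_stat:.2f} (critical value 3.84 at alpha = .05)")
```

Because a group effect of 0.8 logits is built into the simulated responses, the likelihood-ratio statistic comfortably exceeds the 3.84 criterion and the item is flagged; rerunning with `dif = 0.0` would typically leave it unflagged.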




Funding
This study was not funded by any grants or contracts.
Ethics declarations
Conflict of interest
The author declares that they have no conflict of interest.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
A1. Translation of the conversation in Fig. 1
Fei: It is great to visit ancient sites. I have been to Mount Tai and can provide some suggestions.
Ming: Let’s assign the tasks.
Lily: May I assign the tasks among the three of us?
Ming: I want to choose the tasks by myself.
Teacher: Everyone can tell me the tasks that you want to take charge of.
Fei: Since I have been to Mount Tai, I can take charge of the choice of scenic spots.
Ming: I can arrange the schedule and transportation. Let Lily handle the accommodation and expenses.
Lily: I have also been to Mount Tai, so I can also take charge of the choice of scenic spots.
A2. Translation of the conversation in Fig. 2
Ming: We forgot to summarize the expenses and went over budget. Fei, let’s revise it together.
Fei: Can we change the hotel?
Ming: Ok, let’s change it to one that costs less.
Teacher: Did you encounter any problems during the process of completing the task?
Ming: We encountered problems during the process of assigning tasks. Then Fei referred to the plan of Group B, and we made revisions. Every group member also checked the expenses related to his/her own tasks.
Lily: Because we forgot the teacher’s original requirements, our expenses went over budget. In order to comply with the requirements, we changed the hotel.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Lu, H., Wang, Y., Wu, Xy. et al. Developing and validating the assessment of the skills of preservice teachers in implementing pedagogical model of collaborative problem solving. Education Tech Research Dev 71, 1799–1819 (2023). https://doi.org/10.1007/s11423-023-10250-z