Abstract
In this introductory chapter we provide the context for this edited volume, describe recent research interest in developing collaborative assessments around the world, and synthesize major research results from the literature of different fields. The purpose of this edited volume was to bring together researchers from diverse disciplines—educational psychology, organizational psychology, learning sciences, assessment design, communications, human-computer interaction, computer science, engineering and applied science, psychometrics—who shared a research interest in examining learners and workers engaged in collaborative activity. This chapter concludes with an emphasis on how each chapter contributes to the research agenda around the measurement research questions, from how to define the constructs to how to model the data from collaborative interactions.
This work was conducted while Alina A. von Davier was employed with Educational Testing Service.
Notes
1. The working meeting is described at http://www.cvent.com/events/innovative-assessment-of-collaboration-two-day-working-meeting/custom-19-4110888121994d93bccb78007a50ebc8.aspx.
References
Amazon Mechanical Turk Requester Tour. (n.d.). Retrieved from https://requester.mturk.com/tour
Bartram, D. (2013). Scalar equivalence of OPQ32: Big five profiles of 31 countries. Journal of Cross-Cultural Psychology, 44, 61–83.
Burrus, J., Elliott, D., Brenneman, M., Markle, R., Carney, L., Moore, G. … Roberts, R. D. (2013). Putting and keeping students on track: Toward a comprehensive model of college persistence and attainment (Research Report 13–14). Princeton, NJ: Educational Testing Service.
Care, E., & Griffin, P. (2014). An approach to assessment of collaborative problem solving. Research and Practice in Technology Enhanced Learning, 9(3), 367–388.
Casner-Lotto, J., & Barrington, L. (2006). Are they really ready to work? Employers’ perspectives on the basic knowledge and applied skills of new entrants to the 21st century U.S. workforce. ERIC Number: ED519465, ISBN-0-8237-0888-8. Washington, DC: Partnership for 21st Century Skills. Retrieved from http://www.p21.org/storage/documents/FINAL_REPORT_PDF09-29-06.pdf
Connelly, B. S., & Ones, D. S. (2010). An other perspective on personality: Meta-analytic integration of observers’ accuracy and predictive validity. Psychological Bulletin, 136, 1092–1122.
Deming, D. J. (2015). The growing importance of social skills in the labor market (Working Paper 21473). National Bureau of Economic Research. Retrieved from http://www.nber.org/papers/w21473
Drasgow, F., Stark, S., Chernyshenko, O. S., Nye, C. D., Hulin, C. L., & White, L. A. (2012). Development of the tailored adaptive personality assessment system (TAPAS) to support Army selection and classification decisions (Technical Report 1311). Fort Belvoir, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.
Duhigg, C. (2016, February 28). What Google learned in trying to build the perfect team. New York Times Magazine, MM20.
Ferschke, O. (2016). DiscourseDB core wiki. https://github.com/DiscourseDB/discoursedb-core.wiki.git
Graesser, A. C., Wiemer-Hastings, K., Wiemer-Hastings, P., Kreuz, R., & the Tutoring Research Group. (1999). AutoTutor: A simulation of a human tutor. Journal of Cognitive Systems Research, 1, 35–51.
Greiff, S., & Kyllonen, P. C. (2016). Contemporary assessment challenges: The measurement of 21st century skills (Guest Editors’ Introduction). Applied Measurement in Education, 29(4), 243–244.
Griffin, P., & Care, E. (Eds.). (2015). Assessment and teaching of 21st century skills: Methods and approach. Dordrecht, the Netherlands: Springer.
John, O. P. (1990). The “Big Five” factor taxonomy: Dimensions of personality in the natural language and in questionnaires. In L. Pervin (Ed.), Handbook of personality: Theory and research (pp. 66–100). New York, NY: Guilford Press.
John, O. P., Naumann, L. P., & Soto, C. J. (2008). Paradigm shift to the integrative big-five trait taxonomy: History, measurement, and conceptual issues. In O. P. John, R. W. Robins, & L. A. Pervin (Eds.), Handbook of personality: Theory and research (pp. 114–158). New York, NY: Guilford Press.
Kerr, D., & Chung, G. K. W. K. (2012). Identifying key features of student performance in educational video games and simulations through cluster analysis. Journal of Educational Data Mining, 4(1), 144–182.
Kinect® for Windows. (2016). Meet Kinect for Windows. https://developer.microsoft.com/en-us/windows/kinect. Microsoft.
King, G., Murray, C. J. L., Salomon, J. A., & Tandon, A. (2004). Enhancing the validity and cross-cultural comparability of measurement in survey research. American Political Science Review, 98(1), 191–207.
Korn Ferry International. (2014–2016). Leadership architect technical manual (Item number 82277). Minneapolis, MN: Author. http://static.kornferry.com/media/sidebar_downloads/KFLA_Technical_Manual.pdf
Kyllonen, P. C., & Bertling, J. P. (2014). Innovative questionnaire assessment methods to increase cross-country comparability. In L. Rutkowski, M. von Davier, & D. Rutkowski (Eds.), Handbook of international large-scale assessment: Background, technical issues, and methods of data analysis (pp. 277–285). Boca Raton, FL: CRC Press.
Lovett, B. J., & Lewandowski, L. J. (2015). Testing accommodations for students with disabilities: Research-based practices. Washington, DC: American Psychological Association.
Mason, W., & Suri, S. (2012). Conducting behavioral research on Amazon’s Mechanical Turk. Behavior Research Methods, 44(1), 1–23.
Morgan, B., Keshtkar, F., Graesser, A., & Shaffer, D. W. (2013). Automating the mentor in a serious game: A discourse analysis using finite state machines. In C. Stephanidis (Ed.), Proceedings of the 15th international conference on human-computer interaction (HCI international) (pp. 591–595). Berlin, Germany: Springer.
Motowidlo, S. J., Dunnette, M. D., & Carter, G. W. (1990). An alternative selection procedure: The low-fidelity simulation. Journal of Applied Psychology, 75, 640–647.
National Association of Colleges and Employers. (2014). The skills/qualities employers want in new college graduate hires. Retrieved from http://www.naceweb.org/about-us/press/class-2015-skills-qualities-employers-want.aspx
National Center for Education Statistics. (2014, September 29). NAEP innovations symposium: Collaborative problem solving. Washington, DC: Author.
National Research Council. (2011). Assessing 21st century skills: Summary of a workshop (J. A. Koenig, Rapporteur). Committee on the Assessment of 21st Century Skills. Board on Testing and Assessment, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century (J. W. Pellegrino & M. L. Hilton, Eds.). Committee on Defining Deeper Learning and 21st Century Skills. Board on Testing and Assessment and Board on Science Education, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
National Research Council. (2015). Enhancing the effectiveness of team science (N. J. Cooke & M. L. Hilton, Eds.). Committee on the Science of Team Science. Board on Behavioral, Cognitive, and Sensory Sciences, Division of Behavioral and Social Sciences and Education. Washington, DC: The National Academies Press.
Oh, I.-S., Wang, G., & Mount, M. K. (2011). Validity of observer ratings of the five-factor model of personality traits: A meta-analysis. Journal of Applied Psychology, 96(4), 762–773.
Organisation for Economic Co-operation and Development. (2013). PISA 2015: Draft collaborative problem solving framework. Paris, France: Author. Retrieved from https://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf
Paunonen, S. V., & Ashton, M. C. (2001). Big five factors and facets and the prediction of behavior. Journal of Personality and Social Psychology, 81(3), 524–539.
Salgado, J. F., & Tauriz, G. (2014). The five-factor model, forced-choice personality inventories and performance: A comprehensive meta-analysis of academic and occupational validity studies. European Journal of Work and Organizational Psychology, 23(1), 3–30. doi:10.1080/1359432X.2012.716198
von Davier, A. A. (2015, July). Virtual and collaborative assessments: Examples, implications, and challenges for educational measurement. Invited talk at the Workshop on Machine Learning for Education, International Conference of Machine Learning, Lille, France. Retrieved from http://dsp.rice.edu/ML4Ed_ICML2015
von Davier, A. A. (in press). Computational psychometrics in support of collaborative assessments. In A. A. von Davier (Ed.). Measurement issues in collaborative learning and assessment [Special Issue]. Journal of Educational Measurement.
von Davier, A. A., & Mislevy, R. J. (in press). Design and modeling frameworks for the 21st century: Simulations and game-based assessments. In M. Faulkner-Bond & C. Wells (Eds.), Educational measurement: From foundations to future. New York, NY: Guilford.
Wang, L., MacCann, C., Zhuang, X., Liu, O. L., & Roberts, R. D. (2009). Assessing teamwork and collaboration in high school students. Canadian Journal of School Psychology, 24(2), 108–124.
Weekley, J. A., Ployhart, R. E., & Harold, C. M. (2004). Personality and situational judgment tests across applicant and incumbent settings: An examination of validity, measurement, and subgroup differences. Human Performance, 17, 433–461. doi:10.1207/s15327043hup1704_5.
Weinberger, C. J. (2014). The increasing complementarity between cognitive and social skills. The Review of Economics and Statistics, 96(5), 849–861. doi:10.1162/REST_a_00449
Whetzel, D. L., & McDaniel, M. A. (2009). Situational judgment tests: An overview of current research. Human Resource Management Review, 19, 188–202.
Zu, J., & Kyllonen, P. C. (2012, April). Item response models for multiple-choice situational judgment tests. In Situational Judgment Testing for Educational Applications. Symposium conducted at the meeting of the National Council of Measurement in Education, Vancouver, Canada.
Acknowledgements
This book was jointly sponsored by Educational Testing Service and the U.S. Army Research Institute for the Behavioral and Social Sciences. The authors thank reviewers James Carlson, Don Powers, and Meghan Brenneman for comments on an earlier version of this chapter. This work was completed while Alina A. von Davier was employed by Educational Testing Service. The opinions expressed in this chapter are those of the authors and not of Educational Testing Service or ACT.
The purpose of this edited volume was to bring together researchers from diverse disciplines—educational psychology, organizational psychology, learning sciences, assessment design, communications, human-computer interaction, computer science, engineering and applied science, psychometrics—who shared a research interest in examining learners and workers engaged in collaborative activity. The collaboration could be as a work team, as a group of students learning together, or as a team working together to solve a problem. There have been several volumes concerned with teamwork and collaboration of workers from an organizational perspective (see Salas, Reyes, & Woods, Chap. 2, Table 2.1) and some research on collaboration in education from a collaborative learning perspective (Care & Griffin, 2014; Griffin & Care, 2015). However, these two broad fields, educational and organizational social science research, have proceeded largely independently despite many shared concerns. Over the last several years some attention has been given to assessment and measurement of 21st century skills, such as teamwork and collaboration, as reflected in several National Research Council reports (2011, 2012, 2015), and special issues of the journals Applied Measurement in Education (Greiff & Kyllonen, 2016), and Journal of Educational Measurement (A. von Davier, in press).
Given the interest in collaboration and the need to address measurement issues more systematically, Educational Testing Service’s (ETS) Research and Development division provided funding for a working meeting, Innovative Collaborative Assessment, held in Washington, DC, in November 2014 (see Note 1). The Army Research Institute joined ETS to support related activities, including the preparation of this volume. We organized the working meeting and assembled this volume because of the growing awareness of the importance of collaboration in school and in the workplace, coupled with the fact that we do not yet have good methods for assessing it. There is a clear need for better assessments and better measurement models for collaboration and collaborative skills. It was our shared goal that by assembling this volume we would create synergies among experts from different disciplines, working from different assumptions and perspectives, but able to contribute to an emerging vision on assessing collaboration.
Copyright information
© 2017 Springer International Publishing Switzerland
Cite this chapter
Kyllonen, P.C., Zhu, M., von Davier, A.A. (2017). Introduction: Innovative Assessment of Collaboration. In: von Davier, A., Zhu, M., Kyllonen, P. (eds) Innovative Assessment of Collaboration. Methodology of Educational Measurement and Assessment. Springer, Cham. https://doi.org/10.1007/978-3-319-33261-1_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-33259-8
Online ISBN: 978-3-319-33261-1