
Adapting and Validating the Collegiate Learning Assessment to Measure Generic Academic Skills of Students in Germany: Implications for International Assessment Studies in Higher Education

Chapter in: Assessment of Learning Outcomes in Higher Education

Abstract

Starting in 2015, a German research team from the program Modeling and Measuring Competencies in Higher Education (KoKoHs), in collaboration with the US Council for Aid to Education (CAE), adapted and validated the Collegiate Learning Assessment (CLA+) for the German language and cultural context to measure generic higher-order cognitive skills of university students and graduates in Germany. In this chapter, the conceptual and methodological background, the framework of the adaptation and validation study, as well as preliminary results are presented. Finally, findings are discussed critically, and future challenges and perspectives are explored.



Notes

  1.

    The outcomes of the KoKoHs research initiative, which also formed the basis for this study, included 40 competency models and more than 100 measurement instruments. The assessments were administered to a total of more than 50,000 students at more than 220 higher education institutions throughout Germany to gather evidence of their psychometric quality (Zlatkin-Troitschanskaia et al. 2016a, b).

  2.

    Further analyses of (3) the internal test structure and (4) relations to other variables will be conducted after the first administration of the test in the field.

  3.

    The company had also been involved in the adaptation and linguistic verification of the previous version of the test, the CLA, for various countries in the Assessment of Higher Education Learning Outcomes feasibility study (Tremblay et al. 2012, p. 198; on the general approach, see also Ferrari et al. 2013).

  4.

    For example, on the adaptation of the Test of Understanding in College Economics (TUCE) and the Examen General de Egreso de la Licenciatura (EGEL) in the WiWiKom project, see Brückner et al. (2014).

  5.

    The scoring guidelines were translated by one translator only, as they would be rephrased by the test validators in Germany in line with the German conceptualization of the construct, as advised by CAE.

  6.

    For example, “Does the item represent a higher education curriculum or a higher education domain?” “In what ways are constructs likely to differ across German higher education institutions?”

  7.

    Because of the specific content and context, the cultural adaptation of the PT2 was initially forgone. A cultural adaptation of the PT2 was conducted at Faculty 06 of Mainz University in the summer semester of 2016. Further coglabs have been conducted on both the culturally adapted and the nonculturally adapted version of PT2.

  8.

    For example, the sample included one student from the domain of medicine who was particularly interested in a healthy lifestyle.

  9.

    Test coordinators avoided creating a test-like atmosphere by seating themselves at an angle to the assessee, positioning video recording devices out of sight, and maintaining a disturbance-free environment. In addition, data privacy was observed by filming only the respondents’ hands and the test documents.

  10.

    The note they were read said: “With this interview, we want to investigate how students handle information that they come across in everyday life. For this purpose, we developed a test and we now want to find out whether the tasks that we developed are suitable for use in higher education. It is therefore not the aim of this experiment to measure your expertise; the results will have no influence on your grades whatsoever. We are interested in how students handle the task, how they solve it and what thoughts cross their minds in the process. We would therefore like to ask you to say everything you are thinking out loud while working on the task, even when you have an idea and then end up dismissing it or when you seem to not understand a word! Everything you would say silently to yourself, you should please say out loud. Just imagine you are alone in the room.”

References

  • AHELO Consortium. (2011). Translation and adaptation manual/guidelines.


  • American Educational Research Association (AERA), American Psychological Association (APA), & National Council on Measurement in Education (NCME). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.


  • Arffman, I. (2013). Problems and issues in translating international educational achievement tests. Educational Measurement: Issues & Practice, 32(2), 2–14.


  • Behr, D. (2012). The team translation approach in questionnaire translation: a special form of expert collaboration. In Proceedings of the 2nd international specialist conference of the German Federal Association of Interpreters and Translators (BDÜ) 28–30 September 2012 (pp. 644–651). BDÜ, 32. Berlin: BDÜ.


  • Behr, D., & Shishido, K. (2016). The translation of measurement instruments for cross-cultural surveys. In C. Wolf, D. Joye, T. W. Smith, & Y.-C. Fu (Eds.), The SAGE handbook of survey methodology (pp. 269–287). London: Sage.


  • Braun, M. (2006). Funktionale Äquivalenz in interkulturell vergleichenden Umfragen. Mythos und Realität [Functional equivalence in comparative intercultural surveys: myth and reality.] Mannheim: ZUMA.


  • Brückner, S., & Pellegrino, J. W. (2016). Integrating the analysis of mental operations into multilevel models to validate an assessment of higher education students’ competency in business and economics. Journal of Educational Measurement, 53(3), 293–312.


  • Brückner, S., Zlatkin-Troitschanskaia, O., & Förster, M. (2014). Relevance of adaptation and validation for international comparative research on competencies in higher education – A methodological overview and example from an international comparative project within the KoKoHs research program. In F. Musekamp & G. Spöttl (Eds.), Competence in higher education and the working environment. National and international approaches for assessing engineering competence, Vocational education and training: Research and practice (Vol. 12, pp. 133–152). Frankfurt am Main: Lang.


  • Coates, H. (Ed.). (2014). Higher education learning outcomes assessment – International perspectives. Frankfurt/Main: Peter Lang.


  • Council for Aid to Education (CAE). (2013). Introducing CLA+: Fostering great critical thinkers. New York: CAE. http://cae.org/images/uploads/pdf/Introduction_to_CLA_Plus.pdf


  • Council for Aid to Education (CAE). (2015). The case for generic skills and performance assessment in the United States and international settings. New York: Council for Aid to Education. http://cae.org/images/uploads/pdf/The_Case_for_Generic_Skills_and_Performance_Assessment.pdf


  • Coyne, I. (Ed.). (2000). International Test Commission test adaptation guidelines. Accessed 11 December from: www.intestcom.org/test_adaptation

  • European Commission (EC). (2015). European qualifications framework. https://ec.europa.eu/ploteus/search/site?f%5B0%5D=im_field_entity_type%3A97

  • Ferrari, A., Wayrynen, L., Behr, D., & Zabal, A. (2013). Translation, adaptation, and verification of test and survey materials. In OECD Technical report of the survey of adult skills (PIAAC) 2013 (pp. 1–28, section 1, chapter 4). http://www.oecd.org/site/piaac/_Technical%20Report_17OCT13.pdf. Accessed 14 Jan 2017

  • Fitzgerald, R., Widdop, S., Gray, M., & Collins, D. (2011). Identifying sources of error in cross-national questionnaires: Application of an error source typology to cognitive interview data. Journal of Official Statistics, 27(4), 569–599.


  • Forster, M. (2004). Higher order thinking skills. Research Developments, 11, article 1. http://research.acer.edu.au/resdev/vol11/iss11/1

  • Förster, M., Happ, R., & Molerov, D. (2017). Using the U.S. test of financial literacy in Germany – Adaptation and validation. The Journal of Economic Education, 48(2), 123.


  • Goerman, P. L. (2006). An examination of pretesting methods for multicultural, multilingual surveys: The use of cognitive interviews to test Spanish instruments. In J. A. Harkness (Ed.), Conducting cross-national and cross-cultural surveys: Papers from the 2005 meeting of the international workshop on Comparative Survey Design and Implementation (CSDI) (ZUMA-Nachrichten Spezial 12). Mannheim: GESIS-ZUMA.


  • Hambleton, R. K. (2001). The next generation of the ITC test translation and adaptation guidelines. European Journal of Psychological Assessment, 17(3), 164–172.


  • Harkness, J. A. (2003). Questionnaire translation. In J. A. Harkness, F. J. R. van de Vijver, & P. Mohler (Eds.), Cross-cultural survey methods (pp. 35–56). Hoboken, NJ: Wiley.


  • Herl, H. E., O’Neil, H. F., Jr., Chung, G. K. W. K., Dennis, R. A., Klein, D. C. D., Schacter, J., & Baker, E. L. (1996). Measurement of learning across five areas of cognitive competency: Design of an integrated simulation approach to measurement. Year 1 report. Los Angeles: University of California.


  • Hyytinen, H., Holma, K., Toom, A., Shavelson, R. J., & Lindblom-Ylänne, S. (2014). The complex relationship between students’ critical thinking and epistemological beliefs in the context of problem solving. Frontline Learning Research, 2(5), 1–25.


  • International Test Commission (ITC). (2016). The ITC guidelines for translating and adapting tests (2nd ed.). www.InTestCom.org. Accessed 14 Jan 2017.

  • Karabenick, S. A., Woolley, M. E., Friedel, J. M., Ammon, V. B., Blazevski, J., Bonney, C. R., et al. (2007). Cognitive processing of self-report items in educational research: Do they think what we mean? Educational Psychologist, 42(3), 139–151.


  • Kaufmann, F. (2017). Validierung des Testinhalts eines Kompetenzerfassungsinstruments anhand von Expertenratings [Validation of the test content of a competency assessment instrument based on expert ratings]. Unpublished master’s thesis.


  • Leighton, J. P. (2013). Item difficulty and interviewer knowledge effects on the accuracy and consistency of examinee response processes in verbal reports. Applied Measurement in Education, 26(2), 136–157.


  • Liepmann, D., Beauducel, A., Brocke, B., & Amthauer, R. (2007). Intelligenz-Struktur-Test 2000 R [Intelligence Structure Test] (2., erweiterte und überarbeitete Aufl.). Göttingen: Hogrefe & Huber Publishers.


  • Liu, O. L., Frankel, L., & Roohr, K. C. (2014). Assessing critical thinking in higher education: Current state and directions for next-generation assessment (Research Report). Princeton: ETS.


  • Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage Publications.


  • Mohler, P., Dorer, B., de Jong, J., & Hu, M. (2016). Translation: Overview. Guidelines for best practice in cross-cultural surveys. Ann Arbor, MI: Survey Research Center, Institute for Social Research, University of Michigan.


  • Organisation for Economic Co-operation and Development (OECD). (2013). Assessment of higher education learning outcomes. AHELO feasibility study report – Volume 2. Data analysis and national experiences. Paris: OECD.


  • Organisation for Economic Co-operation and Development (OECD). (2014). Education at a glance 2014: OECD indicators. Paris: OECD Publishing. https://doi.org/10.1787/eag-2014-en


  • Organisation for Economic Co-operation and Development (OECD). (2016). Getting skills right: Sweden. Paris: OECD Publishing. https://doi.org/10.1787/9789264265479-en


  • Pellegrino, J. W., & Hilton, M. L. (Eds.). (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: The National Academies Press.


  • PISA Consortium. (2010). Translation and adaptation guidelines for PISA 2012. National Project Managers’ meeting, Budapest 2010. https://www.oecd.org/pisa/pisaproducts/49273486.pdf. Accessed 14 Jan 2017.

  • Ruiz-Primo, M. A., & Shavelson, R. J. (1996). Problems and issues in the use of concept maps in science assessment. Journal of Research in Science Teaching, 33(6), 569–600.


  • Shavelson, R. J. (2013). On an approach to testing and modeling competence. Educational Psychologist, 48(2), 73–86.


  • Shavelson, R. J., Davey, T., Ferrara, S., Holland, P., Webb, N., & Wise, L. (2015). Psychometric considerations for the next generation of performance assessment. Princeton, NJ: Educational Testing Service.


  • Snow, R. E. (1993). Construct validity and constructed-response tests. In R. E. Bennett & W. C. Ward (Eds.), Construction versus choice in cognitive measurement (pp. 45–60). Hillsdale, NJ: Erlbaum.


  • Solano-Flores, G., Wang, C., & Shade, C. (2016). International semiotics: Item difficulty and the complexity of science item illustrations in the PISA-2009 international test comparison. International Journal of Testing, 16(3), 205.


  • Solano-Flores, G., Chia, M., Shavelson, R. J., & Kurpius, A. (n.d.). CAE cognitive labs guidelines. Unpublished document by the Council for Aid to Education. New York.


  • Tremblay, K. (2013). OECD assessment of higher education learning outcomes (AHELO): Rationale, challenges and initial insights from the feasibility study. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn, & J. Fege (Eds.), Modeling and measuring competencies in higher education (pp. 113–116). Rotterdam: Sense Publishers.


  • Tremblay, K., Lalancette, D., & Roseveare, D. (2012). Assessment of higher education learning outcomes. Feasibility study report. Volume 1 – Design and implementation. OECD. http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume1.pdf

  • Wheeler, P., & Haertel, G. (1993). Resource handbook on performance assessment and measurement. Berkeley, CA: The Owl Press.


  • Willis, G. B. (2005). Cognitive interviewing. A tool for improving questionnaire design. Thousand Oaks, CA: Sage.


  • Zlatkin-Troitschanskaia, O., Förster, M., Brückner, S., & Happ, R. (2014). Insights from a German assessment of business and economics competence. In H. Coates (Ed.), Higher education learning outcomes assessment – International perspectives (pp. 175–197). Frankfurt am Main: Peter Lang.


  • Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Kuhn, C. (2015). The international state of research on measurement of competency in higher education. Studies in Higher Education, 40(3), 393–411.


  • Zlatkin-Troitschanskaia, O., Pant, H. A., Kuhn, C., Lautenbach, C., & Toepper, M. (2016a). Assessment practices in higher education and results of the German research program modeling and measuring competencies in higher education (KoKoHs). Journal Research & Practice in Assessment, 11, 46–54.


  • Zlatkin-Troitschanskaia, O., Pant, H. A., Kuhn, C., Toepper, M., & Lautenbach, C. (2016b). Messung akademisch vermittelter Kompetenzen von Studierenden und Hochschulabsolventen. Ein Überblick zum nationalen und internationalen Forschungsstand [Assessment of academic competencies of students and graduates – An overview of the national and international state of research]. Wiesbaden: Springer.


  • Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., & Brückner, S. (2017). Modeling and measuring competencies in higher education. Approaches to challenges in higher education policy and practice. Wiesbaden: Springer.



Corresponding author

Correspondence to Olga Zlatkin-Troitschanskaia.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Zlatkin-Troitschanskaia, O. et al. (2018). Adapting and Validating the Collegiate Learning Assessment to Measure Generic Academic Skills of Students in Germany: Implications for International Assessment Studies in Higher Education. In: Zlatkin-Troitschanskaia, O., Toepper, M., Pant, H., Lautenbach, C., Kuhn, C. (eds) Assessment of Learning Outcomes in Higher Education. Methodology of Educational Measurement and Assessment. Springer, Cham. https://doi.org/10.1007/978-3-319-74338-7_12


  • DOI: https://doi.org/10.1007/978-3-319-74338-7_12


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-74337-0

  • Online ISBN: 978-3-319-74338-7

  • eBook Packages: Education, Education (R0)
