
International Performance Assessment of Learning in Higher Education (iPAL): Research and Development

Chapter in: Assessment of Learning Outcomes in Higher Education

Abstract

Educators and policy makers now recognize how important and necessary it is to assess student learning outcomes (SLOs) in higher education. The question has shifted from whether such outcomes should be measured to how they should be measured. Today SLOs are typically assessed through student self-reports of learning or with multiple-choice and short-answer tests. Each of these methods has its strengths and limitations; each provides insights into the nature of teaching and learning. An alternative approach is the assessment of performance using "criterion" tasks drawn from the real-world situations for which students are being educated, both within and across academic or professional domains. The International Performance Assessment of Learning (iPAL) project, described herein, consolidates previous research and moves to the next generation of performance assessments for local, national, and international use. iPAL, a voluntary collaborative of scholars and practitioners, seeks to develop, research, and use performance assessments of college students' twenty-first-century skills (e.g., critical thinking, written communication, quantitative literacy, civic competency and engagement, intercultural perspective-taking) for both instructional improvement and accountability purposes.


Notes

  1. Colombia, Egypt, Finland, Korea, Kuwait, Mexico, Norway, the Slovak Republic, and the USA (Connecticut, Missouri, Pennsylvania).

  2. For the assessment of discipline-specific skills, many different tests and assessments exist in various countries, for example, ETS's Major Field Tests (MFTs) in the USA, Ceneval's Exámenes Generales para el Egreso de Licenciatura (EGEL) in Mexico, and KoKoHs in Germany and Austria (see the overview in Zlatkin-Troitschanskaia et al. 2016).

  3. Note that government reports in the USA, such as the Federal Aviation reports on aircraft accidents, are considered highly reliable. In other countries, however, government reports are treated with great suspicion and are not considered reliable. Hence the challenge of developing tasks that cross national boundaries.

  4. For more details on the challenges of international assessment, see also Zlatkin-Troitschanskaia et al. (2015, 2017).

References

  • Achtenhagen, F., & Winther, E. (2014). Workplace-based competence measurement: Developing innovative assessment systems for tomorrow's VET programmes. Journal of Vocational Education & Training, 66, 281–295. https://doi.org/10.1080/13636820.2014.916740

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education (AERA, APA, & NCME). (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

  • Corno, L., Cronbach, L. J., Kupermintz, H., Lohman, D. F., Mandinach, E. B., Porteus, A. W., & Talbert, J. E. (2002). Remaking the concept of aptitude: Extending the legacy of Richard E. Snow. New York: Routledge.

  • Council for Aid to Education. (2013). Introducing CLA+: Fostering great critical thinkers. New York: CAE. http://cae.org/images/uploads/pdf/Introduction_to_CLA_Plus.pdf

  • Educational Testing Service (ETS). (2017). Introducing the HEIghten outcomes assessment suite. https://www.ets.org/heighten

  • Fu, A. C., Kannan, A., Shavelson, R. J., Peterson, L., & Kurpius, A. (2016). Room for rigor: Designs and methods in informal science education evaluation. Visitor Studies, 19(1), 12–38. https://doi.org/10.1080/10645578.2016.1144025

  • Hambleton, R. K., & Zenisky, L. (2010). Translating and adapting tests for cross-cultural assessments. https://doi.org/10.1017/CBO9780511779381.004

  • Holtsch, D., Rohr-Mentele, S., Wenger, E., Eberle, F., & Shavelson, R. J. (2016). Challenges of a cross-national computer-based test adaptation. Empirical Research in Vocational Education and Training, 8(18), 1–32.

  • International Test Commission. (2005). International Test Commission guidelines for translating and adapting tests. Retrieved from http://www.intestcom.org/files/guideline_test_adaptation.pdf

  • Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.

  • Koretz, D. (2016, April 4). Measuring postsecondary competencies: Lessons from large-scale K-12 assessments. Presentation at the KoKoHs conference, Berlin.

  • Lai, E. R., & Viering, M. (2012). Assessing 21st century skills: Integrating research findings. Paper presented at the annual meeting of the National Council on Measurement in Education, Vancouver, B.C., Canada.

  • Leighton, J. P. (2017). Using think-aloud interviews and cognitive labs in educational research. Oxford: Oxford University Press.

  • Liu, O. L., Mao, L., Frankel, L., & Xu, J. (2016). Assessing critical thinking in higher education: The HEIghten™ approach and preliminary validity evidence. Assessment & Evaluation in Higher Education, 41(5), 677–694. https://doi.org/10.1080/02602938.2016.1168358

  • Marion, S. F., & Pellegrino, J. (2007). A validity framework for evaluating the technical quality of alternate assessments. Educational Measurement: Issues and Practice, 25, 47–57.

  • McClelland, D. C. (1973). Testing for competence rather than intelligence. American Psychologist, 28, 1–14.

  • OECD. (2012). Assessment of higher education learning outcomes. Feasibility study report: Volume 1 – Design and implementation. Retrieved from http://www.oecd.org/edu/skills-beyond-school/AHELOFSReportVolume1.pdf

  • OECD. (2013a). Assessment of higher education learning outcomes. AHELO feasibility study report – Volume 2. Data analysis and national experiences. Paris: OECD.

  • OECD. (2013b). The survey of adult skills: Reader's companion. OECD Publishing. https://doi.org/10.1787/9789264204027-en

  • Pellegrino, J. W., & Hilton, M. L. (2012). Education for life and work: Developing transferable knowledge and skills in the 21st century. Washington, DC: National Academies Press.

  • Pellegrino, J. W., Chudowsky, N., & Glaser, R. (Eds.). (2001). Knowing what students know: The science and design of educational assessment. Washington, DC: The National Academies Press.

  • Shavelson, R. J. (2008). Reflections on quantitative reasoning: An assessment perspective. In B. L. Madison & L. A. Steen (Eds.), Calculation vs. context: Quantitative literacy and its implications for teacher education. Washington, DC: Mathematical Association of America.

  • Shavelson, R. J. (2010). Measuring college learning responsibly: Accountability in a new era. Stanford: Stanford University Press.

  • Shavelson, R. J. (2012). Assessing business-planning competence using the collegiate learning assessment as a prototype. Empirical Research in Vocational Education and Training, 4, 77–90.

  • Shavelson, R. J. (2013a). On an approach to testing and modeling competence. Educational Psychologist, 48(2), 73–86.

  • Shavelson, R. J. (2013b). An approach to testing and modeling competencies. In S. Blömeke, O. Zlatkin-Troitschanskaia, C. Kuhn, & J. Fege (Eds.), Modeling and measuring competencies in higher education: Tasks and challenges. Boston: Sense.

  • Shavelson, R. J. (2017). Statistical significance and program effect: Rejoinder to "why assessment will never work in many business schools: A call for better utilization of pedagogical research". Journal of Management Education, 41, 1–5.

  • Shavelson, R. J., Roeser, R. W., Kupermintz, H., Lau, S., Ayala, C., Haydel, A., Schultz, S., Quihuis, G., & Gallagher, L. (2002). Richard E. Snow's remaking of the concept of aptitude and multidimensional test validity: Introduction to the special issue. Educational Assessment, 8(2), 77–100.

  • Shavelson, R. J., Davey, T., Ferrara, S., Holland, P., Webb, N., & Wise, L. (2015). Psychometric considerations for the next generation of performance assessment. Princeton: Educational Testing Service.

  • Shavelson, R. J., Domingue, B. W., Mariño, J. P., Molina-Mantilla, A., Morales, J. A., & Wiley, E. E. (2016). On the practices and challenges of measuring higher education value added: The case of Colombia. Assessment and Evaluation in Higher Education, 41(5), 695–720.

  • Shavelson, R. J., Marino, J., Zlatkin-Troitschanskaia, O., & Schmidt, S. (2017a). Reflections on the assessment of quantitative reasoning. In B. L. Madison & L. A. Steen (Eds.), Calculation vs. context: Quantitative literacy and its implications for teacher education. Washington, DC: Mathematical Association of America. (in press).

  • Shavelson, R. J., Zlatkin-Troitschanskaia, O., & Marino, J. (2017b). Performance indicators of learning in higher education institutions: Overview of the field. In E. Hazerkorn, H. Coates, & A. Cormick (Eds.), Research handbook on quality, performance and accountability in higher education. Edward Elgar. (in press).

  • Snow, R. E. (1996). Aptitude development and education. Psychology, Public Policy, and Law, 2(3/4), 536–560.

  • Solano-Flores, G., Shavelson, R. J., & Schneider, S. A. (2001). Expanding the notion of assessment shell: From task development tool to instrument for guiding the process of science assessment development. Revista Electrónica de Investigación Educativa, 3(1), 33–53.

  • Stanovich, K. E. (2009). What intelligence tests miss: The psychology of rational thought. New Haven: Yale University Press.

  • Stanovich, K. E. (2016). The comprehensive assessment of rational thinking. Educational Psychologist, 51, 1–12. https://doi.org/10.1080/00461520.2015.1125787

  • Strijbos, J., Engels, N., & Struyven, K. (2015). Criteria and standards of generic competences at bachelor degree level: A review study. Educational Research Review, 14, 18–32. https://doi.org/10.1016/j.edurev.2015.01.001

  • Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 1124–1131.

  • Wolf, R., Zahner, D., Kostoris, F., & Benjamin, R. (2014). A case study of an international performance-based assessment of critical thinking skills. New York: Council for Aid to Education.

  • Zahner, D. (2013). Reliability and validity of CLA+. http://cae.org/images/uploads/pdf/Reliability_and_Validity_of_CLA_Plus.pdf

  • Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Kuhn, C. (2015). The international state of research on measurement of competency in higher education. Studies in Higher Education, 40(3), 393–411.

  • Zlatkin-Troitschanskaia, O., Pant, H. A., Kuhn, C., Lautenbach, C., & Toepper, M. (2016). Assessment practices in higher education and results of the German research program modeling and measuring competencies in higher education (KoKoHs). Research & Practice in Assessment, 11, 46–54.

  • Zlatkin-Troitschanskaia, O., Pant, H. A., Lautenbach, C., Molerov, D., Toepper, M., & Brückner, S. (2017a). Modeling and measuring competencies in higher education: Approaches to challenges in higher education policy and practice. Wiesbaden: Springer.

  • Zlatkin-Troitschanskaia, O., Shavelson, R. J., & Pant, H. A. (2017b). Assessment of learning outcomes in higher education – International comparisons and perspectives. In C. Secolsky & B. Denison (Eds.), Handbook on measurement, assessment and evaluation in higher education (2nd ed.). New York: Routledge.

Author information

Corresponding author

Correspondence to Richard J. Shavelson.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Shavelson, R.J., Zlatkin-Troitschanskaia, O., Mariño, J.P. (2018). International Performance Assessment of Learning in Higher Education (iPAL): Research and Development. In: Zlatkin-Troitschanskaia, O., Toepper, M., Pant, H., Lautenbach, C., Kuhn, C. (eds) Assessment of Learning Outcomes in Higher Education. Methodology of Educational Measurement and Assessment. Springer, Cham. https://doi.org/10.1007/978-3-319-74338-7_10


  • DOI: https://doi.org/10.1007/978-3-319-74338-7_10

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-74337-0

  • Online ISBN: 978-3-319-74338-7

