Development and Validation of the Assessment for Learning Experience Inventory (AFLEI) in Chinese Higher Education

Abstract

While assessment for learning (AfL) has gained international prominence and is now promoted by a growing number of education systems, existing instruments designed to measure students’ assessment for learning experience have methodological shortcomings: some lack construct validity or show low internal consistency, while for others the dimensionality of the AfL constructs captured in the questionnaire has not been structurally confirmed. This paper presents the development and validation of a psychometrically robust measure of assessment for learning experience in Chinese higher education, the Assessment for Learning Experience Inventory (AFLEI). Two independent samples of 201 and 163 higher education students responded to the AFLEI, and their data were subjected to exploratory factor analyses (EFAs) and confirmatory factor analyses (CFAs), respectively. Results from both the EFAs and CFAs supported a five-factor AfL experience inventory with a strong psychometric basis. The five clusters of AfL experience perceived by the Chinese university students are Teacher formal feedback and support, Interactive dialog and peer collaboration, Learning-oriented assessment, Active engagement with subject matter, and Students taking responsibility for their learning. The correlations between the five clusters of AfL experience and the ‘deep learning approach’ scale of the Revised Two-Factor Study Process Questionnaire (R-SPQ-2F) supported the concurrent validity of the AFLEI. The AFLEI can be used both as an evaluation tool to gauge the extent to which university students experience AfL practices in the university curriculum, and as a research tool to explore more deeply the relationships between AfL experience and student learning.
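Internal consistency, one of the psychometric criteria the abstract raises, is conventionally summarized with Cronbach's alpha. A minimal sketch in Python with NumPy (the item matrix below is simulated for illustration only, not AFLEI data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 5-point Likert responses for a 4-item scale (illustrative only)
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))               # shared trait across items
noise = rng.normal(scale=0.8, size=(200, 4))     # item-specific error
items = np.clip(np.round(3 + latent + noise), 1, 5)
alpha = cronbach_alpha(items)
```

A value around .70 or higher is commonly treated as acceptable for research scales; alpha rises both with inter-item correlation and with the number of items, so it should be read alongside, not instead of, factor-analytic evidence.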


Notes

  1. All fit indices were robust indices based on the MLR estimator.

References

  1. Anderson, J. C., & Gerbing, D. W. (1988). Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin, 103(3), 411–423.

  2. Assessment Reform Group. (2002). Assessment for learning. https://arrts.gtcni.org.uk/gtcni/handle/2428/4617

  3. Bandalos, D. L., & Finney, S. J. (2010). Factor analysis: Exploratory and confirmatory. In G. R. Hancock & R. O. Mueller (Eds.), The reviewer's guide to quantitative methods in the social sciences (pp. 93-114). New York: Routledge.

  4. Bennett, R. E. (2011). Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1), 5–25.

  5. Biggs, J. B. (1987). Student approaches to learning and studying. Hawthorn, Victoria: Australian Council for Educational Research.

  6. Biggs, J., Kember, D., & Leung, D. Y. (2001). The revised two-factor study process questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71(1), 133–149.

  7. Birenbaum, M. (2007). Assessment and instruction preferences and their relationship with test anxiety and learning strategies. Higher Education, 53(6), 749–768.

  8. Black, P. (2015). Formative assessment – an optimistic but incomplete vision. Assessment in Education: Principles, Policy & Practice, 22(1), 161–177.

  9. Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74.

  10. Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2002). Working inside the black box: Assessment for learning in the classroom. London: King’s College London School of Education.

  11. Black, P., Harrison, C., Lee, C., Marshall, B., & Wiliam, D. (2004). Working inside the black box: Assessment for learning in the classroom. Phi Delta Kappan, 86, 8–21.

  12. Brown, G. T., & Wang, Z. (2013). Illustrating assessment: How Hong Kong university students conceive of the purposes of assessment. Studies in Higher Education, 38(7), 1037–1057.

  13. Brown, G. T., & Wang, Z. (2016). Understanding Chinese university student conceptions of assessment: Cultural similarities and jurisdictional differences between Hong Kong and China. Social Psychology of Education, 19(1), 151–173.

  14. Brown, G. T. L., Kennedy, K. J., Fok, P. K., Chan, J. K. S., & Yu, W. M. (2009). Assessment for student improvement: Understanding Hong Kong teachers’ conceptions and practices of assessment. Assessment in Education: Principles, Policy & Practice, 16(3), 347–363.

  15. Brislin, R. W. (1970). Back-translation for cross-cultural research. Journal of Cross-Cultural Psychology, 1(3), 185–216.

  16. Buhagiar, M. A. (2007). Classroom assessment within the alternative assessment paradigm: Revisiting the territory. The Curriculum Journal, 18(1), 39–56.

  17. Carless, D. (2007). Learning-oriented assessment: Conceptual bases and practical implications. Innovations in Education and Teaching International, 44(1), 57–66.

  18. Carless, D. (2017). Scaling up assessment for learning: Progress and prospects. In D. Carless, S. M. Bridges, C. K. Y. Chan, & R. Glofcheski (Eds.), Scaling up assessment for learning in higher education (pp. 3–17). Singapore: Springer.

  19. Chen, J., & Brown, G. T. (2013). High-stakes examination preparation that controls teaching: Chinese prospective teachers’ conceptions of excellent teaching and assessment. Journal of Education for Teaching, 39(5), 541–556.

  20. Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences (2nd ed.). Hillsdale, New Jersey: L. Erlbaum.

  21. Davison, C., & Leung, C. (2009). Current issues in English language teacher-based assessment. TESOL Quarterly, 43, 393–415.

  22. DeVellis, R. F. (2003). Scale development: Theory and applications (2nd ed.). Thousand Oaks, CA: Sage Publications.

  23. Entwistle, N., & Tait, H. (1990). Approaches to learning, evaluations of teaching, and preferences for contrasting academic environments. Higher Education, 19(2), 169–194.

  24. George, D., & Mallery, P. (2011). SPSS for Windows step by step: A simple guide and reference. 11.0 update (4th ed.). Boston: Allyn & Bacon.

  25. Gibbs, G., & Simpson, C. (2004). Measuring the response of students to assessment: the Assessment Experience Questionnaire. In C. Rust (Ed.), Improving student learning: Theory, research and scholarship (pp. 171–185). Oxford: OCSLD.

  26. Gijbels, D., & Dochy, F. (2006). Students’ assessment preferences and approaches to learning: Can formative assessment make a difference? Educational Studies, 32(4), 399–409.

  27. Gipps, C. (2002). Sociocultural perspectives on assessment. In G. Wells & G. Claxton (Eds.), Learning for life in the 21st Century: Sociocultural perspectives on the future of education (pp. 73–83). Oxford, UK: Blackwell Publishers.

  28. Hair, J. F. J., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2010). Multivariate data analysis. New Jersey: Prentice-Hall.

  29. Hargreaves, E. (2013). Inquiring into children’s experiences of teacher feedback: Reconceptualising assessment for learning. Oxford Review of Education, 39(2), 229–246.

  30. Harrell, F. E. (2001). Regression modeling strategies: With applications to linear models, logistic regression, and survival analysis. New York: Springer.

  31. Hernández, R. (2012). Does continuous assessment in higher education support student learning? Higher Education, 64(4), 489–502.

  32. Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55.

  33. James, M., Black, P., Carmichael, P., Drummond, M. J., Fox, A., … & McCormick, R. (2007). Improving learning how to learn: Classrooms, schools and networks. London: Routledge.

  34. Kaiser, H. F. (1960). The application of electronic computers to factor analysis. Educational and Psychological Measurement, 20(1), 141–151.

  35. Klenowski, V. (2009). Assessment for learning revisited: An Asia-Pacific perspective. Assessment in Education: Principles, Policy & Practice, 16(3), 263–268.

  36. Kline, R. B. (2015). Principles and practice of structural equation modeling. New York: Guilford Publications.

  37. Krishnan, V. (2011). A comparison of principal components analysis and factor analysis for uncovering the early development instrument (EDI) domains. Unpublished manuscript, Early Child Development Mapping (ECMap) Project, Alberta, University of Alberta, Edmonton, Canada.

  38. Li, H. (2016). How is formative assessment related to students’ reading achievement? Findings from PISA 2009. Assessment in Education: Principles, Policy & Practice, 23(4), 473–494.

  39. Little, R. J. A., & Rubin, D. B. (2002). Statistical Analysis with Missing Data (2nd ed.). New York: Wiley.

  40. McDowell, L., Wakelin, D., Montgomery, C., & King, S. (2011). Does assessment for learning make a difference? The development of a questionnaire to explore the student response. Assessment & Evaluation in Higher Education, 36(7), 749–765.

  41. Muthén, L., & Muthén, B. (2010). Mplus User's Guide. Los Angeles, CA: Muthén & Muthén.

  42. Muthén, L., & Muthén, B. (2015). Mplus statistical modeling software: Release 7.4. Los Angeles, CA: Muthén & Muthén.

  43. OECD. (2011). Strong performers and successful reformers in education: Lessons from PISA for the United States. Paris: OECD.

  44. Osborne, J. W., & Costello, A. B. (2009). Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pan-Pacific Management Review, 12(2), 131–146.

  45. Pat-El, R. J., Tillema, H., Segers, M., & Vedder, P. (2013). Validation of assessment for learning questionnaires for teachers and students. British Journal of Educational Psychology, 83(1), 98–113.

  46. Popham, W. J. (2008). Transformative assessment. Alexandria, VA: Association for Supervision and Curriculum Development.

  47. Raubenheimer, J. (2004). An item selection procedure to maximize scale reliability and validity. SA Journal of Industrial Psychology, 30(4), 59–64.

  48. Revelle, W. (2017). psych: Procedures for psychological, psychometric, and personality research (p. 165). Evanston, Illinois: Northwestern University.

  49. Sadler, D. R. (1998). Formative assessment: Revisiting the territory. Assessment in Education: Principles, Policy & Practice, 5(1), 77–84.

  50. Segers, M., Gijbels, D., & Thurlings, M. (2008). The relationship between students’ perceptions of portfolio assessment practice and their approaches to learning. Educational Studies, 34(1), 35–44.

  51. Stiggins, R. (2005). From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan, 87(4), 324–328.

  52. Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about new modes of assessment in higher education: A review. Assessment & Evaluation in Higher Education, 30(4), 331–347.

  53. Swaffield, S. (2011). Getting to the heart of authentic assessment for learning. Assessment in Education: Principles, Policy & Practice, 18(4), 433–449.

  54. Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Boston, MA: Pearson.

  55. Torrance, H. (1995). Evaluating authentic assessment: Issues, problems and future possibilities. Buckingham: Open University Press.

  56. Xie, Q. (2014). Validating the revised two-factor study process questionnaire among Chinese university students. The International Journal of Educational and Psychological Assessment, 16(2), 4–21.

  57. Zhang, Z., & Burry-Stock, J. A. (2003). Classroom assessment practices and teachers' self-perceived assessment skills. Applied Measurement in Education, 16(4), 323–342.

Acknowledgements

This work was supported by the University of Macau under Grant MYRG 2016–00141-FED. The authors would like to thank all the research assistants for their help in coordinating the collection of data and administering the questionnaire.

Author information

Corresponding author

Correspondence to Zhengdong Gan.

Ethics declarations

Conflict of interest

The authors declare no potential conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1: Chinese Translation of the Questionnaire Items

Appendix 2: Scree Plot and Parallel Analysis

See Fig. 3.
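The retention decision that the scree plot and parallel analysis support can be sketched in code: Horn's parallel analysis keeps a factor only when its observed eigenvalue exceeds the mean eigenvalue obtained from random data of the same dimensions. A minimal NumPy illustration with simulated two-factor data (illustrative, not the article's dataset):

```python
import numpy as np

def parallel_analysis(data, n_sims=100, seed=0):
    """Horn's parallel analysis: count eigenvalues of the observed
    correlation matrix that exceed the mean eigenvalues from random
    normal data of the same (n, k) shape."""
    n, k = data.shape
    rng = np.random.default_rng(seed)
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]  # descending
    rand_mean = np.zeros(k)
    for _ in range(n_sims):
        sim = rng.normal(size=(n, k))
        rand_mean += np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    rand_mean /= n_sims
    return int((obs > rand_mean).sum())

# Simulated data: six items driven by two latent factors (three items each)
rng = np.random.default_rng(42)
factors = rng.normal(size=(300, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0], [0, 1], [0, 1], [0, 1]], float)
data = factors @ loadings.T + 0.5 * rng.normal(size=(300, 6))
n_factors = parallel_analysis(data)
```

Unlike the Kaiser eigenvalue-greater-than-one rule, the random-data benchmark adapts to sample size and the number of items, which is why parallel analysis tends to retain fewer, more defensible factors.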

About this article

Cite this article

Gan, Z., He, J. & Mu, K. Development and Validation of the Assessment for Learning Experience Inventory (AFLEI) in Chinese Higher Education. Asia-Pacific Edu Res 28, 371–385 (2019). https://doi.org/10.1007/s40299-019-00435-7

Keywords

  • Scale development
  • Scale validation
  • Assessment for learning
  • Higher education