Abstract
This paper describes the development and psychometric evaluation of the Self-assessment Practice Scale (SaPS), an instrument for assessing the actions students take when engaged in self-assessment. Adopting a theory-driven approach, the SaPS was developed in line with the self-assessment process proposed by Yan and Brown (Assess Eval High Educ, 42(8):1247–1262, 2017). The survey was administered to a total of 2906 Hong Kong students ranging from Primary 4 to Secondary 3. Two complementary analytical approaches, factor analysis and Rasch analysis, were applied to investigate the psychometric properties of the SaPS. Exploratory factor analysis revealed a three-factor model, while confirmatory factor analysis supported both three-factor and four-factor solutions. Rasch analysis provided further evidence of the psychometric quality of the four subscales in terms of the dimensionality of the SaPS, rating scale effectiveness, and item fit statistics. The final version of the SaPS contains 20 items in four subscales assessing students’ actions in self-assessment: seeking external feedback through monitoring, seeking external feedback through inquiry, seeking internal feedback, and self-reflection.
Notes
CVR = (ne − N/2)/(N/2), where ne = the number of experts indicating “essential” and N = the total number of experts.
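Lawshe's content validity ratio above is a simple arithmetic check on expert ratings; as a minimal sketch (the function name and example panel size are illustrative, not from the paper), it can be computed as:

```python
def content_validity_ratio(n_essential, n_experts):
    """Lawshe's (1975) content validity ratio: CVR = (ne - N/2) / (N/2).

    n_essential: number of experts rating the item "essential" (ne)
    n_experts:   total number of experts on the panel (N)
    Returns a value in [-1, 1]; 0 means exactly half the panel rated
    the item "essential", 1 means the whole panel did.
    """
    half = n_experts / 2
    return (n_essential - half) / half

# Hypothetical example: 8 of 10 experts rate an item "essential"
print(content_validity_ratio(8, 10))  # 0.6
```

Items whose CVR falls below the critical value for the given panel size are candidates for removal during content validation.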
References
Adams, R. J., Wilson, M., & Wang, W. C. (1997). The multidimensional random coefficients multinomial logit model. Applied Psychological Measurement, 21(1), 1–23.
Andrich, D. (2004). Controversy and the Rasch model: A characteristic of incompatible paradigms? Medical Care, 42, 1–16.
Ashford, S. J. (1986). Feedback-seeking in individual adaptation: A resource perspective. Academy of Management Journal, 29(3), 465–487.
Ashford, S. J., & Cummings, L. L. (1983). Feedback as an individual resource: Personal strategies of creating information. Organizational Behavior and Human Performance, 32(3), 370–398.
Baars, M., Vink, S., van Gog, T., de Bruin, A., & Paas, F. (2014). Effects of training self-assessment and using assessment standards on retrospective and prospective monitoring of problem solving. Learning and Instruction, 33, 92–107.
Bond, T. G., & Fox, C. M. (2015). Applying the Rasch model: Fundamental measurement in the human sciences (3rd ed.). New York: Routledge.
Boud, D. (1995). Enhancing learning through self-assessment. London: Kogan Page.
Brown, G. T. L., & Harris, L. R. (2013). Student self-assessment. In J. H. McMillan (Ed.), The SAGE handbook of research on classroom assessment (pp. 367–393). Thousand Oaks, CA: Sage.
Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1, 245–276.
Chang, M. L., & Engelhard, G. (2016). Examining the teachers’ sense of efficacy scale at the item level with Rasch measurement model. Journal of Psychoeducational Assessment, 34(2), 177–191.
Cleary, T. J. (2006). The development and validation of the self-regulation strategy inventory-self-report. Journal of School Psychology, 44, 307–322.
Davis, D. A., Mazmanian, P. E., Fordis, M., van Harrison, R., Thorpe, K. E., & Perrier, L. (2006). Accuracy of physician self-assessment compared with observed measures of competence. Journal of the American Medical Association, 296(9), 1094–1102.
Deneen, C., Brown, G. T. L., Bond, T. G., & Shroff, R. (2013). Understanding outcome-based education changes in teacher education: Evaluation of a new instrument with preliminary findings. Asia-Pacific Journal of Teacher Education, 41, 441–456.
Hart, C. O., Mueller, C. E., Royal, K. D., & Jones, M. H. (2013). Achievement goal validation among African American high school students: CFA and Rasch results. Journal of Psychoeducational Assessment, 31(3), 284–299.
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55.
Hwang, A., & Arbaugh, J. B. (2006). Virtual and traditional feedback-seeking behaviors: Underlying competitive attitudes and consequent grade performance. Decision Sciences Journal of Innovative Education, 4(1), 1–28.
Ibabe, I., & Jauregizar, J. (2010). Online self-assessment with feedback and metacognitive knowledge. Higher Education, 59, 243–258.
Kirby, N. F., & Downs, C. T. (2007). Self-assessment and the disadvantaged student: Potential for encouraging self-regulated learning? Assessment & Evaluation in Higher Education, 32(4), 475–494.
Kline, R. B. (2010). Principles and practice of structural equation modeling (3rd ed.). New York: The Guilford Press.
Kostons, D., van Gog, T., & Paas, F. (2010). Self-assessment and task selection in learner-controlled instruction: Differences between effective and ineffective learners. Computers & Education, 54(4), 932–940.
Krasman, J. (2010). The feedback-seeking personality: Big five and feedback-seeking behavior. Journal of Leadership & Organizational Studies, 17(1), 18–32.
Lawshe, C. H. (1975). A quantitative approach to content validity. Personnel Psychology, 28, 563–575.
Lin, T. H. (2006). A comparison of model selection indices for nested latent class models. Monte Carlo Methods and Applications, 12(3), 239–259.
Linacre, J. M. (2002). Optimizing rating scale category effectiveness. Journal of Applied Measurement, 3, 85–106.
McDonald, R. P., & Ho, M. R. (2002). Principles and practice in reporting structural equation analyses. Psychological Methods, 7, 64–82.
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. The American Psychologist, 50, 741–749.
Mok, M. M. C., Cheng, Y. C., Moore, P. J., & Kennedy, K. J. (2006). The development and validation of the self-directed learning scale (SLS). Journal of Applied Measurement, 7(4), 418–449.
Newsom, J. T. (2012). Some clarifications and recommendations on fit indices. Retrieved from http://www.upa.pdx.edu/IOA/newsom/semclass/ho_fit.pdf.
Panadero, E., Brown, G. T. L., & Strijbos, J. W. (2016). The future of student self-assessment: A review of known unknowns and potential directions. Educational Psychology Review, 28, 803–830.
Panadero, E., & Romero, M. (2014). To rubric or not to rubric? The effects of self-assessment on self-regulation, performance and self-efficacy. Assessment in Education, 21(2), 133–148.
Pintrich, P. R., Smith, D. A., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor: National Center for Research to Improve Postsecondary Teaching and Learning, University of Michigan.
Primi, R., Wechsler, S. M., Nakano, T. C., Oakland, T., & Guzzo, R. S. L. (2014). Using item response theory methods with the Brazilian Temperament Scale for students. Journal of Psychoeducational Assessment, 32(7), 651–662.
Rasch, G. (1960). Probabilistic models for some intelligence and attainment tests. Copenhagen: Danish Institute for Educational Research. Expanded ed. (1980). Chicago: The University of Chicago Press.
Richardson, J. (2005). Instruments for obtaining student feedback: A review of the literature. Assessment & Evaluation in Higher Education, 30(4), 387–415.
Ryan, M. E. (2014). Reflexive writers: Rethinking writing development and assessment in schools. Assessing Writing, 22, 60–74.
Suh, H. N., Wang, K. T., & Arterberry, B. J. (2015). Development and initial validation of the self-directed learning inventory with Korean college students. Journal of Psychoeducational Assessment, 33(7), 687–697.
Swann, W. B., Jr., Pelham, B. W., & Krull, D. S. (1989). Agreeable fancy or disagreeable truth? Reconciling self-enhancement and self-verification. Journal of Personality and Social Psychology, 57(5), 782–791.
Tan, K. H. K. (2012). Student self-assessment: Assessment, learning and empowerment. Singapore: Research Publishing.
Thompson, B. (2004). Exploratory and confirmatory factor analysis: Understanding concepts and applications. Washington, DC: American Psychological Association.
Wang, W. C., Yao, G., Tsai, Y. J., Wang, J. D., & Hsieh, C. L. (2006). Validating, improving reliability, and estimating correlation of the four subscales in the WHOQOL-BREF using multidimensional Rasch analysis. Quality of Life Research, 15, 607–620.
Wilson, M. (2005). Constructing measures: An item response modeling approach. Mahwah, NJ: Erlbaum Associates.
Wu, M. L., Adams, R. J., Wilson, M. R., & Haldane, S. A. (2007). ACER ConQuest, version 2.0: Generalized item response modelling software. Camberwell: Australian Council for Educational Research.
Yan, Z. (2016a). The self-assessment practices of Hong Kong secondary students: Findings with a new instrument. Journal of Applied Measurement, 17(3), 335–353.
Yan, Z. (2016b). Student self-assessment practices: The role of gender, year level, and goal orientation. Assessment in Education. https://doi.org/10.1080/0969594X.2016.1218324.
Yan, Z., & Brown, G. T. L. (2017). A cyclical self-assessment process: Towards a model of how students engage in self-assessment. Assessment & Evaluation in Higher Education, 42(8), 1247–1262.
Acknowledgements
I am grateful to Professor Gavin T. L. Brown for his constructive comments on drafting and revising the SaPS items. Special thanks also go to Professor LEE Chi Kin John, Dr. KO Po Yuk, and the team of the project “Fostering Communities of Practice for Effective Teaching and Learning” for their assistance in data collection.
Funding
This work was supported by the General Research Fund (GRF) (Project Number: 18605715) of the Research Grants Council of Hong Kong.
Author information
Dr. Zi Yan is an associate professor at The Education University of Hong Kong, and Associate Head of the Department of Curriculum and Instruction. His research interests focus on Rasch measurement, assessment in school and higher education contexts, with an emphasis on student self-assessment and self-regulated learning.
Appendix
Self-assessment Practice Scale (SaPS)
Seeking External Feedback Through Monitoring (SEFM)
1. I check whether I have mastered the course content by doing extra exercises.
2. I check whether I have fully understood the course content by doing past exam papers.
3. I keep track of my progress by recording my performance.
4. I ask myself questions in my head to check whether I have understood the course content.
5. I check my performance against the answers in the textbook or on a website.
Seeking External Feedback Through Inquiry (SEFI)
6. I ask my teachers to give me feedback about my performance.
7. I ask my family members to give me advice on my work.
8. I ask my friends to tell me how to improve my learning.
9. I ask my fellow group members to evaluate my contributions to group work tasks.
Seeking Internal Feedback (SIF)
10. My gut feelings tell me whether my work is good or bad.
11. My emotions influence my evaluation of my learning performance.
12. How my body feels tells me how well I am doing.
13. My intuition tells me if I am doing a good job or not.
Self-reflection (SR)
14. I seek out the reasons for mistakes I made after getting back marked work.
15. I think about how much sense the comments of other people (e.g., teachers, family members, and friends) regarding my work make to me.
16. Any areas I am unsure of after finishing my work, I go over again.
17. As I study, I think about whether the way I am studying is really helping me learn.
18. When I do exercises, I look at what I got wrong or did poorly on to guide me as to what I should learn next.
19. I pay attention to my assessment results to identify what I can do better next time.
20. I reflect on my weaknesses when I discuss study-related issues with my classmates.
Cite this article
Yan, Z. The Self-assessment Practice Scale (SaPS) for Students: Development and Psychometric Studies. Asia-Pacific Edu Res 27, 123–135 (2018). https://doi.org/10.1007/s40299-018-0371-8