Promoting Valid Assessment of Students with Disabilities and English Learners
In this chapter, we (a) review validity issues relevant to the assessment of students with disabilities (SWDs) and English learners (ELs), (b) discuss current and emerging practices in test accommodations, and (c) describe methods for evaluating the degree to which test accommodations may facilitate or hinder valid interpretations of students’ performance. We highlight actions test developers and researchers can take to make tests more accessible and to evaluate the impact of their testing procedures on SWDs and ELs. We also describe new developments in assessing students with severe cognitive disabilities. Given that these developments share the common goal of increasing fairness and accessibility in educational assessment, we conclude by discussing fairness issues in educational testing related to assessing SWDs and ELs.
Keywords: Educational testing · English learners · Students with disabilities · Test accommodations · Validity