Abstract
In this chapter, we (a) review validity issues relevant to the assessment of students with disabilities (SWDs) and English learners (ELs), (b) discuss current and emerging practices in test accommodations, and (c) describe methods for evaluating the degree to which test accommodations may facilitate or hinder valid interpretations of students’ performance. We highlight actions test developers and researchers can take to make tests more accessible and to evaluate the impact of their testing procedures on SWDs and ELs. We also describe new developments in assessing students with severe cognitive disabilities. Given that these developments share the goal of increasing fairness and accessibility in educational assessment, we also discuss fairness issues in educational testing related to assessing SWDs and ELs.
Notes
- 1.
For tests measuring English proficiency, English proficiency is seen as construct-relevant (Sireci & Faulkner-Bond, 2015). Thus, there are more accommodations available to ELs for subject area tests such as math and science, where the linguistic complexity of items administered in English is considered construct-irrelevant.
- 2.
Students with severe cognitive disabilities represent a fourth group, but their needs are typically beyond what accommodations can provide, and so alternate assessments are typically provided. We describe alternate assessments in a later section of this chapter.
- 3.
Smarter Balanced and PARCC are multistate assessment consortia in the United States, representing groups of states working together to deliver common assessments in reading and mathematics for elementary, middle, and high school students.
- 4.
Of course, understanding the available accommodations, and practicing their use, is important for students, parents, and teachers, and is likely to be beneficial.
References
Abedi, J. (2007). English language proficiency assessment in the nation: Current status and future practice. Davis, CA: University of California, Davis, School of Education.
Abedi, J., & Ewers, N. (2013). Smarter Balanced Assessment Consortium: Accommodations for English language learners and students with disabilities: A research-based decision algorithm.
American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.
Azevedo, R., & Hadwin, A. (2005). Scaffolding self-regulated learning and metacognition: Implications for the design of computer-based scaffolds. Instructional Science, 33, 367–379.
Bennett, R. E., Rock, D. A., Kaplan, B. A., & Jirele, T. (1988). Psychometric characteristics. In W. W. Willingham, M. Ragosta, R. E. Bennett, H. Braun, D. A. Rock, & D. E. Powers (Eds.), Testing handicapped people (pp. 1–15). Needham Heights, MA: Allyn and Bacon.
Cheung, G., & Rensvold, R. (2002). Evaluating goodness of fit indexes for testing measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 9, 233–245.
Christensen, L. L., Braam, M., Scullin, S., & Thurlow, M. L. (2011). 2009 state policies on assessment participation and accommodations for students with disabilities (Synthesis Report 83). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Clapper, A. T., Morse, A. B., Thompson, S. J., & Thurlow, M. L. (2005). Access assistants for state assessments: A study of state guidelines for scribes, readers, and sign language interpreters (Synthesis Report 58). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Clark, A., Kingston, N., Templin, J., & Pardos, Z. (2014). Summary of results from the fall 2013 pilot administration of the Dynamic Learning Maps™ Alternate Assessment System (Technical Report No. 14-01). Lawrence, KS: University of Kansas Center for Educational Testing and Evaluation.
Clauser, B. E., & Mazor, K. M. (1998). Using statistical procedures to identify differential item functioning test items. Educational Measurement: Issues and Practice, 17, 31–44.
Dynamic Learning Maps Consortium. (2013). Dynamic Learning Maps Essential Elements for English language arts. Lawrence, KS: University of Kansas. Retrieved from http://dynamiclearningmaps.org/sites/default/files/documents/ELA_EEs/DLM_Essential_Elements_ELA_%282013%29_v4.pdf.
Elliott, S. N., & Kettler, R. J. (2016). Item and test design considerations for students with special needs. In S. Lane, T. Haladyna, & M. Raymond (Eds.), Handbook of test development (pp. 374–391). Washington, DC: National Council on Measurement in Education.
Engelhard, G., Fincher, M., & Domaleski, C. S. (2011). Mathematics performance of students with and without disabilities under accommodated conditions using resource guides and calculators on high stakes tests. Applied Measurement in Education, 37, 281–306.
Fuchs, L. S., Fuchs, D., Eaton, S. B., Hamlett, C. L., Binkley, E., & Crouch, R. (2000, Fall). Using objective data sources to enhance teacher judgments about test accommodations. Exceptional Children, 67, 67–81.
Hambleton, R. K. (1989). Principles and selected applications of item response theory. In R. Linn (Ed.), Educational measurement (3rd ed., pp. 147–200). New York, NY: Macmillan.
Hambleton, R. K. (2006). Good practices for identifying differential item functioning. Medical Care, 44(Suppl. 3), S182–S188.
Hambleton, R. K., Swaminathan, H., & Rogers, H. J. (1991). Fundamentals of item response theory. Newbury Park, CA: Sage.
Herrera, A. W., Turner, C. D., Quenemoen, R. F., & Thurlow, M. L. (2015). NCSC’s age- and grade-appropriate assessment of student learning (NCSC Brief #6). Minneapolis, MN: University of Minnesota, National Center and State Collaborative.
Hodgson, J. R., Lazarus, S. S., Price, L., Altman, J. R., & Thurlow, M. L. (2012). Test administrators’ perspectives on the use of the read aloud accommodation on state tests for accountability (Technical Report No. 66). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.
Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1–55.
Johnstone, C. J., Thompson, S. J., Bottsford-Miller, N., & Thurlow, M. L. (2008). Universal design and multimethod approaches to item review. Educational Measurement: Issues & Practice, 27, 25–36.
Karami, H. (2012). An introduction to differential item functioning. International Journal of Educational and Psychological Assessment, 11, 56–76.
Kearns, J. F., Towles-Reeves, E., Kleinert, H. L., Kleinert, J. O., & Kleine-Kracht, M. (2011). Characteristics of and implications for students participating in alternate assessments based on alternate academic achievement standards. Journal of Special Education, 45(3), 3–14.
Kettler, R. J. (2012). Testing accommodations: Theory and research to inform practice. International Journal of Disability, Development and Education, 59(1), 53–66.
Kline, R. B. (2016). Principles and practice of structural equation modeling. New York, NY: Guilford Press.
Koenig, J. A., & Bachman, L. F. (Eds.). (2004). Keeping score for all: The effects of inclusion and accommodation policies on large-scale educational assessments. Washington, DC: National Academies Press.
Lee, A., Browder, D. M., Wakeman, S. Y., Quenemoen, R. F., & Thurlow, M. L. (2015, August). AA-AAS: How do our students learn and show what they know? (NCSC Brief #3). Minneapolis, MN: University of Minnesota, National Center and State Collaborative.
Messick, S. (1989). Validity. In R. Linn (Ed.), Educational measurement (pp. 13–103). Washington, DC: American Council on Education.
National Center and State Collaborative. (2015). NCSC assessment policies. Retrieved from www.ncscpartners.org/Media/Default/PDFs/Resources/Parents/NCSCAssessmentPolicies.pdf.
PARCC. (2017). Accessibility features and accommodations manual. Washington, DC: PARCC Assessment Consortium. Retrieved from http://avocet.pearson.com/PARCC/Home.
Pennock-Roman, M., & Rivera, C. (2011). Mean effects of test accommodations for ELLs and non-ELLs: A meta-analysis of experimental studies. Educational Measurement: Issues and Practice, 30, 10–18.
Russell, M. (2011). Digital test delivery: Empowering accessible test design to increase test validity for all students. Washington, DC: Arabella Advisors.
Sireci, S. G. (2005). Unlabeling the disabled: A perspective on flagging scores from accommodated test administrations. Educational Researcher, 34, 3–12.
Sireci, S. G., & Faulkner-Bond, M. F. (2015). Promoting validity in the assessment of English learners and other linguistic minorities. Review of Research in Education, 39, 215–252.
Sireci, S. G., & Gandara, M. F. (2016). Testing in educational and developmental settings. In F. Leong et al. (Eds.), International test commission handbook of testing and assessment (pp. 187–202). Oxford: Oxford University Press.
Sireci, S. G., Han, K. T., & Wells, C. S. (2008). Methods for evaluating the validity of test scores for English language learners. Educational Assessment, 13, 108–131.
Sireci, S. G., & Rios, J. (2013). Decisions that make a difference in detecting differential item functioning. Educational Research and Evaluation, 19, 170–187.
Sireci, S. G., Scarpati, S., & Li, S. (2005). Test accommodations for students with disabilities: An analysis of the interaction hypothesis. Review of Educational Research, 75, 457–490.
Sireci, S. G., & Wells, C. S. (2010). Evaluating the comparability of English and Spanish video accommodations for English language learners. In P. Winter (Ed.), Evaluating the comparability of scores from achievement test variations (pp. 33–68). Washington, DC: Council of Chief State School Officers.
Sireci, S. G., Wells, C., & Hu, H. (2014, April). Using internal structure validity evidence to evaluate test accommodations. Paper presented at the annual meeting of the National Council on Measurement in Education, Philadelphia.
Smarter Balanced. (2016). Usability, accessibility, and accommodations guidelines. Retrieved from https://portal.smarterbalanced.org/library/en/usability-accessibility-and-accommodations-guidelines.pdf.
Thompson, S., Blount, A., & Thurlow, M. (2002). A summary of research on the effects of test accommodations: 1999 through 2001 (Technical Report 34). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. Retrieved January 2003 from http://education.umn.edu/NCEO/OnlinePubs/Technical34.htm.
Thurlow, M. L., Elliot, J. L., & Ysseldyke, J. E. (2003). Testing students with disabilities: Practical strategies for complying with district and state requirements. Thousand Oaks, CA: Corwin Press.
Tippets, E., & Michaels, H. (1997). Factor structure invariance of accommodated and non-accommodated performance assessments. Paper presented at the meeting of the National Council on Measurement in Education, Chicago. Retrieved from http://www.tandfonline.com/.
U.S. Department of Education. (2015). US Department of Education FY2015 annual performance report and FY2017 annual performance plan. Retrieved from http://www.ed.gov/about/reports/annual/index.html.
Wells-Moreaux, S., Bechard, S., & Karvonen, M. (2015). Accessibility manual for the Dynamic Learning Maps Alternate Assessment, 2015–2016. Lawrence, KS: University of Kansas Center for Educational Testing and Evaluation.
Zumbo, B. (2007). Three generations of DIF analyses: Considering where it has been, where it is now, and where it is going. Language Assessment Quarterly, 4, 223–233.
Zumbo, B. D. (1999). A handbook on the theory and methods of differential item functioning (DIF): Logistic regression modeling as a unitary framework for binary and Likert-type (ordinal) item scores. Ottawa, Canada: Directorate of Human Resources Research and Evaluation, Department of National Defense.
© 2018 Springer International Publishing AG
Sireci, S.G., Banda, E., Wells, C.S. (2018). Promoting Valid Assessment of Students with Disabilities and English Learners. In: Elliott, S., Kettler, R., Beddow, P., Kurz, A. (eds) Handbook of Accessible Instruction and Testing Practices. Springer, Cham. https://doi.org/10.1007/978-3-319-71126-3_15
Print ISBN: 978-3-319-71125-6
Online ISBN: 978-3-319-71126-3