
Promoting Valid Assessment of Students with Disabilities and English Learners

  • Stephen G. Sireci
  • Ella Banda
  • Craig S. Wells
Chapter

Abstract

In this chapter, we (a) review validity issues relevant to the assessment of students with disabilities (SWDs) and English learners (ELs), (b) discuss current and emerging practices in test accommodations, and (c) describe methods for evaluating the degree to which test accommodations may facilitate or hinder valid interpretations of students’ performance. We highlight actions test developers and researchers can take to make tests more accessible and to evaluate the impact of their testing procedures on SWDs and ELs. We also describe new developments in assessing students with severe cognitive disabilities. Given that these developments share a common goal of increasing fairness and accessibility in educational assessment, we also discuss fairness issues in educational testing as they relate to SWDs and ELs.
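To give a concrete flavor of the evaluation methods in point (c), the sketch below shows one widely used screen: logistic-regression differential item functioning (DIF) analysis. After conditioning on a matching criterion (e.g., a total or rest score standing in for ability), it tests whether group membership, here accommodated versus non-accommodated examinees, still predicts item responses. This is a minimal, hypothetical illustration with simulated data (the group effect of 0.2 and the use of statsmodels and scipy are assumptions of the sketch, not the chapter's own analysis); operational DIF studies add criterion purification, effect-size thresholds, and replication across items and forms.

```python
# Hypothetical sketch: logistic-regression DIF screening for one item,
# comparing accommodated vs. non-accommodated examinees on simulated data.
import numpy as np
from scipy.stats import chi2
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, size=n)     # 1 = accommodated, 0 = not (hypothetical)
theta = rng.normal(0.0, 1.0, size=n)   # matching criterion, e.g., rest score
p = 1.0 / (1.0 + np.exp(-(theta - 0.2 * group)))  # toy item with uniform DIF
item = rng.binomial(1, p)

# Nested logistic models: matching criterion only vs. criterion + group
# + criterion-by-group interaction.
X0 = sm.add_constant(theta)
X1 = sm.add_constant(np.column_stack([theta, group, theta * group]))
m0 = sm.Logit(item, X0).fit(disp=0)
m1 = sm.Logit(item, X1).fit(disp=0)

# Likelihood-ratio test with df = 2; a significant result flags uniform
# and/or non-uniform DIF, to be weighed against an effect-size criterion.
lr = 2 * (m1.llf - m0.llf)
print(f"LR chi-square = {lr:.2f}, p = {chi2.sf(lr, df=2):.4f}")
```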

Keywords

Educational testing · English learners · Students with disabilities · Test accommodations · Validity


Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Stephen G. Sireci (corresponding author)
  • Ella Banda
  • Craig S. Wells

  1. University of Massachusetts Amherst, Amherst, USA
