
Teachers’ Voices in the Decision to Discontinue a Public Examination Reform: Washback Effects and Implications for Utilizing Tests As Levers for Change

  • Hyunjin Kim
  • Talia Isaacs
Chapter

Abstract

Although a growing awareness of the social nature of assessment has led to increased interest in washback in language testing, previous research has focused on the effects of existing exams or of introducing new exams. However, if introducing an exam has the potential to produce changes in teaching and learning, withdrawing that exam may also have an impact that deserves attention. To address this gap in the washback literature, the present study examined the effects of the South Korean Ministry of Education's decision to discontinue the National English Ability Test (NEAT), which had been developed to promote curricular change in schools by introducing the assessment of productive skills into high-stakes national testing. This mixed-methods study reports on questionnaire data from 72 English teachers at middle and high schools in Seoul and on six follow-up interviews, examining how the Ministry's decision affected teachers' instructional practices and perceptions. Results showed that the abrupt withdrawal of the NEAT before it was implemented may have had unintended washback effects on the participants' perceptions, if not on their teaching practices. These findings suggest that discontinuing an assessment reform involves more than reverting to the previous state, and they highlight the need for greater teacher involvement in the design and implementation of high-stakes assessment policies in order to enhance the potential success of using tests as levers for change.


Acknowledgments

This study was conducted in partial fulfillment of the first author’s Master’s degree in TESOL/Applied Linguistics at the University of Bristol in the United Kingdom.


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Apgujeong High School, Seoul, South Korea
  2. University College London, London, UK
