
Investigating Stakeholder Perspectives on Interact

  • Martin East
Chapter
Part of the Educational Linguistics book series (EDUL, volume 26)

Abstract

The recent introduction of interact marks a significant shift in how New Zealand high school students’ foreign language (FL) spoken communicative proficiency is to be assessed, and stands in stark contrast to earlier procedures. In particular, interact signals a move away from an assessment of learning model (the earlier standard, converse) towards more open-ended assessment for learning opportunities (interact). This shift has implications for the perceived comparative usefulness of the two assessments. This chapter outlines a theoretical framework to support an evaluation of the relative usefulness, or fitness for purpose, of different assessment types – Bachman and Palmer’s (1996) six qualities of test usefulness. The chapter goes on to articulate the fundamental principles informing interact in practice, as reflected in the information teachers have received, and evaluates these principles against the test usefulness framework. Finally, the chapter presents the methodology for the two-year study that sought stakeholder views (from both teachers and students) during the initial phases of the implementation of interact.

Keywords

Test taker, Assessment task, Student survey, Stakeholder perspective, Test usefulness

References

  1. Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford, England: Oxford University Press.
  2. Bachman, L. F. (2002). Some reflections on task-based language performance assessment. Language Testing, 19(4), 453–476. http://dx.doi.org/10.1191/0265532202lt240oa
  3. Bachman, L. F., & Palmer, A. (1996). Language testing in practice: Designing and developing useful language tests. Oxford, England: Oxford University Press.
  4. Bachman, L. F., & Palmer, A. (2010). Language assessment in practice: Developing language assessments and justifying their use in the real world. Oxford, England: Oxford University Press.
  5. Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. http://dx.doi.org/10.1191/1478088706qp063oa
  6. Brown, H. D., & Abeywickrama, P. (2010). Language assessment: Principles and classroom practices (2nd ed.). New York, NY: Pearson.
  7. Bryman, A. (2004a). Member validation and check. In M. Lewis-Beck, A. Bryman, & T. Liao (Eds.), Encyclopedia of social science research methods (p. 634). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412950589.n548
  8. Bryman, A. (2004b). Triangulation. In M. B. Lewis-Beck, A. Bryman, & T. Liao (Eds.), Encyclopedia of social science research methods (pp. 1143–1144). Thousand Oaks, CA: Sage. http://dx.doi.org/10.4135/9781412950589.n1031
  9. Canale, M. (1983). On some dimensions of language proficiency. In J. W. Oller, Jr. (Ed.), Issues in language testing research (pp. 333–342). Rowley, MA: Newbury House.
  10. Canale, M., & Swain, M. (1980). Theoretical bases of communicative approaches to second language teaching and testing. Applied Linguistics, 1(1), 1–47. http://dx.doi.org/10.1093/applin/i.1.1
  11. Council of Europe. (2001). Common European Framework of Reference for languages. Cambridge, England: Cambridge University Press.
  12. Denzin, N. K. (1970). The research act in sociology. Chicago, IL: Aldine.
  13. East, M. (2008). Dictionary use in foreign language writing exams: Impact and implications. Amsterdam, Netherlands/Philadelphia, PA: John Benjamins. http://dx.doi.org/10.1075/lllt.22
  14. East, M. (2012). Task-based language teaching from the teachers’ perspective: Insights from New Zealand. Amsterdam, Netherlands/Philadelphia, PA: John Benjamins. http://dx.doi.org/10.1075/tblt.3
  15. East, M. (2013, August 24). The new NCEA ‘interact’ standard: Teachers’ thinking about assessment reform. Paper presented at the New Zealand Association of Language Teachers (NZALT) Auckland/Northland Region language seminar, Auckland.
  16. East, M. (2014, July 6–9). To interact or not to interact? That is the question. Keynote address at the New Zealand Association of Language Teachers (NZALT) Biennial National Conference, Languages Give You Wings, Palmerston North, NZ.
  17. East, M., & Scott, A. (2011a). Assessing the foreign language proficiency of high school students in New Zealand: From the traditional to the innovative. Language Assessment Quarterly, 8(2), 179–189. http://dx.doi.org/10.1080/15434303.2010.538779
  18. East, M., & Scott, A. (2011b). Working for positive washback: The standards-curriculum alignment project for Learning Languages. Assessment Matters, 3, 93–115.
  19. Hinkel, E. (2010). Integrating the four skills: Current and historical perspectives. In R. Kaplan (Ed.), The Oxford handbook of applied linguistics (2nd ed., pp. 110–123). Oxford, England: Oxford University Press. http://dx.doi.org/10.1093/oxfordhb/9780195384253.013.0008
  20. Hu, G. (2013). Assessing English as an international language. In L. Alsagoff, S. L. McKay, G. Hu, & W. A. Renandya (Eds.), Principles and practices for teaching English as an international language (pp. 123–143). New York, NY: Routledge.
  21. Koefoed, G. (2012). Policy perspectives from New Zealand. In M. Byram & L. Parmenter (Eds.), The Common European framework of reference: The globalisation of language education policy (pp. 233–247). Clevedon, England: Multilingual Matters.
  22. Kramsch, C. (1986). From language proficiency to interactional competence. The Modern Language Journal, 70(4), 366–372. http://dx.doi.org/10.1111/j.1540-4781.1986.tb05291.x
  23. Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174. http://dx.doi.org/10.2307/2529310
  24. Lazaraton, A. (1995). Qualitative research in applied linguistics: A progress report. TESOL Quarterly, 29(3), 455–472. http://dx.doi.org/10.2307/3588071
  25. Lazaraton, A. (2002). A qualitative approach to the validation of oral language tests. Cambridge, England: Cambridge University Press.
  26. Leaper, D. A., & Riazi, M. (2014). The influence of prompt on group oral tests. Language Testing, 31(2), 177–204. http://dx.doi.org/10.1177/0265532213498237
  27. Lewkowicz, J. (2000). Authenticity in language testing: Some outstanding questions. Language Testing, 17(1), 43–64. http://dx.doi.org/10.1177/026553220001700102
  28. Luoma, S. (2004). Assessing speaking. Cambridge, England: Cambridge University Press. http://dx.doi.org/10.1017/cbo9780511733017
  29. Mangubhai, F., Marland, P., Dashwood, A., & Son, J. B. (2004). Teaching a foreign language: One teacher’s practical theory. Teaching and Teacher Education, 20, 291–311. http://dx.doi.org/10.1016/j.tate.2004.02.001
  30. McNamara, T. (1997). ‘Interaction’ in second language performance assessment: Whose performance? Applied Linguistics, 18(4), 446–466. http://dx.doi.org/10.1093/applin/18.4.446
  31. Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco, CA: Jossey-Bass.
  32. Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.
  33. Ministry of Education. (2014b). Resources for internally assessed achievement standards. Retrieved from http://ncea.tki.org.nz/Resources-for-Internally-Assessed-Achievement-Standards
  34. NZQA. (2014e). NCEA subject resources. Retrieved from http://www.nzqa.govt.nz/qualifications-standards/qualifications/ncea/subjects/
  35. Pardo-Ballester, C. (2010). The validity argument of a web-based Spanish listening exam: Test usefulness evaluation. Language Assessment Quarterly, 7(2), 137–159. http://dx.doi.org/10.1080/15434301003664188
  36. Poehner, M. (2008). Dynamic assessment: A Vygotskian approach to understanding and promoting L2 development. New York, NY: Springer.
  37. Scott, A., & East, M. (2009). The standards review for learning languages: How come and where to? The New Zealand Language Teacher, 39, 28–33.
  38. Scott, A., & East, M. (2012). Academic perspectives from New Zealand. In M. Byram & L. Parmenter (Eds.), The Common European framework of reference: The globalisation of language education policy (pp. 248–257). Clevedon, England: Multilingual Matters.
  39. Shohamy, E. (2001). The social responsibility of the language testers. In R. L. Cooper (Ed.), New perspectives and issues in educational language policy (pp. 113–130). Amsterdam, Netherlands/Philadelphia, PA: John Benjamins Publishing Company. http://dx.doi.org/10.1075/z.104.09sho
  40. Shohamy, E. (2007). Tests as power tools: Looking back, looking forward. In J. Fox, M. Wesche, D. Bayliss, L. Cheng, C. E. Turner, & C. Doe (Eds.), Language testing reconsidered (pp. 141–152). Ottawa, Canada: University of Ottawa Press.
  41. Spolsky, B. (1985). The limits of authenticity in language testing. Language Testing, 2(1), 31–40. http://dx.doi.org/10.1177/026553228500200104

Copyright information

© Springer Science+Business Media Singapore 2016

Authors and Affiliations

  • Martin East
  1. Faculty of Education and Social Work, The University of Auckland, Auckland, New Zealand
