Utilizing Qualitative Methods for Assessment
Introduction
In a state‐of‐the‐art paper published in Language Testing, Bachman (2000) argues that the field of language testing has shown ample evidence of maturity over the last quarter century—in practical advances such as computer‐based assessment, in our understanding of the many factors involved in performance testing, and in a continuing concern over ethical issues in language assessment. However, an equally important methodological development over just the last fifteen years has been the introduction of qualitative research methodologies to design, describe, and validate language tests. That is, many language testers have come to recognize the limitations of traditional statistical methods for language assessment research, and have come to value these innovative methodologies as a means by which both the assessment process and the product may be understood. In what follows, I discuss a number of notable studies that use qualitative methods for assessment, with particular...
Keywords
Discourse Analysis; Proficiency Level; Conversation Analysis; Language Assessment; Oral Assessment

References
- Atkinson, J.M. and Heritage, J. (eds.): 1984, Structures of Social Action: Studies in Conversation Analysis, Cambridge University Press, Cambridge.
- Bachman, L.F.: 2000, ‘Modern language testing at the turn of the century: Assuring that what we count counts’, Language Testing 17, 1–42.
- Banerjee, J. and Luoma, S.: 1997, ‘Qualitative approaches to test validation’, in C. Clapham and D. Corson (eds.), Encyclopedia of Language and Education, Volume 7: Language Testing and Assessment, Kluwer, Amsterdam.
- Berwick, R. and Ross, S.: 1996, ‘Cross-cultural pragmatics in oral proficiency interview strategies’, in M. Millanovic and N. Saville (eds.), Performance Testing, Cognition and Assessment, Cambridge University Press, Cambridge, 34–54.
- Brown, A.D.: 2003, ‘Interviewer variation and the co‐construction of speaking proficiency’, Language Testing 20, 1–25.
- Celce‐Murcia, M.: 1998, ‘Preface’, in R. Young and A.W. He (eds.), Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency, John Benjamins, Philadelphia.
- Davis, L.: 2005, Paired oral assessment in the EFL classroom: Insights from quantitative and discourse analysis, Paper presented at the Asia TEFL Conference, Beijing, China.
- Dimitrova‐Galaczi, E.: 2004, Peer–Peer Interaction in a Paired Speaking Test: The Case of the First Certificate in English, Unpublished PhD dissertation, Teachers College, Columbia University, New York.
- Green, A.: 1998, Verbal Protocol Analysis in Language Testing Research: A Handbook, Cambridge University Press, Cambridge.
- Katona, L.: 1998, ‘Meaning negotiation in the Hungarian oral proficiency interview’, in R. Young and A.W. He (eds.), Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency, John Benjamins, Philadelphia.
- Lazaraton, A.: 1991, A Conversation Analysis of Structure and Interaction in the Language Interview, Unpublished PhD dissertation, University of California, Los Angeles.
- Lazaraton, A.: 2002, A Qualitative Approach to the Validation of Oral Language Tests, Cambridge University Press, Cambridge.
- Lazaraton, A.: 2003, ‘Evaluative criteria for qualitative research in applied linguistics: Whose criteria and whose research?’, Modern Language Journal 87, 1–12.
- Lumley, T. and Brown, A.: 2005, ‘Research methods in language testing’, in E. Hinkel (ed.), Handbook of Research in Second Language Teaching and Learning, Lawrence Erlbaum, Mahwah, NJ.
- McNamara, T.F., Hill, K., and May, L.: 2002, ‘Discourse and assessment’, Annual Review of Applied Linguistics 22, 221–242.
- Orr, M.: 2002, ‘The FCE speaking test: Using rater reports to help interpret test scores’, System 30, 143–154.
- O'Sullivan, B., Weir, C.J., and Saville, N.: 2002, ‘Using observation checklists to validate speaking‐test tasks’, Language Testing 19, 33–56.
- Richards, K.: 2003, Qualitative Inquiry in TESOL, Palgrave Macmillan, Hampshire.
- Ross, S.: 1992, ‘Accommodative questions in oral proficiency interviews’, Language Testing 9, 173–186.
- Schegloff, E.A., Koshik, I., Jacoby, S., and Olsher, D.: 2002, ‘Conversation analysis and applied linguistics’, Annual Review of Applied Linguistics 22, 3–31.
- Séror, J.: 2005, ‘Computers and qualitative data analysis: Paper, pens, and highlighters vs. screen, mouse, and keyboard’, TESOL Quarterly 39, 321–328.
- Shohamy, E.: 2001, The Power of Tests: A Critical Perspective on the Use of Language Tests, Pearson Education, New York.
- Taylor, L.: 2005, ‘Linguistic diversity: Implications for testing and assessment’, Paper presented at the British Association of Applied Linguistics Conference, Bristol.
- van Lier, L.: 1989, ‘Reeling, writhing, drawling, stretching, and fainting in coils: Oral proficiency interviews as conversation’, TESOL Quarterly 23, 489–508.
- Yoshida‐Morise, Y.: 1998, ‘The use of communication strategies in language proficiency interviews’, in R. Young and A.W. He (eds.), Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency, John Benjamins, Philadelphia.
- Young, R. and Halleck, G.B.: 1998, ‘“Let them eat cake!”: Or how to avoid losing your head in cross‐cultural conversations’, in R. Young and A.W. He (eds.), Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency, John Benjamins, Philadelphia.
- Young, R. and He, A.W. (eds.): 1998, Talking and Testing: Discourse Approaches to the Assessment of Oral Proficiency, John Benjamins, Philadelphia.