Developing an adaptive tool to select, plan, and scaffold oral assessment tasks for undergraduate courses
The increased linguistic and cultural diversity of undergraduate classrooms at English-language institutions has imposed additional pedagogical and assessment challenges on instructors, many of whom lack the knowledge needed to design classroom activities and assessments that are fair to all students regardless of their background and language abilities. The development of an adaptive instrument for instructors who do not specialize in English language learning represents an attempt to adjust instructional practices to meet this need. This paper reports on the development of an instrument that undergraduate instructors can use to plan their courses at universities where English is the language of instruction. The instrument's intended use is illustrated through an example involving the planning of an interdisciplinary undergraduate course. To build this adaptive tool, a taxonomy describing the relevant components of assessments that involve oral communication was developed and externally reviewed. The questions used in the instrument were then developed and piloted with a group of university undergraduate instructors, after which the instrument was further refined. Although piloting revealed an increase in instructor awareness of how language abilities relate to assessment, further research is needed to determine the extent to which this tool affects instructors' classroom or assessment practices.
Keywords: Assessment · Planning · Ethics · Undergraduates · Teaching practices · Adaptive instructional tools · English language learner

Abbreviations:
- ILTA: International Language Testing Association
- IELTS: International English Language Testing System
- TOEFL iBT: Test of English as a Foreign Language Internet-Based Test
- ETS: Educational Testing Service
- TOEIC: Test of English for International Communication
We would like to thank the reviewers for their guidance. We would also like to thank our instructor, classmates, and participants for their contributions and feedback. The first author held W. Garfield Weston and Walter C. Sumner Memorial Fellowships.