Implementing a Learning-Oriented Approach within English Language Assessment in Hong Kong Schools: Practices, Issues and Complexities

  • Liz Hamp-Lyons


It is becoming increasingly well understood that every educational innovation thrives or flounders within a social-political ideological context (Henrichsen 1989; Kellaghan & Greaney 1992; Wall 2005). Hong Kong has for many years had a traditional norm-referenced examination system for school placement, promotion and exit (for a historical overview of the public examination system in Hong Kong, see Choi & Lee 2009). While this system is congruent with a traditional Chinese cultural heritage context, educators have long felt there is something fundamentally flawed about a system in which students may fail every school subject in which they take a formal exam. In the English subject, for example (by no means one with the worst results), between 1997 and 2007 (the last year of norm-referenced results reporting) 41–60% of students failed the Syllabus A English and 60–78% failed the more difficult Syllabus B English. Steps have been taken at several points in the past 30 years to reform the educational system to better fit the needs of school students, and to ensure that the right individuals enter tertiary education and that sound educational opportunities are available to those not entering tertiary education (King 1994; Qian 2008).


Keywords: Private tutoring · English teacher · Shadow education · Language assessment · Assessment innovation



  1. Andrews, S. & Fullilove, J. (1994). Assessing spoken English in public examinations — why and how? In J. Boyle & P. Falvey (Eds.), English language testing in Hong Kong (pp. 57–86). Hong Kong: Chinese University Press.
  2. Berry, R. (2008). Assessment for learning. Hong Kong: Hong Kong University Press.
  3. Black, P. & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7–68.
  4. Boyle, J. & Falvey, P. (Eds.). (1994). English language testing in Hong Kong. Hong Kong: Chinese University Press.
  5. Bray, M. (2007). The shadow education system: Private tutoring and its implications for planners (Fundamentals of Educational Planning 61, 2nd ed.). Paris: UNESCO International Institute for Educational Planning.
  6. Bray, M. (2012). Wolves lurking in the shadows of education. South China Morning Post, 19 July 2012.
  7. Carless, D. (2011). From testing to productive student learning. New York: Routledge.
  8. Chan, C. & Bray, M. (2014). Marketized private tutoring as a supplement to regular schooling: Liberal Studies and the shadow sector in Hong Kong secondary education. Journal of Curriculum Studies, 46(3), 361–388.
  9. Chen, Q. & Klenowski, V. (2009). Assessment and curriculum reform in China: The College English Test and tertiary English as a foreign language education. In Proceedings of the 2008 AARE International Education Conference, 30 November–4 December 2008, Queensland University of Technology, Brisbane.
  10. Cheung, D. (2001). School-based assessment in public examinations: Identifying the concerns of teachers. Educational Journal, 29(2), 105–123.
  11. Choi, C-c. & Lee, C. (2009). Developments of English language assessment in public examinations in Hong Kong. In L. Cheng & A. Curtis (Eds.), English language assessment and the Chinese learner (pp. 60–76). New York: Routledge.
  12. Curriculum Development Council (1999). English language teaching syllabus and the Senior Secondary curriculum. Hong Kong: Hong Kong Education Bureau.
  13. Davison, C. (2007). Views from the chalk face: English language school-based assessment in Hong Kong. Language Assessment Quarterly, 4(1), 37–68.
  14. Davison, C. (2013). Innovation in assessment: Common misconceptions and problems. In K. Hyland & L. L. C. Wong (Eds.), Innovation and change in English language education (pp. 263–275). Abingdon: Routledge.
  15. Davison, C. & Hamp-Lyons, L. (2009). The Hong Kong Certificate of Education: School-based assessment reform in Hong Kong English language education. In L. Cheng & A. Curtis (Eds.), English language assessment and the Chinese learner (pp. 248–262). New York: Routledge.
  16. Ellis, R. (2009). The differential effects of three types of task planning on the fluency, complexity, and accuracy in L2 oral production. Applied Linguistics, 30(4), 474–509.
  17. Elman, B. (2000). A cultural history of civil examinations in late imperial China. Los Angeles: University of California Press.
  18. Evans, S. (2000). Hong Kong's new English language policy in education. World Englishes, 19(2), 184–204.
  19. Fok, P. K., Kennedy, K., Chan, K. S. J., & Yu, W. M. (2006). Integrating assessment of learning and assessment for learning in Hong Kong public examinations: Rationales and realities of introducing school-based assessment. Paper presented at the 32nd Annual Conference of the International Association for Educational Assessment, Singapore, 21–26 May 2006. Accessed 14 January 2013.
  20. Gan, Z-d., Davison, C. & Hamp-Lyons, L. (2009). Topic negotiation in peer group oral assessment situations: A conversation analytic approach. Applied Linguistics, 30(3), 315–334.
  21. Hall, G. E. & Hord, S. M. (2006). Implementing change: Patterns, principles, and potholes. Boston, MA: Allyn and Bacon.
  22. Hamp-Lyons, L. (2006). Fairness as an issue in school-based assessment. Inaugural seminar series on English language school-based assessment: Integrating theory and practice, 9 January 2006. Hong Kong University.
  23. Hamp-Lyons, L. (2007). The impact of testing practices on teaching: Ideologies and alternatives. In J. Cummins & C. Davison (Eds.), The international handbook of English language teaching, Vol. 1 (pp. 487–504). Norwell, MA: Springer.
  24. Henrichsen, L. (1989). Diffusion of innovations in English language teaching: The ELEC effort in Japan, 1956–1968. Westport, CT: Greenwood Press.
  25. Hong Kong Examinations and Assessment Authority. (2006). 2007 HKCEE English Language: Introduction to the School-based Assessment Component.
  26. IBM. (2003). Strategic review of Hong Kong Examinations and Assessment Authority: Final report. Hong Kong: Examinations Authority. Accessed 10 September 2007.
  27. Kellaghan, T. & Greaney, V. (1992). Using examinations to improve education: A study in fourteen African countries. Washington, DC: World Bank.
  28. Kennedy, K. (2013). High-stakes school-based assessment and cultural values: Beyond issues of validity. Keynote address, Seminar on 'School based assessment: Prospects and realities in Asian contexts', 3 June 2013, Kuala Lumpur, Malaysia. Accessed 11 November 2014.
  29. King, R. (1994). Historical survey of English language testing in Hong Kong. In J. Boyle & P. Falvey (Eds.), English language testing in Hong Kong (pp. 3–30). Hong Kong: Chinese University Press.
  30. Lee, C. (2008). The beneficial washback of the introduction of a school-based assessment component on the speaking performance of students. Paper presented at the 34th IAEA Conference, Cambridge, September 2008. Accessed 13 August 2014.
  31. Lee, I. (2005). Latest initiative adds to burden. South China Morning Post, 7 May 2005. Accessed 1 April 2008.
  32. Nitta, R. & Nakatsuhara, F. (2014). A multifaceted approach to investigating pre-task planning effects on paired oral test performance. Language Testing, 31(2), 147–175.
  33. Puntel Xhafaj, D. C., Muck, K. E. & de Souza Ferraz D'Ely, R. C. (2011). The impact of individual and peer planning on the oral performance of advanced learners of English as a foreign language. Linguagem & Ensino, Pelotas, 14(1), 39–65.
  34. Qian, D. D. (2008). English language testing in Hong Kong: A survey of practices, developments and issues. Language Testing, 25(1), 85–110.
  35. Qian, D. D. (2014). School-based English language assessment as a high-stakes examination component in Hong Kong: Insights of frontline assessors. Assessment in Education: Principles, Policy & Practice, 21(3), 251–270.
  36. Skehan, P. & Foster, P. (1997). Task type and task processing conditions as influences on foreign language performance. Language Teaching Research, 1(3), 185–211.
  37. Wall, D. (2005). The impact of high-stakes examinations on classroom teaching: A case study using insights from testing and innovation theory (Studies in Language Testing 22). Cambridge: Cambridge ESOL and Cambridge University Press.
  38. Yuan, F. & Ellis, R. (2003). The effects of pre-task planning and on-line planning on fluency, complexity and accuracy in L2 monologic oral production. Applied Linguistics, 24(1), 1–27.
  39. Yung, B. H. W. (2002). Same assessment, different practice: Professional consciousness as a determinant of teachers' practice in a school-based assessment scheme. Assessment in Education: Principles, Policy & Practice, 9(1), 97–117.

Copyright information

© Liz Hamp-Lyons 2016
