A preliminary analysis of teacher perceptions of the effects of NAPLAN on pedagogy and curriculum
This paper reports preliminary survey findings on Western Australian and South Australian teacher perceptions of the impact of NAPLAN on curriculum and pedagogy in their classrooms and schools. The paper examines how teachers perceive the effects of NAPLAN on curriculum and pedagogy, and whether these perceptions are mediated by the teacher’s gender, the socioeconomic status of the school, the State and the school system in which the teacher works. Teachers report that they are either choosing or being instructed to teach to the test, that this results in less time being spent on other curriculum areas, and that these effects contribute negatively to the class environment and the engagement of students. This largely agrees with a body of international research suggesting that high-stakes literacy and numeracy tests often result in unintended consequences such as a narrowed curriculum focus, a return to teacher-centred instruction and a decrease in motivation. Analysis suggests a relationship between participant responses on the effect of NAPLAN on curriculum and the State in which the teacher taught, the socioeconomic status of the school and the school system in which they were employed (State, Catholic, and Independent).
Keywords: NAPLAN · High-stakes testing · Teacher perceptions · Curriculum and pedagogy
This research has been made possible by a grant from the ARC.