Evolving data use policy in Trinidad and Tobago: the search for actionable knowledge on educational improvement in a small island developing state

Published in Educational Assessment, Evaluation and Accountability

Abstract

This paper presents a single-country case study of the use of large-scale assessment (LSA) data to generate actionable knowledge at school and system levels. Actionable knowledge is data-informed insight into school and system processes that can be used to direct corrective action. The analysis is framed by the country’s evolving national policy on data use for educational improvement between 1990 and 2013. Trinidad and Tobago first participated in international large-scale assessments (ILSAs) in 1991 and, in 2004, developed a centralized system of national large-scale assessments (NLSAs). Analyses of both datasets consistently pointed to low quality and high inequality as the main actionable issues in the education system. NLSA data also hinted at notable variation in performance across schools and education districts. Analyses for and of policy indicate the need for multiple school performance measures to better inform site-based, formative action. Over the period, actionable knowledge appears to have had greater impact at the school level, with evidence being used by some low-performing schools to improve. At the system level, however, the frequent non-use and misuse of actionable knowledge suggest the need to promote and strengthen structures and processes for evidence-informed policy-making.

Notes

  1. The Global Relative Risk was 2.7, described as a moderate risk of low reading achievement (Trong 2009, p. 127); the generic form of a relative risk is sketched after these notes.

  2. The API was developed in 2007 and used internally from 2007 to 2010; however, it was only formally adopted by the TTMoE in 2010.

  3. In rural areas additional lunches are provided, and in Tobago, all students participate in the programme.

  4. The TTMoE used the 30% score criterion but then made some adjustments based on other data sources, including the number of students sitting the 11+.
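
As a minimal sketch of the statistic in note 1 (the exact grouping and achievement benchmark used for the PIRLS Global Relative Risk follow Trong (2009) and are not reproduced here), a relative risk compares the probability of low achievement in a focal group with the corresponding probability in a reference group:

$$\mathrm{RR} = \frac{P(\text{low achievement} \mid \text{focal group})}{P(\text{low achievement} \mid \text{reference group})}$$

Read this way, a value of 2.7 indicates that students in the focal group were roughly 2.7 times as likely as students in the reference group to be low achievers in reading.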

References

  • Anderson, S., Leithwood, K., & Strauss, T. (2010). Leading data use in schools: organizational conditions and practices at the school and district levels. Leadership and Policy in Schools, 9(3), 292–327.

  • Antonacopoulou, E. P. (2007). Actionable knowledge. In S. Clegg & J. Bailey (Eds.), International encyclopaedia of organization studies (pp. 14–17). London: Sage.

  • Argyris, C. (1993). Knowledge for action: a guide to overcoming barriers to organizational change. San Francisco: Jossey-Bass.

  • Argyris, C. (1996). Actionable knowledge: design causality in the service of consequential theory. The Journal of Applied Behavioral Science, 32(4), 390–406.

  • Bacchus, M. K. (1989). Education, equity and cultural diversity in ‘plural’ societies. Directions Journal of Educational Studies, 11, 12–32.

  • Bacchus, M. K. (2008). The education challenges facing small nation states in the increasingly competitive global economy of the twenty‐first century. Comparative Education, 44(2), 127–145.

  • Bacchus, K., & Brock, C. (Eds.). (1987). The challenge of scale: educational development in the small states of the commonwealth. London: Commonwealth Secretariat.

  • Baldacchino, G., & Bertram, G. (2009). The beak of the finch: insights into the economic development of small economies. The Round Table, 98(401), 141–160.

  • Baldacchino, G., & Bray, M. (2001). Special issue on human resource strategies for small states. International Journal of Educational Development, 21(3), 203–204.

  • Ball, S. J. (1994). Education reform: a critical and post-structural approach. Buckingham: Open University Press.

  • Ball, S. J. (2006). Education policy and social class. The selected works of Stephen J. Ball. London: Routledge.

  • Batteson, C., & Ball, S. J. (1995). Autobiographies and interviews as means of ‘access’ to elite policy making in education. British Journal of Educational Studies, 43(2), 201–216.

  • Bellei, C. (2013). Supporting instructional improvement in low-performing schools to increase students’ academic achievement. The Journal of Educational Research, 106(3), 235–248.

  • Benavot, A. (2012). Policies toward quality education and student learning: constructing a critical perspective. Innovation: European Journal of Social Science Research, 25(1), 67–77.

  • Benveniste, L. (1999). The politics of student testing: A comparative analysis of national assessment systems in southern cone countries. (Unpublished doctoral thesis). Stanford University, Palo Alto, CA.

  • Berne, R., & Stiefel, L. (1994). Measuring equity at the school level: the finance perspective. Educational Evaluation and Policy Analysis, 16(4), 405–421.

  • Best, M., Knight, P., Lietz, P., Lockwood, C., Nugroho, D., & Tobin, M. (2013). The impact of national and international assessment programmes on education policy, particularly policies regarding resource allocation and teaching and learning practices in developing countries. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London.

  • Black, P., & Wiliam, D. (2007). Large-scale assessment systems: design principles drawn from international comparisons. Measurement: Interdisciplinary Research and Perspectives, 5(1), 1–53.

  • Blake, S. C., & Ottoson, J. M. (2009). Knowledge utilization: Implications for evaluation. In J. M. Ottoson & P. Hawe (Eds.), Knowledge utilization, diffusion, implementation, transfer, and translation: Implications for evaluation. New Directions for Evaluation, 124, 21–34.

  • Bouillon, C. P., & Buvinic, M. (2003). Inequality, exclusion and poverty in Latin America and the Caribbean: Implications for development. New York: Inter-American Development Bank. Retrieved January 14, 2014, from http://www.iadb.org/sds/doc/soc-IDB-SocialCohesion-E.pdf.

  • Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research Journal, 9(2), 27–40.

  • Bradley, S., & Taylor, J. (2002). The effect of the quasi-market on the efficiency-equity trade-off in the secondary school sector. Bulletin of Economic Research, 54, 295–314.

  • Briguglio, L. (1995). Small island developing states and their economic vulnerabilities. World Development, 23(9), 1615–1632.

  • Brookhart, S. (2009). The many meanings of multiple measures. Educational Leadership, 67(3), 6–12.

  • Brown, C. (2014). Making evidence matter: a new perspective for evidence-informed policy making in education. London: IOE Press.

  • Brown, L., & Conrad, D. A. (2007). School leadership in Trinidad and Tobago: the challenge of context. Comparative Education Review, 51(2), 181–201.

  • Brown, L., Bristol, L., De Four-Babb, J., & Conrad, D. (2013). National tests and diagnostic feedback: what say teachers in Trinidad and Tobago? The Journal of Educational Research, 107(3), 241–251.

  • Burns, T., & Schuller, T. (2007). The evidence agenda. In OECD/CERI (Ed.), Evidence in education: linking research and policy (pp. 15–32). Paris: OECD.

  • Campbell, C., & Levin, B. (2009). Using data to support educational improvement. Educational Assessment, Evaluation and Accountability, 21, 47–65.

  • Cao, L. (2012). Actionable knowledge discovery and delivery. WIREs Data Mining and Knowledge Discovery, 2(2), 149–163.

  • Cassen, R., & Kingdon, G. (2007). Tackling low educational achievement. York: Joseph Rowntree Foundation.

  • Castelli, L., Ragazzi, S., & Crescentini, A. (2012). Equity in education: a general overview. Procedia-Social and Behavioural Sciences, 69, 2243–2250.

  • Chay, K. Y., McEwan, P. J., & Urquiola, M. (2005). The central role of noise in evaluating interventions that use test scores to rank schools. American Economic Review, 95(4), 1237–1258.

  • Chester, M. D. (2005). Making valid and consistent inferences about school effectiveness from multiple measures. Educational Measurement: Issues and Practice, 24(4), 40–52.

  • Choi, K., Goldschmidt, P., & Yamashiro, K. (2005). Exploring models of school performance: From theory to practice. In J. L. Herman & E. H. Haertel (Eds.), Uses and misuses of data for educational accountability and improvement (NSSE Yearbook, Vol. 104, Part 2, pp. 119–146). Chicago: National Society for the Study of Education.

  • Coladarci, T. (2006). School size, student achievement, and the “power rating” of poverty: substantive finding or statistical artifact? Education Policy Analysis Archives, 14(28).

  • Connelly, C. (2015). Catholic Church seeks $21 million for education project. Trinidad Newsday. Retrieved March 2014 from http://www.newsday.co.tt/news/0,210284.html.

  • Crespo, M., Soares, J., & de Mello e Souza, A. D. (2000). The Brazilian national evaluation system of basic education: context, process, and impact. Studies in Educational Evaluation, 26(2), 105–125.

  • Crossley, M., Bray, M., & Packer, S. (2011). Education in small states: policies and priorities. London: Commonwealth Secretariat.

  • Cullen, J., & Reback, R. (2006). Tinkering toward accolades: school gaming under a performance accountability system. In T. Gronberg & D. Jansen (Eds.), Improving school accountability: check-ups or choice (Advances in applied microeconomics, Vol. 14, pp. 1–34). Amsterdam: Elsevier Science.

  • De Lisle, J. (2006). Dragging eleven-plus measurement practice into the fourth quadrant: the Trinidad and Tobago SEA as a gendered sieve. Caribbean Curriculum, 13, 89–101.

  • De Lisle, J., Smith, P., Keller, C., Jules, V., Lochan, S., Pierre, P., Lewis, Y., Mc David, P., & Seunarinesingh, K. (2008). In the context of Trinidad and Tobago, how do we identify schools that are succeeding or failing in the midst of complex and challenging circumstances? In L. Quamina-Aiyejina (Ed.), Reconceptualising the agenda for education in the Caribbean: Conference proceedings of the 2007 Biennial Cross-Campus Conference in Education, April 2007 (pp. 547–562). St. Augustine: UWI.

  • De Lisle, J., Smith, P., & Jules, V. (2010). Evaluating the geography of gendered achievement using large scale-assessment data from the primary school system of the Republic of Trinidad and Tobago. International Journal of Educational Development, 30(4), 405–417.

  • De Lisle, J., Smith, P., Keller, C., & Jules, V. (2012). Differential outcomes in high stakes eleven plus testing: gender, assessment design, and geographic location in secondary school placement within Trinidad and Tobago. Assessment in Education: Principles, Policy, & Practice, 19(1), 45–64.

  • De Lisle, J., Mohammed, R., & Lee-Piggott, R. (2014). Explaining Trinidad and Tobago’s system response to international assessment data. Journal of Educational Administration, 52(4), 487–508.

  • di Gropello, E. (2003). Monitoring educational performance in the Caribbean: working paper no. 6. Washington: World Bank.

  • Elley, W. B. (1992). How in the world do students read? IEA study of reading literacy. The Hague: International Association for the Evaluation of Educational Achievement.

  • Elstad, E. (2009). Schools which are named, shamed and blamed by the media: school accountability in Norway. Educational Assessment, Evaluation and Accountability, 21(2), 173–189.

  • Ferrer, G. (2006). Educational assessment systems in Latin America: current practice and future challenges. Washington: Partnership for Educational Revitalization in the Americas.

  • Firestone, W., Schorr, R., & Monfils, L. (Eds.). (2004). The ambiguity of teaching to the test: standards, assessment, and educational reform. Mahwah: Lawrence Erlbaum.

  • Foley, B., & Goldstein, H. (2012). Measuring success. London: The British Academy.

  • Goldring, E., & Berends, M. (2009). Leading with data: pathways to improve your school. Thousand Oaks: Corwin Press.

  • Goldschmidt, P., Roschewski, P., Choi, K., Auty, W., Hebbler, S., Blank, R., & Williams, A. (2005). Policymakers’ guide to growth models for school accountability: how do accountability models differ? Washington: Council of Chief State School Officers.

  • Goldstein, H. (2001). Using pupil performance data for judging schools and teachers: scope and limitations. British Educational Research Journal, 27(4), 433–442.

  • Gordon, I., Lewis, J., & Young, K. (1977). Perspectives on policy analysis. Public Administration Bulletin, 25, 26–30.

  • Government of the Republic of Trinidad and Tobago. (2012). Education sector strategic plan 2011–2015. Port of Spain: Ministry of Education.

  • Government of the Republic of Trinidad & Tobago. (1993). Education policy paper (1993–2003). Port of Spain: Ministry of Education.

  • Government of the Republic of Trinidad & Tobago. (2002). Strategic plan of the ministry of education. Port of Spain: Ministry of Education.

  • Government of the Republic of Trinidad & Tobago (2004). Report on the National Tests 2004. Port of Spain: Division of Educational Research & Evaluation. Retrieved February 07, 2014 from http://www.moe.gov.tt/media_pdfs/publications/natioal_test_report_2004/National%20Test%202004%20Body%20of%20Report.pdf.

  • Government of the Republic of Trinidad & Tobago. (2005). Report on national tests 2005. Port of Spain: Division of Educational Research & Evaluation.

  • Government of the Republic of Trinidad & Tobago. (2007). Vision 2020 draft national strategic plan. Port of Spain: Ministry of Planning.

  • Government of the Republic of Trinidad & Tobago. (2010). Draft report on national tests 2010. Port of Spain: Division of Educational Research & Evaluation.

  • Government of the Republic of Trinidad & Tobago. (2011a). Medium term policy framework 2011–2014. Port of Spain: Ministry of Planning and the Economy.

  • Government of the Republic of Trinidad & Tobago. (2011b). Public sector investment programme. Port of Spain: Ministry of Planning and the Economy.

  • Government of the Republic of Trinidad & Tobago (2014). Division of educational research and evaluation. Retrieved March 12, 2014 from http://moe.gov.tt/divisions_DERE.html.

  • Gregg, K. (2011). A document analysis of the National Association for the Education of Young Children’s developmentally appropriate practice position statement: what does it tell us about supporting children with disabilities? Contemporary Issues in Early Childhood, 12(2), 175–186.

  • Grek, S. (2009). Governing by numbers: the PISA ‘effect’ in Europe. Journal of Education Policy, 24(1), 23–37.

  • Gulliford, M. C., Mahabir, D., Rocke, B., Chinn, S., & Rona, R. J. (2002). Free school meals and children’s social and nutritional status in Trinidad and Tobago. Public Health Nutrition, 5(5), 625–630.

  • Gulson, K. N., & Symes, C. (Eds.). (2007). Spatial theories of education: policy and geography matters. New York: Routledge.

  • Hamilton, L. S., Schwartz, H. L., Stecher, B. M., & Steele, J. L. (2013). Improving accountability through expanded measures of performance. Journal of Educational Administration, 51(4), 453–475.

  • Harvey, W. S. (2011). Strategies for conducting elite interviews. Qualitative Research, 11(4), 431–441.

  • Head, B. W. (2008). Wicked problems in public policy. Public Policy, 3, 101–118.

  • Henderson‐Montero, D., Julian, M. W., & Yen, W. M. (2003). Multiple measures: alternative design and analysis models. Educational Measurement: Issues and Practice, 22(2), 7–12.

  • Heritage, M., & Yeagley, R. (2005). Data use and school improvement: challenges and prospects. In J. L. Herman & E. Haertel (Eds.), Uses and misuses of data for educational accountability and improvement: 104th yearbook of the National Society for the Study of Education, Part 2 (pp. 320–339). Malden: Blackwell.

  • Honig, M. I., & Coburn, C. E. (2008). Evidence-based decision making in school district central offices: toward a research agenda. Educational Policy, 22(4), 578–608.

  • Ikemoto, G. S., & Marsh, J. A. (2007). Cutting through the data-driven mantra: different conceptions of data-driven decision making. In P. A. Moss (Ed.), Evidence and decision making (pp. 105–131). Malden: Wiley-Blackwell.

  • Ingram, D., Seashore Louis, K., & Schroeder, R. (2004). Accountability policies and teacher decision making: barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

  • Isaac, A. (2001). Education reform in the Eastern Caribbean: Implications of a policy and decision-making program by an external donor. Unpublished doctoral dissertation, McGill University, Montreal.

  • Kairi Consultants. (2007). 2005 Analysis of the Trinidad and Tobago survey of living conditions. Port of Spain: Ministry of Social Development.

  • Kelly, A., & Downey, C. (2011). Using effectiveness data for school improvement: developing and utilising metrics. London: Routledge.

  • Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.

  • Ladd, H. F. (2012). Education and poverty: confronting the evidence. Journal of Policy Analysis and Management, 31(2), 203–227.

  • Leithwood, K., & Earl, L. (2000). Educational accountability effects: an international perspective. Peabody Journal of Education, 75(4), 1–18.

  • Loud, M. L., & Mayne, J. (2014). Enhancing evaluation use: insights from internal evaluation units. Thousand Oaks: Sage.

  • Luo, M. (2008). Structural equation modelling for high school principals’ data-driven decision making: an analysis of information use environments. Educational Administration Quarterly, 44(5), 603–634.

  • Mandinach, E. B., & Jackson, S. S. (2012). Transforming teaching and learning through data-driven decision making. Thousand Oaks: Sage.

  • Mandinach, E. B., Honey, M., & Light, D. (2006). A theoretical framework for data driven decision making. Paper presented at the annual meeting of American Educational Research Association, San Francisco. Retrieved November 14, 2008, from http://cct.edc.org/admin/publications/speeches/DataFrame_AERA06.pdf.

  • Masters, G. N., Rowley, G., Ainley, J., & Khoo, S. T. (2008). Reporting and comparing school performances. Commissioned Report: Department of Education, Employment and Workplace Relations (DEEWR). Melbourne: ACER.

  • Mikecz, R. (2012). Interviewing elites addressing methodological issues. Qualitative Inquiry, 18(6), 482–493.

  • Miller, E. (2000). Education for all in the Caribbean in the 1990s: retrospect and prospect. Kingston: UNESCO.

  • Mintrop, H. (2003). The limits of sanctions in low-performing schools: A study of Maryland and Kentucky schools on probation. Education Policy Analysis Archives, 11(3). Retrieved March 3, 2014 from http://epaa.asua.edu/epaa/v11n3.htm.

  • Mintrop, H., & Sunderman, G. L. (2009). Predictable failure of federal sanctions-driven accountability for school improvement—and why we may retain it anyway. Educational Researcher, 38(5), 353–364.

  • Miri, L. K. (2005). Evaluation of the effects of state and federal accountability policies on diverse populations: API, SES and ELs. (Unpublished doctoral thesis). University of Southern California, Los Angeles.

  • Miyako, I., & García, E. (2014). Grade repetition: A comparative study of academic and non-academic consequences. OECD Journal: Economic Studies, 2013(1). Retrieved March 20, 2014, from doi:10.1787/eco_studies-2013-5k3w65mx3hnx.

  • Mizala, A., Romaguera, P., & Urquiola, M. (2007). Socioeconomic status or noise? Tradeoffs in the generation of school quality information. Journal of Development Economics, 84(1), 61–75.

  • Moe, C. (2009). Ministry moves to assist underperforming schools. Trinidad guardian newspaper. Online Edition. Retrieved February 21, 2014 from http://guardian.co.tt/news/general/2009/11/14/ministry-moves-assist-underperforming-schools.

  • Morris, Z. S. (2009). The truth about interviewing elites. Politics, 29(3), 209–217.

  • Moss, P. A. (2013). Validity in action: lessons from studies of data use. Journal of Educational Measurement, 50(1), 91–98.

  • Muijs, D., Harris, A., Chapman, C., Stoll, L., & Russ, J. (2004). Improving schools in socioeconomically disadvantaged areas—a review of research evidence. School Effectiveness and School Improvement, 15(2), 149–175.

  • Mullis, I. V. S., Martin, M. O., Kennedy, A. M., & Foy, P. (2007). PIRLS 2006 International Report: IEA’s progress in International Reading Literacy Study in primary schools in 40 countries. Chestnut Hill: Boston College.

  • Muriel, A., & Smith, J. (2011). On educational performance measures. Fiscal Studies, 32(2), 187–206.

  • Neaves, J. (2012). Religious schools doing worse at SEA. Trinidad Express Newspaper. Retrieved March 5, 2014 from http://www.trinidadexpress.com/news/Religious_schools__doing_worse_at_SEA_-159964785.html.

  • Nudzor, H. P. (2013). Interviewing Ghanaian educational elites: strategies for access, commitment, and engagement. International Journal of Qualitative Methods, 12, 606–623.

  • OECD. (2010). PISA 2009 results: what students know and can do: student performance in reading, mathematics and science (Volume I). Paris: OECD.

  • Ogawa, R. T., & Collom, E. (2000). Using performance indicators to hold schools accountable: implicit assumptions and inherent tensions. Peabody Journal of Education, 75(4), 200–215.

  • Organisation for Economic Co-operation and Development (OECD). (2010a). PISA 2009 results: overcoming social background: equity in learning opportunities and outcomes (Vol. 2). Paris: OECD.

  • Organisation for Economic Co-operation and Development (OECD). (2010b). PISA 2009 results: vol. IV. What makes a school successful? Resources, policies and practice. Paris: OECD.

  • Organisation for Economic Co-operation and Development (OECD). (2011). PISA 2009 at a glance. Paris: OECD.

  • Organisation for Economic Co-operation and Development (OECD). (2012). Equity and quality in education: supporting disadvantaged students and schools. Paris: OECD.

  • Park, G. (2007). California under the microscope: An evaluation of the practical effect of different approaches to accountability. (Unpublished Ed.D. dissertation). University of Southern California, Los Angeles.

  • Pennings, P., Keman, H., & Kleinnijenhuis, J. (2006). Doing research in political science. London: Sage.

  • Piety, P. J. (2013). Assessing the educational data movement. New York: Teachers College Press.

  • Ravela, P. (2005). A formative approach to national assessments: The case of Uruguay. Prospects, 25(1), 21–43.

  • Ravela, P., Arregui, P., Valverde, G., Wolfe, R., Ferrer, G., Martínez, F., Aylwin, M., & Wolff, L. (2008). The educational assessment Latin America needs. Working Paper Series No. 40. Washington, DC: PREAL.

  • Reimers, F. (1999). Educational opportunities for low-income families in Latin America. Prospects, 29(4), 535–549.

  • Reynolds, D. (2010). Failure free education? The past, present and future of school effectiveness and improvement. London: Routledge.

  • Ritzen, J. (2013). International large scale assessments as change agents. In M. von Davier, E. Gonzalez, I. Kirsch, & K. Yamamoto (Eds.), The role of international large-scale assessments: perspectives from technology, economy, and educational research (pp. 13–24). Dordrecht: Springer.

  • Roman Catholic Archdiocese of Port of Spain (2011). Strategic plan 2011–2014. Port of Spain: Author.

  • Schildkamp, K., & Lai, M. K. (2013). Conclusions and a data use framework. In K. Schildkamp, M. K. Lai, & L. Earl (Eds.), Data-based decision making in education: challenges and opportunities (pp. 177–192). Dordrecht: Springer.

  • Schildkamp, K., & Teddlie, C. (2008). School performance feedback systems in the USA and in the Netherlands: a comparison. Educational Research and Evaluation, 14(3), 255–282.

  • Sirju, M. (2009). Catholic primary education needs overhaul Pt 1. Trinidad & Tobago Catholic News. Retrieved March 21, 2014 from http://www.catholicnews-tt.net/joomla/index.php?option=com_content&view=article&id=734%3Aviewpoint020809&Itemid=68.

  • Smith, W. J., & Ngoma-Maema, W. Y. (2003). Education for all in South Africa: developing a national system for quality assurance. Comparative Education, 39(3), 345–365.

  • Spence, J. (2012). Church schools must lead reforms. Trinidad Express. Retrieved February 7, 2014, from http://www.trinidadexpress.com/commentaries/Church_schools_must_lead_reforms-136353608.html.

  • Stewart, W. (2013). Where you come from matters after all, says Gove. Times Educational Supplement. Retrieved March 3, 2014 from http://www.tes.co.uk/article.aspx?storycode=6314568.

  • Stobart, G. (2005). Fairness in multicultural assessment systems. Assessment in Education: Principles, Policy & Practice, 12(3), 275–287.

  • Thomas, V., & Yan, W. (2009). Distribution or opportunities key to development. In D. B. Holsinger & W. J. Jacob (Eds.), Inequality in education (pp. 34–58). Hong Kong: CERC and Springer.

  • Tokman, A. (2002). Evaluation of the P900 program: A targeted education program for underperforming schools. Central Bank of Chile Working Papers, No. 170.

  • Trong, K. L. (2009). Using PIRLS 2006 to measure equity in reading achievement internationally. (Unpublished doctoral thesis). Boston College, Chestnut Hill, MA.

  • Turner, E. O., & Coburn, C. E. (2012). Interventions to promote data use: an introduction. Teachers College Record, 114(11), 1–13.

  • UNESCO. (2007). EFA global monitoring report 2008: education for all by 2015—will we make it? Paris: UNESCO.

  • Vegas, E., & Petrow, J. (2008). Raising student learning in Latin America: the challenge for the 21st century. Washington: World Bank.

  • Verhaeghe, G., Schildkamp, K., Luyten, H., & Valcke, M. (2015). Diversity in school performance feedback systems. School Effectiveness and School Improvement. doi:10.1080/09243453.2015.1017506. Published ahead of print.

  • Visscher, A. J., & Coe, R. (Eds.). (2002). School improvement through performance feedback. Lisse: Swets and Zeitlinger.

  • Visscher, A. J., & Coe, R. (2003). School performance feedback systems: conceptualisation, analysis, and reflection. School Effectiveness and School Improvement, 14(3), 321–349.

  • Wagner, D. A., Lockheed, M., Mullis, I., Martin, M. O., Kanjee, A., Gove, A., & Dowd, A. J. (2012). The debate on learning assessments in developing countries. Compare: A Journal of Comparative and International Education, 42(3), 509–545.

  • Waslander, S., Pater, C., & van der Weide, M. (2010). Markets in education: An analytical review of empirical research on market mechanisms in education. Education Working Papers 52. Paris: OECD.

  • Whetton, C., Twist, E., & Sainsbury, M. (2000). National tests and target setting: Maintaining consistent standards. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA.

  • Willms, J. D. (2003). Student engagement at school: a sense of belonging and participation. Paris: Organisation for Economic Co-Operation and Development.

  • Winkler, D. (2000). Educating the poor in Latin America and the Caribbean: examples of compensatory education. In F. Reimers (Ed.), Unequal schools, unequal chances: the challenges to equal opportunity in the Americas (pp. 112–135). Cambridge: David Rockefeller Centre for Latin American Studies, Harvard University.

  • World Bank (1993). Caribbean region: Access, quality, and efficiency in education. Washington, DC: Author.

  • World Bank. (1995a). Trinidad and Tobago: Poverty and unemployment in an oil based economy. (Report No. 14382-TR). Washington, DC: Author.

  • World Bank. (1995b). Priorities and strategies for education: A World Bank review. Washington, DC: Author.

  • Wößmann, L. (2003). Schooling resources, educational institutions and student performance: the international evidence. Oxford Bulletin of Economics and Statistics, 65(2), 117–170.

  • Yang, Y. (2003). Measuring socioeconomic status and its effects at individual and collective levels: a cross-country comparison. Göteborg: Educational Sciences 193, University of Gothenburg.

  • Yin, R. K. (2013). Case study research: design and methods. Thousand Oaks: Sage.

  • Zhang, Y. (2006). Urban‐rural literacy gaps in Sub‐Saharan Africa: the roles of socioeconomic status and school quality. Comparative Education Review, 50(4), 581–602.

  • Zvoch, K., & Stevens, J. J. (2008). Measuring and evaluating school performance: an investigation of status and growth-based achievement indicators. Evaluation Review, 32(6), 569–595.

Author information

Correspondence to Jerome De Lisle.

Cite this article

De Lisle, J. Evolving data use policy in Trinidad and Tobago: the search for actionable knowledge on educational improvement in a small island developing state. Educ Asse Eval Acc 28, 35–60 (2016). https://doi.org/10.1007/s11092-015-9232-7
