Context Is Everything: An International Perspective of, and Its Challenges to, Research and the Evaluation of Educational Technology


Part of the book series: Globalisation, Comparative Education and Policy Research (GCEP, volume 4)



Copyright information

© 2009 Springer Science + Business Media B.V.

About this chapter

Cite this chapter

Mandinach, E.B. (2009). Context Is Everything: An International Perspective of, and Its Challenges to, Research and the Evaluation of Educational Technology. In: Gibbs, D., Zajda, J. (eds) Comparative Information Technology. Globalisation, Comparative Education and Policy Research, vol 4. Springer, Dordrecht. https://doi.org/10.1007/978-1-4020-9426-2_10
