Emphasizing Transdisciplinary Prowess in the Evaluation of STEAM Programs

  • Kimberle Kelly
  • Erin Burr
Chapter
Part of the Environmental Discourses in Science Education book series (EDSE, volume 5)

Abstract

STEM to STEAM transitions can affect not only the design and implementation of programs across the education spectrum, but also the design and execution of program evaluations. Without evidence of effectiveness derived from a program evaluation, it becomes difficult to publish findings about a STEAM program, to obtain funding for it, or to convince audiences to participate in or support it. A logic model that describes a program's well-understood goals and objectives articulates what the program is expected to accomplish, by what criteria success can be measured, and for whom. The logic model thus becomes the foundation for the design of a high-quality program evaluation. Because STEAM programs are more likely than STEM programs to bridge disciplines by virtue of including the arts, they require the development, and therefore the evaluation, of transdisciplinary knowledge and skills across diverse stakeholder groups. One framework that can be used to describe what we define as transdisciplinary skills in STEAM programs is the 4 Cs: communication, collaboration, critical thinking, and creativity. We round out the discussion of each "C" with important evaluation practices and evaluator competencies to consider. Successful evaluators must possess a certain degree of transdisciplinary prowess themselves to stay current and to optimize the value and use of evaluation evidence in STEAM programs.
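For readers who find it helpful to see the structure concretely, the sketch below renders a logic model as a simple data structure, with the 4 Cs mapped to example outcome measures. This is a minimal illustration only, assuming the conventional inputs, activities, outputs, and outcomes components of logic model development guides; the specific program elements, measures, and names are hypothetical and are not drawn from the chapter.

```python
# A minimal, illustrative sketch of a STEAM program logic model as a data
# structure. Component names follow the conventional inputs -> activities ->
# outputs -> outcomes chain; all program-specific entries are hypothetical.
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)      # resources invested
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # direct, countable products
    outcomes: dict[str, str] = field(default_factory=dict)  # success criteria -> measures


# Hypothetical STEAM program, with the 4 Cs as outcome criteria.
steam_model = LogicModel(
    inputs=["teaching artists", "STEM faculty", "studio space"],
    activities=["co-taught design studios", "student exhibitions"],
    outputs=["40 students complete a capstone project per year"],
    outcomes={
        "communication": "rubric-scored exhibition talks",
        "collaboration": "peer-rated teamwork on capstone projects",
        "critical thinking": "pre/post scores on an open-ended design task",
        "creativity": "expert panel ratings of project originality",
    },
)

# An evaluator can read evaluation questions straight off the model:
# for each outcome criterion, what measure would demonstrate success?
for criterion, measure in steam_model.outcomes.items():
    print(f"{criterion}: measured via {measure}")
```

The point of the sketch is that once goals and measures are paired explicitly, the evaluation design follows from the model rather than being improvised after the fact.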


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Oak Ridge Associated Universities, Oak Ridge, USA
