Abstract
This research centers on the psychometric examination of the structure of an instrument, the 5E Inquiry Lesson Plan (5E ILPv2) rubric, for inquiry-based teaching. The instrument is intended to measure an individual’s skill in developing written 5E lesson plans for inquiry teaching. In stage one of the instrument’s development, an exploratory factor analysis of the fifteen-item 5E ILP instrument revealed only three factors instead of the expected five, which led to its revision. Modifications to the original instrument produced the revised 5E ILPv2, composed of twenty-one items. Like its precursor, the instrument uses a scoring scale that ranges from zero to four points per item. Content validity of the 5E ILPv2 was established by a panel of science educators. Over the course of five semesters, three elementary science methods instructors at three different universities collected post-instruction lesson plan data from 224 pre-service teachers enrolled in their courses. Each instructor scored their students’ post 5E inquiry lesson plans with the 5E ILPv2, recording a score for each item on the instrument. A factor analysis with maximum likelihood extraction and promax oblique rotation provided evidence of construct validity for five factors and explained 85.5% of the variance in the total instrument. All items loaded on their theoretical factors, with high ordinal alpha reliability estimates of .94, .99, .96, .97, and .95 for the engage, explore, explain, elaborate, and evaluate subscales, respectively. The total instrument reliability estimate was .98, indicating strong evidence of total scale reliability.
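To show the shape of such an analysis, the following is a minimal sketch in Python of an exploratory factor analysis with maximum likelihood extraction and promax rotation, followed by subscale reliability estimates. It is not the authors' analysis code: the file name, column names, and the use of a Pearson rather than polychoric correlation matrix for the alpha estimates are illustrative assumptions.

```python
# Sketch of the reported analysis, assuming the 224 x 21 item scores are in a
# hypothetical CSV "ilpv2_scores.csv" with columns such as "engage_1" ... "evaluate_4".
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

items = pd.read_csv("ilpv2_scores.csv")  # 21 items, each scored 0-4

# Exploratory factor analysis: maximum likelihood extraction, promax (oblique) rotation.
fa = FactorAnalyzer(n_factors=5, method="ml", rotation="promax")
fa.fit(items)

loadings = pd.DataFrame(fa.loadings_, index=items.columns)
print(loadings.round(2))                         # pattern matrix: items vs. five factors
_, _, cumulative = fa.get_factor_variance()
print(f"cumulative variance explained: {cumulative[-1]:.3f}")

def alpha_from_corr(r: np.ndarray) -> float:
    """Standardized alpha from an inter-item correlation matrix.
    Applied to a polychoric correlation matrix this formula yields ordinal alpha;
    a Pearson matrix is used here only for brevity."""
    k = r.shape[0]
    r_bar = (r.sum() - k) / (k * (k - 1))        # mean off-diagonal correlation
    return k * r_bar / (1 + (k - 1) * r_bar)

# Subscale groupings are assumptions about the hypothetical column names.
subscales = {phase: [c for c in items.columns if c.startswith(phase)]
             for phase in ("engage", "explore", "explain", "elaborate", "evaluate")}
for name, cols in subscales.items():
    print(name, round(alpha_from_corr(items[cols].corr().to_numpy()), 2))
print("total", round(alpha_from_corr(items.corr().to_numpy()), 2))
```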
Appendix
5E Inquiry Lesson Plan Version 2 Rubric (5E ILPv2)
Science Learning Cycle Lesson Plan Rubric v1
Each item below is scored from 0 (unacceptable) to 4 (excellent) using the Scoring Criteria at the end of this rubric.
Concepts and/or skills selected for the lesson align with the National Science Education Standards and relevant state/local standards.
The lesson plan contains objectives that are clear, appropriate, measurable, and aligned with the assessment/evaluation.
A materials list is present and complete.
Exploration—Phase 1 (Engage and Explore)
Engage item 1: The engage elicits students’ prior knowledge (based upon the objectives).
Engage item 2: The engage raises student interest/motivation to learn.
Engage item 3: The engage provides opportunities for student discussion/questions (or invites student questions).
Engage item 4: The engage leads into the exploration.
Explore item 1: During the explore phase, the teacher presents instructions.
Explore item 2: Learning activities in the exploration phase involve hands-on/minds-on activities.
Explore item 3: Learning activities in the exploration phase are student-centered. When appropriate, teacher questions evoke the learners’ ideas and/or generate new questions from students; student inquiry may involve student questioning, manipulating objects, developing inquiry skills (as appropriate), and developing abstract ideas. *See below for a list of typical inquiry skills.
Explore item 4: The inquiry activities of the explore show evidence of student learning (formative/authentic assessment). *See below for a list of formative assessment methods.
Invention—Phase 2 (Explain)
Explain item 1: There is a logical transition from the explore phase to the explain phase.
Explain item 2: The explain includes teacher questions that lead to the development of concepts and skills (drawing upon the explore activities and/or data collected during the explore).
Explain item 3: The explain includes mixed divergent and convergent questions for an interactive discussion, facilitated by the teacher and/or students, to develop concepts or skills.
Explain item 4: The explain includes a complete explanation of the concept(s) and/or skill(s) taught.
Explain item 5: The explain phase provides a variety of approaches to explain and illustrate the concept or skill (for example, the use of technology, virtual field trips, demonstrations, cooperative group discussions, panel discussions, interviews of guest speakers, video/print/audio/computer program materials, or teacher explanations).
Explain item 6: The discussions or activities during the explain phase allow the teacher to assess students’ present understanding of the concept(s) or skill(s).
Expansion—Phase 3 (Elaborate and Evaluate)
Elaborate item 1: There is a logical transition from the explain phase to the elaborate phase.
Elaborate item 2: The elaborate activities provide students with the opportunity to apply the newly acquired concepts and skills to new areas.
Elaborate item 3: The elaborate activities encourage students to find real-life (everyday) connections with the newly acquired concepts or skills.
Evaluation item 1: The lesson includes a summative evaluation, which can take a variety of forms/approaches. *See below for methods of evaluation.
Evaluation item 2: The evaluation matches the objectives.
Evaluation item 3: The evaluation criteria are clear and appropriate.
Evaluation item 4: The evaluation criteria are measurable (e.g., rubrics).
____ Points
Additional lesson plan components:
Relevant safety issues are addressed, appropriate safety equipment is delineated, and the selection of materials is age appropriate.
The time specified for each lesson plan phase (exploration, invention, expansion) is appropriate.
Accommodations for students with special needs are addressed, a variety of cognitive levels is addressed throughout the lesson, and the lesson is appropriate for all students.
The lesson plan includes a bibliography; cited works include web sites, textbooks, children’s literature, and relevant articles. Using only children’s literature is not acceptable; multiple sources must be used for content verification.
Scoring Criteria
4 (Excellent): All elements of the item are present, complete, appropriate, and accurate, with rich details. Another teacher can use the plan (or phase) as written.
3 (Good): Most elements of the item are present, complete, appropriate, and accurate, with rich details. Another teacher could use the plan (or phase) with a few modifications.
2 (Average): Approximately half of the elements of the item are present, complete, appropriate, and accurate, with some details. Another teacher could use the plan (or phase) with modifications.
1 (Poor): Few elements of the item are present, complete, appropriate, and accurate, with few details. Another teacher would have to rewrite the lesson (or phase) in order to implement it.
0 (Unacceptable): Key elements of the item are not present, descriptions are inappropriate, and the plan lacks coherence and is unusable as written.
*Typical inquiry skills: predicting, hypothesizing, observing, measuring, testing, recording, graphing, creating tables, and drawing conclusions.
*Typical formative assessment methods: science journals, science notebooks, photo narratives, KWL charts, concept maps, writing assignments, artwork, drawings/charts, graphs, quizzes, tests, PowerPoint presentations, iMovie, movies, and cartoons. Note that the evaluation comes from the culmination of the formative assessments used during the lesson.
*Examples of appropriate experiences include the use of technology, Internet field trips, field trips, hands-on/minds-on learning activities, cooperative group discussions, panel discussions, interviews of guest speakers, video/print/audio/computer program materials, teacher explanations, WebQuest, TrackStar, iMovie, and PowerPoint.
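As a small illustration of how the rubric’s item scores roll up, the following is a minimal sketch, in Python, of tallying the 21 phase-item scores (each 0–4) into subscale and total scores. The item naming scheme and the helper function are hypothetical and not part of the published instrument.

```python
# Hypothetical roll-up of 5E ILPv2 item scores into subscale and total scores.
from typing import Dict

# Number of items per 5E subscale in the 5E ILPv2 (each item scored 0-4).
ITEMS_PER_SUBSCALE = {"engage": 4, "explore": 4, "explain": 6, "elaborate": 3, "evaluate": 4}

def score_lesson_plan(item_scores: Dict[str, int]) -> Dict[str, int]:
    """Sum item scores (keys like 'engage_1', 'explain_5') into subscale totals
    and an overall total. Raises if a score falls outside the 0-4 rubric range."""
    totals = {phase: 0 for phase in ITEMS_PER_SUBSCALE}
    for item, score in item_scores.items():
        if not 0 <= score <= 4:
            raise ValueError(f"{item}: score {score} is outside the 0-4 rubric scale")
        phase = item.split("_")[0]
        totals[phase] += score
    totals["total"] = sum(totals[p] for p in ITEMS_PER_SUBSCALE)
    return totals

# Example: a plan scoring 4 on every engage item and 3 on everything else.
example = {f"{p}_{i + 1}": (4 if p == "engage" else 3)
           for p, n in ITEMS_PER_SUBSCALE.items() for i in range(n)}
print(score_lesson_plan(example))  # engage 16, explore 12, explain 18, elaborate 9, evaluate 12, total 67
```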
Cite this article
Goldston, M.J., Dantzler, J., Day, J. et al. A Psychometric Approach to the Development of a 5E Lesson Plan Scoring Instrument for Inquiry-Based Teaching. J Sci Teacher Educ 24, 527–551 (2013). https://doi.org/10.1007/s10972-012-9327-7