The Role of Concept Inventories in Course Assessment

  • Julie Libarkin (corresponding author)
  • Sarah E. Jardeleza
  • Teresa L. McElhinny
Chapter
Part of the Innovations in Science Education and Technology book series (ISET, volume 20)

Abstract

Development of effective instructional materials, particularly those intended to address strongly held alternative conceptions about the natural world, is difficult. Research suggests that the most effective instruction stems from initial consideration of instructional goals and careful alignment of practice with those goals. Determining whether instruction is effective in turn requires assessment instruments written in direct correspondence to pre-articulated goals. Concept inventories (CIs), multiple-choice tests targeting specific content, are becoming increasingly popular mechanisms for assessing student learning, particularly in the USA. CIs have become popular because they target student alternative conceptions authentically and are relatively easy to implement, even in very large lecture courses. The wide array of CIs available both in the USA and internationally reflects the importance that faculty place on addressing student conceptions. CIs can be used both as instructional tools and as research instruments; when used for research, scholars must be careful to evaluate the validity and reliability of the CI being used. In this chapter, we provide evidence of the value of CIs for both course and programmatic assessment. In addition, we illustrate the importance of community discourse in ensuring that CIs are appropriate for research.
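
The reliability evaluation mentioned above is commonly quantified, for dichotomously scored multiple-choice items, with an internal-consistency statistic such as Kuder-Richardson 20 (KR-20). The chapter does not prescribe a particular statistic; the sketch below is only an illustrative Python calculation, using a hypothetical kr20 function and a made-up student-by-item score matrix, of how such a check might be run on CI response data.

    import numpy as np

    def kr20(scores: np.ndarray) -> float:
        """KR-20 internal-consistency reliability for a students-by-items
        matrix of dichotomous (0/1) concept inventory scores."""
        k = scores.shape[1]                         # number of items
        p = scores.mean(axis=0)                     # proportion correct per item
        q = 1.0 - p                                 # proportion incorrect per item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
        return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

    # Hypothetical responses: six students answering a five-item inventory
    responses = np.array([
        [1, 1, 0, 1, 1],
        [1, 0, 0, 1, 0],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
        [1, 0, 0, 1, 1],
        [0, 1, 0, 0, 0],
    ])
    print(f"KR-20 = {kr20(responses):.2f}")  # values near 1 indicate consistent measurement

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency, although reliability alone does not establish that a CI measures the intended conceptions (validity).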

Keywords

Conceptual Understanding, Programmatic Assessment, Summative Assessment, Concept Inventory, Science Education Community

Notes

Acknowledgments

 Development of the GCI and initial efforts in building online resources were made possible by the US National Science Foundation (NSF) through grants DUE-0127765, DUE-0350395, DGE-9906479, DUE-0717790, and DUE-0717589. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF. This work was partially funded by the Center for Integrative Studies in General Science (CISGS) in the College of Natural Science at Michigan State University.

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Julie Libarkin (1, 2), corresponding author
  • Sarah E. Jardeleza (3, 2)
  • Teresa L. McElhinny (3, 4)

  1. Geocognition Research Laboratory, Department of Geological Sciences, Michigan State University, East Lansing, USA
  2. Center for Integrative Studies in General Science, Michigan State University, East Lansing, USA
  3. Department of Geological Sciences, Michigan State University, East Lansing, USA
  4. Department of Zoology, Michigan State University, East Lansing, USA
