Journal of Science Education and Technology, Volume 24, Issue 6, pp 818–834

Visual Representations on High School Biology, Chemistry, Earth Science, and Physics Assessments

  • Nicole D. LaDue
  • Julie C. Libarkin
  • Stephen R. Thomas

Abstract

The pervasive use of visual representations in textbooks, curricula, and assessments underscores their importance in K-12 science education. For example, visual representations figure prominently in the recently published Next Generation Science Standards (NGSS Lead States 2013). Although assessments of the NGSS have yet to be developed, most students are currently evaluated on their ability to interpret science visuals. While numerous studies exist on particular visuals, it is unclear whether the same types of visuals are emphasized in all science disciplines. The present study evaluates the similarities and differences among visuals used to assess students’ knowledge of chemistry, earth science, living environment (biology), and physics on the New York State Regents examinations. Analysis of 266 distinct visual representations categorized across the four content examinations reveals that the frequency and type of visuals vary greatly between disciplines. Diagrams, Graphs, Tables, and Maps are the most prevalent across all science disciplines. Maps, Cartograms, and Time Charts are unique to the earth science examination, and Network Diagrams are unique to the living environment (biology) examination. This study identifies which representations are most critical for training students across the science disciplines in anticipation of the implementation and eventual assessment of the NGSS.

Keywords

Assessment · K-12 Science · Visual representations · Diagrams · Graphs

Acknowledgments

Thank you to Sheldon P. Turner, Emily Geraghty Ward, and Scott K. Clark for reviewing early versions of this manuscript.

References

  1. Ainsworth S (2006) DeFT: a conceptual framework for considering learning with multiple representations. Learn Instr 16(3):183–198
  2. Bishop JH (1998) The effect of curriculum-based external exit exam systems on student achievement. J Econ Educ 29(2):171–182
  3. Bruton S, Ong F, Geeting G (2000) Science content standards for California public schools: kindergarten through grade twelve. California Department of Education, Sacramento, CA
  4. Cheek KA (2013) Exploring the relationship between students’ understanding of conventional time and deep (geologic) time. Int J Sci Educ 35(11):1925–1945
  5. Clark SC, Libarkin JC, Kortz KM, Jordan SC (2011) Alternative conceptions of plate tectonics held by nonscience undergraduates. J Geosci Educ 59(4):251–262. doi:10.5408/1.3651696
  6. Contino J (2012) A case study of the alignment between curriculum and assessment in the New York State Earth science standards-based system. J Sci Educ Technol 22(1):62–72
  7. Cromley JG, Snyder-Hogan LE, Luciw-Dubas UA (2010) Reading comprehension of scientific text: a domain-specific test of the direct and inferential mediation model of reading comprehension. J Educ Psychol 102(3):687
  8. Dimopoulos K, Koulaidis V, Sklaveniti S (2003) Towards an analysis of visual images in school science textbooks and press articles about science and technology. Res Sci Educ 33:189–216
  9. Dodick J (2012) Supporting students’ cognitive understanding of geological time: a needed “revolution” in science education. Geol Soc Am Spec Papers 486:31–33
  10. Dodick J, Orion N (2003) Cognitive factors affecting student understanding of geologic time. J Res Sci Teach 40(4):415–442
  11. Goodsell DS, Johnson GT (2007) Filling in the gaps: artistic license in education and outreach. PLoS Biol 5(12):e308. doi:10.1371/journal.pbio.0050308
  12. Hegarty M (2014) Spatial thinking in undergraduate science education. Spat Cogn Comput 14(2):142–167
  13. Hegarty M, Canham MS, Fabrikant SI (2010) Thinking about the weather: how display salience and knowledge affect performance in a graphic inference task. J Exp Psychol Learn Mem Cogn 36(1):37
  14. Jee BD, Gentner D, Uttal DH, Sageman B, Forbus K, Manduca CA, Ormond CA, Shipley TF, Tikoff B (2014) Drawing on experience: how domain knowledge is reflected in sketches of scientific structures and processes. Res Sci Educ 1–25
  15. Kastens KA, Pistolesi L, Passow MJ (2014) Analysis of spatial concepts, spatial skills, and spatial representations of New York State Regents earth science examinations. J Geosci Educ 62:278–289
  16. Kozhevnikov M, Motes MA, Hegarty M (2007) Spatial visualization in physics problem solving. Cogn Sci 31(4):549–579
  17. Kozma RB, Russell J (1997) Multimedia and understanding: expert and novice responses to different representations of chemical phenomena. J Res Sci Teach 34(9):949–968
  18. Liben LS, Kastens KA, Christensen AE (2011) Spatial foundations of science education: the illustrative case of instruction on introductory geological concepts. Cogn Instr 29(1):45–87
  19. Liu X, Fulmer G (2008) Alignment between the science curriculum and assessment in selected NY state regents exams. J Sci Educ Technol 17(4):373–383
  20. Lohse GL, Biolsi K, Walker N, Rueler H (1994) A classification of visual representations. Commun ACM 37(12):36–49
  21. McGraw KO, Wong SP (1996) Forming inferences about some intraclass correlation coefficients. Psychol Methods 1(1):30
  22. Meisel RP (2010) Teaching tree-thinking to undergraduate biology students. Evol Educ Outreach 3(4):621–628
  23. National Research Council [NRC] (1996) National science education standards. National Academies Press, Washington, DC
  24. National Research Council [NRC] (2012a) A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC
  25. National Research Council [NRC] (2012b) Discipline-based education research: understanding and improving learning in undergraduate science and engineering. The National Academies Press, Washington, DC
  26. National Research Council [NRC] (2014) Developing assessments for the next generation science standards. The National Academies Press, Washington, DC
  27. New York State Education Department [NYSED] (1987) History of regents examinations from 1865 to 1987. Retrieved from: http://www.p12.nysed.gov/assessment/hsgen/archive/rehistory.htm
  28. New York State Education Department [NYSED] (1996) Learning standards for mathematics, science, and technology. Retrieved from: http://www.p12.nysed.gov/ciai/mst/sci/ls.html
  29. New York State Education Department [NYSED] (2001) Physical setting/earth science core curriculum. Retrieved from: www.nysed.gov/ciai/mst/pub/earthsci.pdf
  30. New York State Education Department [NYSED] (2010) Reference tables for physical setting/earth science. Retrieved from: http://www.p12.nysed.gov/assessment/resources/home.html#es-trans-11
  31. New York State Education Department [NYSED] (2012a) June 2012 chemistry regents exam. Retrieved from: http://www.nysedregents.org/Chemistry/
  32. New York State Education Department [NYSED] (2012b) June 2012 earth science regents exam. Retrieved from: http://www.nysedregents.org/EarthScience/
  33. New York State Education Department [NYSED] (2012c) June 2012 living environment regents exam. Retrieved from: http://www.nysedregents.org/LivingEnvironment/
  34. New York State Education Department [NYSED] (2012d) June 2012 physics regents exam. Retrieved from: http://www.nysedregents.org/Physics/
  35. New York State Education Department [NYSED] (2012e) August 2012 earth science regents exam. Retrieved from: http://www.nysedregents.org/EarthScience/
  36. New York State Education Department [NYSED] (2013) New York State education department test development process. Last accessed 18 July 2014 from: http://www.p12.nysed.gov/assessment/teacher/home.html#process
  37. NGSS Lead States (2013) Next generation science standards: for states, by states. Achieve, Inc. on behalf of the twenty-six states and partners that collaborated on the NGSS
  38. Novick LR, Stull AT, Catley KM (2012) Reading phylogenetic trees: the effects of tree orientation and text processing on comprehension. Bioscience 62(8):757–764
  39. Petcovic HL, Stokes A, Caulkins JL (2014) Geoscientists’ perceptions of the value of undergraduate field education. GSA Today 24:7
  40. Schönborn KJ, Anderson TR (2009) A model of factors determining students’ ability to interpret external representations in biochemistry. Int J Sci Educ 31(2):193–232
  41. Shah P, Hoeffner J (2002) Review of graph comprehension research: implications for instruction. Educ Psychol Rev 14(1):47–69
  42. Shrout PE, Fleiss JL (1979) Intraclass correlations: uses in assessing rater reliability. Psychol Bull 86(2):420
  43. Southern Regional Education Board (2007) The changing roles of statewide high school exams. Last accessed 23 July 2014 from: http://publications.sreb.org/2007/07E03_Statewide_Exams.pdf
  44. Stieff M (2007) Mental rotation and diagrammatic reasoning in science. Learn Instr 17(2):219–234
  45. Stieff M, Ryu M, Dixon B, Hegarty M (2012) The role of spatial ability and strategy preference for spatial problem solving in organic chemistry. J Chem Educ 89(7):854–859
  46. Texas Education Agency (2010a) Texas essential knowledge and skills (TEKS), Chapter 112, Subchapter B. Middle School. Last accessed 23 July 2014 from: http://ritter.tea.state.tx.us/rules/tac/chapter112/index.html
  47. Texas Education Agency (2010b) Texas essential knowledge and skills (TEKS), Chapter 112, Subchapter C. High School. Last accessed 23 July 2014 from: http://ritter.tea.state.tx.us/rules/tac/chapter112/index.html
  48. ACT (2013) Content covered by the ACT test. Retrieved from: http://www.actstudent.org/testprep/descriptions/scicontent.html
  49. Trend R (2009) The power of deep time in geoscience education: linking ‘interest’, ‘threshold concepts’, and ‘self-determination theory’. Studia UBB Geol 54(1):7–12
  50. United States Government Accountability Office (2013) Race to the top: states implementing teacher and principal evaluation systems despite challenges. GAO Highlights, September 2013. Retrieved from: http://www.gao.gov/assets/660/657937.pdf
  51. Virginia Department of Education (2013) Standards of learning (SOL) and testing. Last accessed 23 July 2014 from: http://www.doe.virginia.gov/testing/
  52. Wai J, Lubinski D, Benbow CP (2009) Spatial ability for STEM domains: aligning over 50 years of cumulative psychological knowledge solidifies its importance. J Educ Psychol 101(4):817
  53. Wu HK, Krajcik JS, Soloway E (2001) Promoting understanding of chemical representations: students’ use of a visualization tool in the classroom. J Res Sci Teach 38(7):821–842

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Nicole D. LaDue (1)
  • Julie C. Libarkin (2)
  • Stephen R. Thomas (3)

  1. Department of Geology and Environmental Geosciences, Northern Illinois University, DeKalb, USA
  2. Geocognition Research Lab, Michigan State University, East Lansing, USA
  3. Department of Zoology, Michigan State University, East Lansing, USA