Reading and Writing, Volume 31, Issue 2, pp 267–291

Secondary science teachers’ implementation of CCSS and NGSS literacy practices: a survey study


Abstract

Most middle and high school students struggle with reading and writing in science. This may be because science teachers are reluctant to teach literacy in science class. New standards now require a shift in the way science teachers develop students’ literacy in science. This survey study examined the extent to which science teachers report implementing science literacy practices from the Common Core Literacy in Science and Technical Subjects standards and the Next Generation Science Standards with their students. A survey detailing these practices was emailed to all secondary science teachers (N = 2519) in one northeastern state, and 14% of them (n = 343) responded. Practices aligned more closely with disciplinary literacy skills and strategies were implemented more often than practices aligned with intermediate literacy skills and strategies. Because intermediate skills are important for supporting students’ progression from foundational to disciplinary literacy, secondary science teachers may not be providing enough support for their students to become competently literate in science in the fundamental sense. This, in turn, limits students’ ability to use fundamental literacy skills for knowledge building in science, and thus to achieve science literacy in the derived sense.

Keywords

Science literacy · Disciplinary literacy · Standards · CCSS · NGSS · Survey


Copyright information

© Springer Science+Business Media B.V. 2017

Authors and Affiliations

  1. Department of Special Education and Interventions, Central Connecticut State University, New Britain, USA
  2. Department of Geological Sciences, Copernicus Hall, Central Connecticut State University, New Britain, USA
