
Impact of Secondary Students’ Content Knowledge on Their Communication Skills in Science

  • Christoph Kulgemeyer

Abstract

The expert blind spot (EBS) hypothesis implies that even experts with high content knowledge may have problems communicating science because they orient their explanations to the structure of the content rather than to their addressees' prerequisites. But is that also true for students? Explaining science to peers is a crucial part of cooperative learning methods such as the "jigsaw method". Our study examined the relationship between science communication competence (SCC) and content knowledge (CK) among 10th-grade students (N = 213). Using latent class analysis, we identified two types of students with different relationships between CK and SCC. Using path analysis, we found that the first type (109 students) primarily used their science CK as the "resource" for addressee-oriented science communication, and their SCC and CK were positively correlated. For the second type (104 students), who drew on other resources, CK even had a small negative effect on SCC. Using t tests, we found that the students who primarily used their CK as the resource for communication performed significantly worse on the communication test than the students who used other resources. Drawing on the EBS hypothesis, we suggest that students' CK can have ambiguous effects on communication if the content structure, rather than the addressee's prior knowledge, serves as the primary orientation. We therefore suggest that effective use of cooperative learning techniques in the classroom requires prior training of students' science communication skills.
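The abstract names the analytic steps (latent class analysis, path analysis, t tests) without showing them. The following Python sketch only illustrates the shape of such a pipeline on simulated data; it is not the study's analysis. The two-component Gaussian mixture standing in for the latent class analysis, the regression slope standing in for the path coefficient, and all simulated sample sizes and effect directions are assumptions for illustration.

```python
# Hypothetical sketch of the analysis pipeline described in the abstract,
# run on simulated data (not the study's sample or instruments).
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulate two latent types: type 1 (CK drives SCC) and type 2 (other resources).
n1, n2 = 109, 104                                    # class sizes as reported
ck1 = rng.normal(0.0, 1.0, n1)
scc1 = 0.6 * ck1 + rng.normal(0.0, 0.8, n1)          # positive CK -> SCC path
ck2 = rng.normal(0.0, 1.0, n2)
scc2 = -0.15 * ck2 + rng.normal(0.5, 0.8, n2)        # small negative CK -> SCC path
ck = np.concatenate([ck1, ck2])
scc = np.concatenate([scc1, scc2])
X = np.column_stack([ck, scc])

# Two-class mixture model as a stand-in for the latent class analysis;
# the study's actual latent class software and model are not shown here.
lca = GaussianMixture(n_components=2, random_state=0).fit(X)
labels = lca.predict(X)

for k in (0, 1):
    ck_k, scc_k = ck[labels == k], scc[labels == k]
    r, p = stats.pearsonr(ck_k, scc_k)               # CK-SCC correlation per class
    beta = np.polyfit(ck_k, scc_k, 1)[0]             # slope as a crude path weight
    print(f"class {k}: n={len(ck_k)}, r={r:.2f} (p={p:.3f}), beta={beta:.2f}")

# t test: do the two recovered types differ in communication performance (SCC)?
t, p = stats.ttest_ind(scc[labels == 0], scc[labels == 1], equal_var=False)
print(f"t test on SCC between classes: t={t:.2f}, p={p:.3f}")
```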

Keywords

Science communication · Content knowledge · Physics education · Cooperative learning · Expert blind spot hypothesis · Explaining


Copyright information

© Ministry of Science and Technology, Taiwan 2016

Authors and Affiliations

  1. Institute of Science Education, Physics Education Department, University of Bremen, Bremen, Germany
