Reading and Writing, Volume 31, Issue 3, pp 533–557

Exploring early adolescents’ evaluation of academic and commercial online resources related to health

  • Carita Kiili
  • Donald J. Leu
  • Miika Marttunen
  • Jarkko Hautala
  • Paavo H. T. Leppänen

Abstract

This study assessed the ability of 426 students (ages 12–13) to critically evaluate two types of online resources on health issues: an academic resource and a commercial resource. The results indicated limited evaluation abilities, especially for the commercial resource, and only a small, partial association with prior stance and offline reading ability. Only about half (51.4%) of the students questioned the credibility of the commercial online resource, and only about 19% showed an ability to fully recognize commercial bias. Wide variation existed in students’ ability to evaluate online information: approximately one-fourth of the students performed poorly when evaluating the overall credibility of both online resources, and one-fourth performed well. Logistic regression models showed that offline reading skills accounted for only 8.8% of the variance for the academic online resource and 15.1% for the commercial resource. No association appeared between evaluation and background knowledge, although an association with prior stance was observed for each online resource. The results are discussed in light of previous research and the need to pay greater attention to the critical evaluation of online resources during classroom instruction.
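The abstract reports that logistic regression models linked offline reading skill to evaluation performance, accounting for 8.8% and 15.1% of the variance. As a rough illustration only (not the authors' analysis script), the sketch below fits a logistic regression to simulated data and computes a Nagelkerke pseudo-R², one common "variance accounted for" analogue for binary outcomes. The variable names, the simulated data, and the choice of Nagelkerke R² are assumptions; the article page does not specify which pseudo-R² measure was used.

```python
# Illustrative sketch only: logistic regression of a binary evaluation
# outcome on offline reading skill, with a Nagelkerke pseudo-R^2.
# All data here are simulated; variable names are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 426                               # sample size reported in the abstract
offline_reading = rng.normal(size=n)  # hypothetical standardized predictor
# Hypothetical binary outcome: 1 = student questioned the resource's credibility
p = 1 / (1 + np.exp(-(0.5 * offline_reading - 0.1)))
questioned = rng.binomial(1, p)

X = sm.add_constant(offline_reading)
model = sm.Logit(questioned, X).fit(disp=0)
null_model = sm.Logit(questioned, np.ones((n, 1))).fit(disp=0)

# Cox-Snell and Nagelkerke pseudo-R^2 from the model log-likelihoods
cox_snell = 1 - np.exp((2 / n) * (null_model.llf - model.llf))
nagelkerke = cox_snell / (1 - np.exp((2 / n) * null_model.llf))

print(f"Coefficient for offline reading: {model.params[1]:.3f}")
print(f"Nagelkerke R^2: {nagelkerke:.3f}")
```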

Keywords

Evaluation · Online reading · Digital literacy · Adolescents · Critical reading

Notes

Acknowledgements

This work was supported by the Academy of Finland (No. 274022). We are also grateful to Sini Hjelm, Sonja Tiri and Paula Rahkonen for their valuable work with the data collection and data management.


Copyright information

© Springer Science+Business Media B.V. 2017

Authors and Affiliations

  1. Department of Education, University of Oslo, Oslo, Norway
  2. Department of Education, University of Jyvaskyla, Jyväskylä, Finland
  3. Neag School of Education, University of Connecticut, Storrs, USA
  4. Department of Psychology, University of Jyvaskyla, Jyväskylä, Finland
