
The relationship between vocabulary and writing quality in three genres

Abstract

The purpose of this study was to examine the role of vocabulary in writing across three genres. Fifth graders (N = 105) wrote three compositions: story, persuasive, and informative. Each composition revolved around the topic of outer space to control for background knowledge. Written compositions were scored for holistic writing quality and several vocabulary constructs: diversity, maturity, elaboration, academic words, content words, and register. The results indicated that students vary their vocabulary usage by genre. Story text showed higher diversity than informative text and higher maturity than persuasive text. Persuasive text contained higher diversity than informative text and higher register than either of the other genres. Informative text included more content words and elaboration than the other text types, as well as more maturity than persuasive text. Additionally, multiple regression and commonality analysis indicated that the vocabulary constructs related to writing quality differed by genre. For story text, vocabulary diversity was a unique predictor, while for persuasive text, content words and register were unique predictors. Finally, for informative text, content words were the strongest unique predictor, explaining almost all of the total variance in the five-factor model, although maturity was also a unique predictor.
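The commonality analysis the abstract refers to partitions a regression model's explained variance into the unique contribution of each predictor (the drop in R² when that predictor is removed) and variance shared among predictors. The following is a minimal sketch of that idea on synthetic data; the predictor names are illustrative placeholders, not the authors' dataset or variables.

```python
# Sketch of commonality analysis: partition a regression's R^2 into the
# unique contribution of each predictor. Synthetic data; predictor names
# ("diversity", "content_words", "register") are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 105  # matches the study's sample size
X = rng.normal(size=(n, 3))
y = X @ np.array([0.5, 0.3, 0.1]) + rng.normal(scale=1.0, size=n)

def r_squared(cols):
    """R^2 of an OLS regression of y on the given predictor columns."""
    cols = list(cols)
    if not cols:
        return 0.0
    A = np.column_stack([np.ones(n), X[:, cols]])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid.var() / y.var()

names = ["diversity", "content_words", "register"]
full = r_squared(range(3))
for i, name in enumerate(names):
    others = [j for j in range(3) if j != i]
    # Unique variance = R^2 of the full model minus R^2 without predictor i.
    unique = full - r_squared(others)
    print(f"{name}: unique R^2 = {unique:.3f}")
print(f"total R^2 = {full:.3f}")
```

Because adding a predictor never lowers R² in ordinary least squares, each unique contribution is non-negative; the remainder of the total R², after subtracting all unique contributions, is the variance shared among predictors.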



Acknowledgments

This study was funded in part by the University of Connecticut’s Large Faculty Grant program.

Author information

Correspondence to Natalie G. Olinghouse.


Cite this article

Olinghouse, N.G., Wilson, J. The relationship between vocabulary and writing quality in three genres. Read Writ 26, 45–65 (2013). https://doi.org/10.1007/s11145-012-9392-5

Keywords

  • Writing
  • Vocabulary
  • Genre