Reading and Writing, Volume 32, Issue 6, pp 1345–1357

Argumentative writing: theory, assessment, and instruction

  • Ralph P. Ferretti
  • Steve Graham

Abstract

Despite the early emergence of oral argumentation, written argumentation is slow to develop, insensitive to alternative perspectives, and generally of poor quality. These findings are unsettling because high quality argumentative writing is expected throughout the curriculum and needed in an increasingly competitive workplace that requires advanced communication skills. In this introduction, we provide background about the theoretical perspectives that inform the papers included in this special issue and highlight their contributions to the extant literature about argumentative writing.

Keywords

Argumentative writing; Theory; Assessment; Instruction

The argumentative impulse originates with the anticipation of a real or imagined difference of opinion about a controversial issue (van Eemeren et al., 2014). Given people’s inherently self-interested tendencies, it is likely that the appearance of argumentation as a form of verbal communication was nearly coincident with the emergence of human speech itself. In any case, we know that the systematic study of argumentation, its purposes, and the discursive strategies used to argue have a long and venerable history in Western thought. In fact, many theoretical and metalinguistic concepts that we now use to understand the varieties of argumentation originate in antiquity (van Eemeren et al., 2014).

The fifth century BC is seminal in the development of argumentation and human rationality because we see for the first time the construction of a written meta-representational system designed to formalize principles of reasonable argumentation (Harris, 2009). Aristotle (1962) clearly had an inchoate understanding of the importance of meta-representation when he wrote, “Spoken words are the symbols of mental experience and written words are the symbols of spoken words”. The importance of this realization cannot be overstated because it suggests that “…any full writing system is capable of rendering in visual form anything that is spoken” (Olson, 2016, p. 22), and by implication, anything that can be mentally represented. Furthermore, and perhaps more important in this context, the creation of written meta-representational concepts and principles focusing on reasonable argumentation resulted in a sapient consciousness of the relevance, validity, and evidential basis for reasons (Olson, 2016). Simply put, writing enabled people to record, examine, and evaluate representations of reasoning as objects of reflection. The consequences of this discovery for the development of Western civilization are incalculable.

Naturalistic studies of argumentative discourse show that very young children engage in a variety of discursive tactics to influence other people (Bartsch, Wright, & Estes, 2009; Dunn, 1988; Dunn & Munn, 1985). Between 18 and 24 months, toddlers use sentences to argue with parents and siblings (Kuczynski & Kochanska, 1990; Perlman & Ross, 2005), and at 36 months, they are able to produce negative and positive reasons to justify a decision (Stein & Bernas, 1999). Despite this precocity, children and adults are prone to my-side bias (Kuhn, 1991; Perkins, Farady, & Bushey, 1991) and are predisposed to use self-interested standards to evaluate their arguments and those of other people (Ferretti & Fan, 2016). The insensitivity to alternative perspectives and neglect of evaluative standards are also seen in students’ written arguments (National Center for Educational Statistics, 2012). The NAEP report showed that only about 25% of students’ argumentative essays provided strong reasons and supporting examples, and even these essays often failed to consider alternative perspectives. Evidence for my-side bias is widespread in the empirical literature (see Ferretti & Fan, 2016). These findings have sparked research about argumentative writing and given impetus to the design of interventions to improve the quality of students’ written arguments (Ferretti & Lewis, 2019).

Concern about students’ preparedness for the modern workplace has also heightened interest in their argumentative writing (Ferretti & De La Paz, 2011). Opportunities for blue-collar jobs are diminishing, and work increasingly depends upon the use of sophisticated technology and the acquisition of specialized reading and writing skills (Biancarosa & Snow, 2006; Graham & Perin, 2007). Furthermore, reading, writing, and content area learning have become inextricably interconnected throughout the curriculum (Shanahan & Shanahan, 2008, 2012). As a result, students are expected to make and evaluate interpretative claims by using disciplinary strategies and evaluative standards when reading and writing (Ferretti & De La Paz, 2011). These expectations are reflected in the emphasis in the Common Core State Standards (Common Core State Standards Initiative, 2010) on argumentative writing across the curriculum.

Current theories of argumentative writing (Ferretti & Fan, 2016) recognize its intrinsically social and dialogical nature, and that it involves the presentation of a constellation of propositions intended to achieve the interlocutors’ discursive goals (van Eemeren, 2018). There are, however, theoretically-motivated differences of perspective about the foci of argumentative writing research, the methods used to study written arguments, and the instructional strategies that can be implemented to improve written argumentation. In what follows, we provide some background about these matters for the purpose of contextualizing the contributions to this special issue.

Sociocultural perspective

Sociocultural theorists investigate how social mediation shapes meaning-making in historical and cultural context (Bazerman, 2016; Beach, Newell, & VanDerHeide, 2016; Englert, Mariage, & Dunsmore, 2006; Newell, Beach, Smith, & VanDerHeide, 2011). In the sociocultural view, writing is a semiotic tool that supports communication and social relationships, is learned and practiced in social situations, and is used to accomplish inherently social goals (Bazerman, 2016; Graham, 2018; Newell, Bloome, Kim, & Goff, 2018). Given the focus on communication and social interaction, writing research in this tradition examines the situations within which people write and the influence of those situations on the writer’s participation in social activities. The writing context is never static (Bazerman, 2016). New texts become available, new communicative relationships develop, and new social practices emerge that influence human communication. Hence, the sociocultural tradition focuses on the interactions that take place among people over time and in different situations, and how writing creates shared meanings and representations that have consequences for the participants (Bazerman, 2016; Beach et al., 2016).

Sociocultural theorists also believe that writing development is influenced by participating in situations that afford opportunities to appropriate semiotic tools and social practices (Bazerman, 2016; Newell et al., 2011). Research in this tradition tends to use qualitative methods to reveal aspects of the context that affect and are affected by participation in social activity. Newell et al.’s (2018) study of a teacher’s shifting argumentative epistemologies during instructional interactions with her students illustrates how ethnographic methods can be used to capture the contextual and situational influences on her representation of argumentation, the development of her teaching practices, and the standards she used to evaluate her students’ argumentative writing.

In a similar vein, Monte-Sano and Allen (2018) used comparative case study methods to investigate the development of novice history teachers’ writing instruction after completing their pre-service teaching program. This study, which involved comparisons across multiple units of analysis, found that the types and sophistication of students’ written arguments depended on the kind of historical work they were assigned, the types of prompts to which they were asked to respond, and the degree to which their argumentative writing was supported by their teachers. Both studies relied on careful analysis of the contextual factors that influenced teachers’ instructional practices and students’ argumentative writing. Newell et al. (2018) and Monte-Sano and Allen (2018) also provide information about how the appropriation of disciplinary processes and standards in the English Language Arts and History influenced the development of teaching practices related to argumentative writing.

Cognitive perspective

The cognitive perspective (Graham, 2018; Hayes, 1996; Hayes & Flower, 1986; MacArthur & Graham, 2016) views argumentative writing as a problem-solving process that requires self-regulation to achieve the author’s rhetorical goals (Bereiter & Scardamalia, 1987; Graham & Harris, 1997). Problem solving is done in a problem space that results from the person’s internal representation of the task environment (Newell & Simon, 1972). The internal representation amounts to the problem solver’s understanding of the task environment, and the problem space is a network of paths for transforming this understanding into the goal. In the cognitive view, problem solving operates within an information processing system that is constrained by the writer’s available capacities and resources (Flower & Hayes, 1980, 1981). Skilled writers manage these constraints by setting goals and then planning, writing, and revising their essays. Research shows that the failure to strategically allocate limited cognitive resources adversely impacts writing performance (Ferretti & Fan, 2016).

Writers draw on their knowledge of argumentative discourse, the topic, their interlocutor, and critical standards of evaluation to write arguments (Ferretti & De La Paz, 2011; Ferretti & Lewis, 2019). Expert writers possess fluent linguistic skills and genre and topic knowledge (McCutchen, 1986, 2011), and are skilled at setting goals to guide the writing process. In contrast, novices are less fluent, possess less genre and topic knowledge, and have difficulty strategically regulating the writing process (Graham, Harris, & McKeown, 2013; Harris, Graham, MacArthur, Reid, & Mason, 2011; McCutchen, 2011). Rather than setting rhetorical goals, novices tend simply to write down topically relevant information, which then cues the generation of related content (Bereiter & Scardamalia, 1987). Difficulties with self-regulation are seen in all aspects of the problem solving of unskilled writers (Graham et al., 2013).

Studies in the cognitive tradition often use experimental procedures and quantitative analyses to make inferences about the factors that influence argumentative writing. Ferretti and Lewis (2018) studied the effects of writing goals and knowledge of the persuasion genre on the quality of elementary and middle-school students’ argumentative writing. In addition, they examined students’ knowledge of persuasive discourse by analyzing the types of ideas they generated to help an imaginary student who was having difficulty writing. Their analyses showed that genre-specific writing goals and knowledge of persuasion predicted writing quality, and furthermore, that the ideas students generated to support an imaginary student revealed implicit knowledge about the intentions of other people that was not evidenced in their essays.

Graham et al. (2018) provided evidence about Alexander’s (1997, 1998) model of domain learning, which posits that knowledge, motivation, and strategic behavior impact students’ writing development. In particular, Graham et al. examined whether individual differences in these characteristics predicted growth in the argumentative writing of fifth-grade students before and after writing instruction. There were some differences in the predictive value of different variables before and after instruction, but the most robust predictor of writing quality was topic knowledge. This finding is consistent with Ferretti and Lewis’s (2018) findings, and provides further evidence for the influence of topic and genre knowledge on students’ argumentative writing (Gillespie, Olinghouse, & Graham, 2013; Olinghouse & Graham, 2009; Olinghouse, Graham, & Gillespie, 2015).

Sociocultural and cognitive perspectives

Many of the papers that appear in this special issue draw on the cognitive and sociocultural perspectives to conceptualize, analyze, and interpret their research. Three intervention studies (Harris, Ray, Graham, & Houston, 2018; McKeown et al., 2018; Ray, Graham, & Liu, 2018) were inspired by the Self-Regulated Strategy Development (SRSD) model of writing development (Harris & Graham, 1985, 2009, 2016; Harris et al., 2011). The SRSD approach is founded on multiple lines of theoretical and empirical inquiry that address the cognitive, social, and motivational dimensions of writing (Harris & Graham, 2016). The cognitive components address the aforementioned limits on students’ knowledge and processing capacities by explicitly teaching writing strategies that enable them to plan, write, and revise their essays. The social components include the dialogic interactions that take place between teachers and students to scaffold the student’s self-regulated problem solving. The motivational aspects are seen in the use of instructional procedures that are intended to improve students’ self-efficacy, increase their expectations for success, and encourage them to attribute their success to effort and other controllable aspects of their performance. Collectively, these three papers contribute additional evidence to a well-established literature about the benefits of SRSD writing instruction.

Harris et al. (2018) investigated the effects of SRSD instruction for close reading of informational text to support the persuasive writing of unskilled fourth- and fifth-grade writers. The instruction focused on how material from the informational text could be used to elaborate and support students’ persuasive essays. SRSD instruction was associated with improvements in genre elements, the complexity of students’ plans, and the holistic quality of their essays. These findings highlight the integration of reading and writing instruction that is increasingly important as students make progress through the curriculum (Common Core State Standards Initiative, 2010; Ferretti & De La Paz, 2011).

Ray et al. (2018) developed an SRSD strategy to teach struggling high school writers to analyze prompts used on the ACT examination, and then plan and write their argumentative essays. Writing is a gateway skill for college success (Applebee & Langer, 2006), and high quality writing on admission tests can positively impact a student’s future educational prospects. Ray and her colleagues found that SRSD instruction for the ACT examination resulted in better plans, a greater number of genre elements, and higher ACT essay scores. These findings provide encouragement to students who may have difficulty writing arguments but seek the many benefits of attending a college of their choosing.

SRSD instruction is demonstrably effective in improving writing outcomes for novice and more experienced writers (Harris & Graham, 2016; Harris et al., 2011; Lewis & Ferretti, 2011; Song & Ferretti, 2013) when it is delivered under conditions that ensure its procedural fidelity. Unfortunately, many classroom teachers are poorly prepared to deliver high quality writing instruction with fidelity (Graham, in press), so there is a relative dearth of information about the effects of teacher-led, classroom-based interventions on the quality of students’ argumentative writing. McKeown et al. (2018) addressed this issue by comparing the writing quality of students in urban schools whose teachers either did or did not receive professional development for SRSD writing instruction. The authors found that the quality of students’ argumentative essays was better if their teachers received SRSD professional development despite the fact that procedural fidelity was not always observed. The authors surmised that the effects on students’ writing quality may have been even stronger if the instruction had been delivered with greater fidelity.

Earlier we mentioned that people generally fail to apply critical standards when evaluating arguments. Studies of argumentative writing have almost exclusively focused on the goal of persuading a real or imagined audience (Ferretti & Lewis, 2018). Audience considerations reflect a rhetorical judgment (van Eemeren & Grootendorst, 1992; Santos & Santos, 1999) because they are based on a community’s prevailing standards of acceptability. However, audience considerations alone are insufficient because judgments about an argument’s reasonableness require the use of normative standards for evaluating the person’s argumentative strategies (Ferretti, Andrews-Weckerly, & Lewis, 2007; Ferretti & Fan, 2016). The reasonableness standard is tested when interlocutors answer critical questions about the argumentative strategies they use (Walton, Reed, & Macagno, 2008).

Nussbaum et al. (2018) assessed whether dialogic interactions and instructional support for the use of critical questions affected college students’ argumentative writing. Students engaged in debates and wrote arguments about controversial issues associated with assigned reading materials. All students were provided with argumentation vee diagrams (AVD) that were used to represent the reasons for and against a position prior to and during class discussions. However, in contrast to the control condition, the AVDs of students in the experimental condition also included information about the critical questions that could be used to evaluate the argument from consequences strategy. The authors found that over time, students who used AVDs with critical questions generated more refutations than those in the control condition. Some transfer was also seen when students wrote without the critical questions. These findings contribute to a relatively meager literature about the benefits of supporting students’ use of critical questions to evaluate their written arguments (Nussbaum & Edwards, 2011; Song & Ferretti, 2013; Wissinger & De La Paz, 2016).

Linguistic, sociocultural, and cognitive perspectives

A number of studies reported in this special issue are informed by constructs and methods drawn from sociocultural, cognitive, and linguistic perspectives. Linguistic analyses can be helpful because texts are written in natural language by writers who have considerable discretion with respect to their goals, genre, word choice, and grammatical structures (Pirnay-Dummer, 2016). Skilled readers bring their knowledge of language, text structures, and world knowledge to bear on the interpretation of text (Duke, Pearson, Strachan, & Billman, 2011). However, even skilled readers can draw different interpretations about the simplest of texts. For this reason, considerable effort has been invested in conducting detailed analyses of linguistic features that are associated with high quality texts (McNamara, Crossley, & McCarthy, 2010).

MacArthur, Jennings, and Philippakos (2018) analyzed the argumentative essays of basic college writers to determine the linguistic features that predicted their writing development. A corpus of argumentative essays was drawn from an earlier study focusing on the effects of strategy instruction on writing quality. Coh-Metrix, a natural language processing (NLP) tool (McNamara, Graesser, McCarthy, & Cai, 2014), was used to develop a model of linguistic constructs to predict writing quality before and after instruction, and also to analyze how those constructs changed in response to instruction. They found that essay length, referential cohesion, and lexical complexity were positively associated with writing quality. Furthermore, changes in writing in response to instruction were linked to improvements in referential cohesion and lexical complexity. These findings suggest that the text’s linguistic features are sensitive to instruction, and that NLP tools can be used to detect changes in those features. The latter finding is important because formative assessments using NLP-based scoring systems should be sensitive to changes in students’ writing in response to instruction (Chapelle, Cotos, & Lee, 2015).

Argumentative essays are difficult to score in vivo when the assessment goal is to guide timely instructional decisions and support student learning. Concerns about the time-sensitivity of writing assessments have led researchers to develop automated essay scoring (AES) systems (Shermis & Burstein, 2013). AES systems analyze observable components of text to identify approximations to intrinsic characteristics of writing (Shermis, Burstein, Higgins, & Zechner, 2010). These systems have traditionally been designed to yield a holistic score for on-demand, timed summative assessments that is correlated with human judgment (Deane, 2013). However, serious questions have been raised about the usefulness of AES systems in providing feedback for instructional purposes, as well as the construct validity of scores derived from these systems. Deane (2013) argues that these concerns may be mitigated if information derived from AES systems is augmented with data about the component reasoning skills related to writing collected from other tasks.

Deane et al. (2018) reported on the use of scenario-based assessments (SBAs) to measure the component skills that underlie written argumentation. SBAs provide students with a purpose for reading thematically related texts and engaging in tasks that are sequenced to assess increasingly complex reasoning skills. The sequence of SBAs is guided by a hypothesized learning progression (LP) framework that describes skills of increasing sophistication that are thought to contribute to proficiency in argumentative writing (Deane & Song, 2014). Deane and his colleagues measured students’ performance on SBAs that tapped the component skills of creating, evaluating, and summarizing arguments. In addition, linguistic features of students’ essays were measured with the AES system E-rater (Attali & Burstein, 2005). Measures of the linguistic features and component skills were used to predict the quality of students’ argumentative writing. Furthermore, the component skills were analyzed to see if they were aligned with the hypothesized LP. They found that linguistic features and the component skills contributed unique variance to the prediction of argumentative writing. Furthermore, the component skills were generally aligned with the hypothesized LP. These findings provide suggestive evidence for the hypothesized LP and for Deane’s (2013) conjecture about the value of measuring genre-related reasoning skills that influence students’ argumentative writing.

Allen, Likens, and McNamara (2018) observed that associations between linguistic features and writing quality can vary across a range of contextual factors, resulting in multiple linguistic profiles of high quality writing (Allen, Snow, & McNamara, 2016; Crossley, Roscoe, & McNamara, 2014). This finding has resulted in the hypothesis that skilled writing results from the flexible use of linguistic style rather than a fixed set of linguistic features (Allen et al., 2016). Allen and her colleagues examined this hypothesis by having high school students write and revise their argumentative essays in Writing Pal (W-PAL; Roscoe, Allen, Weston, Crossley, & McNamara, 2014; Roscoe & McNamara, 2013), an NLP-based intelligent tutoring system that can provide formative and summative feedback about writing, support practice for mechanics, and deliver strategy instruction. All students in this study received formative and summative feedback about their writing, and half of the students also received feedback about spelling and grammar.

The authors were interested in whether feedback about spelling and grammar affected linguistic flexibility, and whether linguistic flexibility was related to writing quality. In addition, they sought information about the dimensions along which linguistic variation was observed. Statistical analyses showed that students’ essays varied along a number of linguistic dimensions across prompts and within drafts, and that variation in some of these dimensions was related to essay quality. However, feedback about writing mechanics did not influence the linguistic properties of their writing. These findings are consistent with the linguistic flexibility hypothesis and with Graham and Perin’s (2007) conclusion that writing quality is unaffected by spelling and grammar instruction.

We mentioned earlier that curricula increasingly emphasize the interdependence of reading and writing (Biancarosa & Snow, 2006; Graham & Perin, 2007). Students are expected to integrate and evaluate information from diverse sources when writing, identify arguments and evaluate specific claims in a text, and assess the adequacy of the evidence offered in support of those claims (Common Core State Standards Initiative, 2010). These are formidable tasks for native language (L1) speakers, and even more challenging for second language (L2) students. L2 students may have limited reading and writing proficiency, lack L2 fluency for academic communication, possess minimal background knowledge in L2, and have difficulty making inferences in L2, especially when those inferences rely on genre-specific cultural conventions (Grabe & Zhang, 2013). Given these challenges, Cummins (2016) has argued that L2 students may draw on a shared pool of academic concepts and skills to support transfer across languages, that is, the linguistic interdependence hypothesis (LIH).

van Weijen, Rijlaarsdam, and Bergh (2018) tested the LIH by having Dutch-speaking college students write essays in their native language and in English after reading sources that could be used as evidence for their argument. The authors sought information about the degree to which students’ essays were of comparable quality in L1 and L2, and whether their use of sources was similar across languages and predictive of essay quality. van Weijen and her colleagues found a relatively strong positive correlation between essay quality in L1 and L2. In addition, they found that students tended to rely more heavily on source material when writing in L2, but in general, writers tended to use common source features when writing in both languages. Students also tended to incorporate evidence for and against the proposition in both L1 and L2. Finally, the same two features of source material predicted writing quality in L1 and L2, and these relationships were not language dependent. In sum, these findings provide some support for the LIH, and suggest that students draw on a shared pool of concepts and skills when writing from source material in L1 and L2.

Final thoughts

The papers in this special issue highlight a range of theoretical perspectives and analytic methods that have been used to study argumentative writing and understand the conditions that influence its development. The sociocultural, cognitive, and linguistic perspectives have each made important contributions to our understanding of argumentative writing, but as the studies in this special issue show, unique synergies arise when scholarship is not constrained by theoretical, methodological, and analytic siloes.

References

  1. Alexander, P. (1997). Mapping the multidimensional nature of domain learning: The interplay of cognitive, motivational, and strategic forces. In M. Maehr & P. Pintrich (Eds.), Advances in motivational achievement (Vol. 10, pp. 213–250). Greenwich, CT: JAI.Google Scholar
  2. Alexander, P. (1998). The nature of disciplinary and domain learning: The knowledge, interest, and strategic dimensions of learning from subject-matter text. In C. Hynd (Ed.), Learning from text across conceptual domains (pp. 55–76). Mahwah, NJ: Erlbaum.Google Scholar
  3. Allen, L. K., Likens, A. D., & McNamara, D. S. (2018). Writing flexibility in argumentative essays: A multidimensional analysis. Reading and Writing: An Interdisciplinary Journal.  https://doi.org/10.1007/s11145-018-9921-y.Google Scholar
  4. Allen, L. K., Snow, E. L., & McNamara, D. S. (2016). The narrative waltz: The role of flexibility on writing performance. Journal of Educational Psychology, 108, 911–924.CrossRefGoogle Scholar
  5. Applebee, A. N., & Langer, J. A. (2006). The state of writing instruction in America’s schools: what existing data tell us (p. 2006). Albany: Center on English Learning & Achievement, University at Albany, State University of New York.Google Scholar
  6. Aristotle (trans. 1962). On interpretation. The University of Adelaide Library eBooks @Adelaide.Google Scholar
  7. Attali, Y., & Burstein, J. (2005). Automated essay scoring with E-rater v. 2.0. ETS research report series, 2004(2). Princeton, NJ: Educational Testing Service.Google Scholar
  8. Bartsch, K., Wright, J., & Estes, D. (2009). Young children’s persuasion in everyday conversation: Tactics and attunement to others’ mental states. Social Development, 23, 394–416.Google Scholar
  9. Bazerman, C. (2016). What to sociocultural studies of writing tell us about learning to write? In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed., pp. 11–23). NY: Guilford.Google Scholar
  10. Beach, R., Newell, G. E., & VanDerHeide, J. (2016). A sociocultural perspective on writing development: Toward an agenda for classroom research on students’ use of social practices. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed., pp. 88–101). NY: Guilford.Google Scholar
  11. Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Hillsdale, NJ: Lawrence Erlbaum Associates.Google Scholar
  12. Biancarosa, G., & Snow, C. E. (2006). Reading next: A vision for action and research in middle and high school literacy: A report from the Carnegie Corporation of New York (2nd ed.). Washington, DC: Alliance for Excellent Education.Google Scholar
  13. Chapelle, C. A., Cotos, E., & Lee, J. (2015). Validity arguments for diagnostic assessment using automated writing evaluation. Language Testing, 32, 385–405.CrossRefGoogle Scholar
  14. Common Core State Standards Initiative. (2010). Common core state standards for English language arts & literacy in history/social studies, science, and technical subjects. Retrieved from http://www.corestandards.org/assets/CCSSI_ELA%20Standards.pdf.
  15. Crossley, S. A., Roscoe, R. D., & McNamara, D. S. (2014). What is successful writing? An investigation into the multiple ways writers can write high quality essays. Written Communication, 31, 181–214.
  16. Cummins, J. (2016). Reflections on Cummins (1980), “The cross-lingual dimensions of language proficiency: Implications for bilingual education and the optimal age issue”. TESOL Quarterly, 50, 940–944.
  17. Deane, P. (2013). On the relation between automated essay scoring and modern views of the writing construct. Assessing Writing, 18, 7–24.
  18. Deane, P., & Song, Y. (2014). A case study in principled assessment design: Designing assessments to measure and support the development of argumentative reading and writing skills. Spanish Journal of Educational Psychology, 20, 99–108.
  19. Deane, P., Song, Y., van Rijn, P., O’Reilly, T., Fowles, M., Bennett, R., et al. (2018). The case for scenario-based assessment of written argumentation. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9852-7.
  20. Duke, N. K., Pearson, P. D., Strachan, S. L., & Billman, A. K. (2011). Essential elements of fostering and teaching reading comprehension. In S. J. Samuels & A. E. Farstrup (Eds.), What research has to say about reading instruction (4th ed., pp. 51–93). Newark, DE: International Reading Association.
  21. Dunn, J. (1988). The beginnings of social understanding. Cambridge: Harvard University Press.
  22. Dunn, J., & Munn, P. (1985). Becoming a family member: Family conflict and the development of social understanding. Child Development, 56, 480–492.
  23. Englert, C. S., Mariage, T. V., & Dunsmore, K. (2006). Tenets of sociocultural theory in writing instruction research. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (1st ed., pp. 208–221). New York: Guilford Press.
  24. Ferretti, R. P., Andrews-Weckerly, S., & Lewis, W. E. (2007). Improving the argumentative writing of students with learning disabilities: Descriptive and normative considerations. Reading and Writing Quarterly, 23, 267–285.
  25. Ferretti, R. P., & De La Paz, S. (2011). On the comprehension and production of written texts: Instructional activities that support content-area literacy. In R. O’Connor & P. Vadasy (Eds.), Handbook of reading interventions (pp. 326–355). New York: Guilford Press.
  26. Ferretti, R. P., & Fan, Y. (2016). Argumentative writing. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed., pp. 301–315). New York: Guilford Press.
  27. Ferretti, R. P., & Lewis, W. E. (2018). Knowledge of persuasion and writing goals predict the quality of children’s persuasive writing. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9918-6.
  28. Ferretti, R. P., & Lewis, W. E. (2019). Best practices in teaching argumentative writing. In S. Graham, C. A. MacArthur, & J. Fitzgerald (Eds.), Best practices in writing instruction (3rd ed., pp. 135–161). New York: Guilford Press.
  29. Flower, L., & Hayes, J. R. (1980). The cognition of discovery: Defining a rhetorical problem. College Composition and Communication, 31, 21–32.
  30. Flower, L., & Hayes, J. R. (1981). A cognitive process theory of writing. College Composition and Communication, 32, 365–387.
  31. Gillespie, A., Olinghouse, N. G., & Graham, S. (2013). Fifth-grade students’ knowledge about writing process and writing genres. The Elementary School Journal, 113, 565–588.
  32. Grabe, W., & Zhang, C. (2013). Reading and writing together: A critical component of English for academic purposes teaching and learning. TESOL Journal, 4, 9–24.
  33. Graham, S. (2006). Strategy instruction and the teaching of writing: A meta-analysis. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (pp. 187–207). New York: Guilford Press.
  34. Graham, S. (2018). The writer(s)-within-community model of writing. Educational Psychologist, 53, 258–279.
  35. Graham, S. (in press). Changing how writing is taught. In T. Pigott, A. Ryan, & C. Tocci (Eds.), Review of research in education. Washington, DC: American Educational Research Association.
  36. Graham, S., & Harris, K. R. (1997). It can be taught, but it doesn’t develop naturally: Myths and realities in writing instruction. School Psychology Review, 26, 414–424.
  37. Graham, S., Harris, K. R., & McKeown, D. (2013). The writing of students with LD and a meta-analysis of SRSD writing intervention studies: Redux. In H. L. Swanson, K. Harris, & S. Graham (Eds.), The handbook of learning disabilities (2nd ed., pp. 405–438). New York: Guilford Press.
  38. Graham, S., Harris, K., Wijekumar, K., Lei, P., Barkel, A., Aitken, A., et al. (2018). The roles of writing knowledge, motivation, strategic behaviors, and skills in predicting elementary students’ persuasive writing from source material. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9836-7.
  39. Graham, S., & Perin, D. (2007). Writing next: Effective strategies to improve writing of adolescents in middle and high schools. New York: Carnegie Corporation.
  40. Harris, K. R., & Graham, S. (1985). Improving learning disabled students’ composition skills: Self-control strategy training. Learning Disability Quarterly, 8, 27–36.
  41. Harris, K. R., & Graham, S. (2009). Self-regulated strategy development in writing: Premises, evolution, and the future. British Journal of Educational Psychology, 6, 113–135.
  42. Harris, K. R., & Graham, S. (2016). Self-regulated strategy development in writing: Policy implications of an evidence-based practice. Policy Insights from the Behavioral and Brain Sciences, 3, 77–84.
  43. Harris, K. R., Graham, S., MacArthur, C., Reid, R., & Mason, L. H. (2011). Self-regulated learning processes and children’s writing. In B. J. Zimmerman & D. Schunk (Eds.), Handbook of self-regulation of learning and performance (pp. 187–202). New York: Routledge.
  44. Harris, K. R., Ray, A. B., Graham, S., & Houston, J. (2018). Answering the challenge: SRSD instruction for close reading of text to write to persuade with 4th and 5th grade students experiencing writing difficulties. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9910-1.
  45. Harris, R. (2009). Rationality in the literate mind. London: Routledge.
  46. Hayes, J. R. (1996). A new framework for understanding cognition and affect in writing. In M. Levy & S. Ransdell (Eds.), The science of writing: Theories, methods, individual differences, and applications (pp. 1–27). Mahwah, NJ: Erlbaum.
  47. Hayes, J. R., & Flower, L. S. (1986). Writing research and the writer. American Psychologist, 41, 1106–1113.
  48. Kuczynski, L., & Kochanska, G. (1990). Development of children’s noncompliance strategies from toddlerhood to age 5. Developmental Psychology, 26, 398–408.
  49. Kuhn, D. (1991). The skills of argument. New York: Cambridge University Press.
  50. Lewis, W. E., & Ferretti, R. P. (2011). Topoi and literary interpretation: The effects of a critical reading and writing intervention on high school students’ analytic literary essays. Contemporary Educational Psychology, 36, 334–354.
  51. MacArthur, C. A., & Graham, S. (2016). Writing research from a cognitive perspective. In C. A. MacArthur, S. Graham, & J. Fitzgerald (Eds.), Handbook of writing research (2nd ed., pp. 24–40). New York: Guilford Press.
  52. MacArthur, C. A., Jennings, A., & Philippakos, Z. A. (2018). Which linguistic features predict quality of argumentative writing for college basic writers, and how do those features change with instruction? Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9853-6.
  53. McCutchen, D. (1986). Domain knowledge and linguistic knowledge in the development of writing ability. Journal of Memory and Language, 25, 431–444.
  54. McCutchen, D. (2011). From novice to expert: Implications of language skills and writing-relevant knowledge for memory during the development of writing skill. Journal of Writing Research, 3, 51–68.
  55. McNamara, D. S., Crossley, S. A., & McCarthy, P. M. (2010). Linguistic features of writing quality. Written Communication, 27, 57–86.
  56. McNamara, D. S., Graesser, A. C., McCarthy, P., & Cai, Z. (2014). Automated evaluation of text and discourse with Coh-Metrix. Cambridge: Cambridge University Press.
  57. McKeown, D., FitzPatrick, E., Brown, M., Brindle, M., Owens, J., & Hendrick, R. (2018). Urban teachers’ implementation of SRSD for persuasive writing following practice-based professional development: Positive effects mediated by compromised fidelity. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9864-3.
  58. Monte-Sano, C., & Allen, A. (2018). Historical argument writing: The role of interpretative work, argument type, and classroom instruction. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9891-0.
  59. National Center for Education Statistics. (2012). The nation’s report card: Writing 2011 (NCES 2012-470). Washington, DC: Institute of Education Sciences, U.S. Department of Education.
  60. Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
  61. Newell, G., Bloome, D., Kim, M.-Y., & Goff, B. (2018). Shifting epistemologies during instructional conversations about “good” argumentative writing in a high school English language arts classroom. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9905-y.
  62. Newell, G. E., Beach, R., Smith, J., & VanDerHeide, J. (2011). Teaching and learning argumentative reading and writing: A review of research. Reading Research Quarterly, 46, 273–304.
  63. Nussbaum, E. M., Dove, I., Slife, N., Kardash, C. M., Turgut, R., & Vallett, D. B. (2018). Using critical questions to evaluate written and oral arguments in an undergraduate general education seminar: A quasi-experimental study. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9848-3.
  64. Nussbaum, E. M., & Edwards, O. V. (2011). Critical questions and argument stratagems: A framework for enhancing and analyzing students’ reasoning practices. The Journal of the Learning Sciences, 20, 443–488.
  65. Olinghouse, N. G., & Graham, S. (2009). The relationship between the writing knowledge and the writing performance of elementary-school children. Journal of Educational Psychology, 101, 37–50.
  66. Olinghouse, N. G., Graham, S., & Gillespie, A. (2015). The relationship of discourse and topic knowledge to fifth graders’ writing performance. Journal of Educational Psychology, 107, 391–406.
  67. Olson, D. R. (2016). The mind on paper: Reading, consciousness and rationality. Cambridge, UK: Cambridge University Press.
  68. Perkins, D. N., Faraday, M., & Bushey, B. (1991). Everyday reasoning and the roots of intelligence. In J. F. Voss, D. N. Perkins, & J. W. Segal (Eds.), Informal reasoning and education (pp. 83–105). Hillsdale, NJ: Erlbaum.
  69. Perlman, M., & Ross, H. (2005). If-then contingencies in children’s sibling conflicts. Merrill-Palmer Quarterly, 51, 42–66.
  70. Ray, A. B., Graham, S., & Liu, X. (2018). Effects of SRSD college entrance essay exam instruction for high school students with disabilities or at-risk for writing difficulties. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9900-3.
  71. Roscoe, R. D., Allen, L. K., Weston, J. L., Crossley, S. A., & McNamara, D. S. (2014). The Writing Pal intelligent tutoring system: Usability testing and development. Computers and Composition, 34, 39–59.
  72. Roscoe, R. D., & McNamara, D. S. (2013). Writing Pal: Feasibility of an intelligent writing strategy tutor in the high school classroom. Journal of Educational Psychology, 105, 1010–1025.
  73. Santos, C. M. M., & Santos, S. L. (1999). Good argument, content and contextual dimensions. In J. Andriessen & P. Coirier (Eds.), Foundations of argumentative text processing (pp. 75–95). Amsterdam: Amsterdam University Press.
  74. Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78, 40–59.
  75. Shanahan, T., & Shanahan, C. (2012). What is disciplinary literacy and why does it matter? Topics in Language Disorders, 32, 7–18.
  76. Shermis, M. D., & Burstein, J. (Eds.). (2013). Handbook of automated essay evaluation: Current applications and new directions. New York: Routledge.
  77. Shermis, M. D., Burstein, J., Higgins, D., & Zechner, K. (2010). Automated essay scoring: Writing assessment and instruction. In P. Peterson, E. Baker, & B. McGaw (Eds.), International encyclopedia of education (3rd ed., Vol. 4, pp. 20–26). Oxford: Elsevier.
  78. Song, Y., & Ferretti, R. P. (2013). Teaching critical questions about argumentation through the revising process: Effects of strategy instruction on college students’ argumentative essays. Reading and Writing: An Interdisciplinary Journal, 26, 67–90.
  79. Stein, N. L., & Bernas, R. (1999). The early emergence of argumentative knowledge and skill. In G. Rijlaarsdam & E. Espéret (Series Eds.) & J. Andriessen & P. Coirier (Eds.), Studies in writing: Vol. 5. Foundations of argumentative text processing (pp. 97–116). Amsterdam, The Netherlands: University of Amsterdam Press.
  80. van Eemeren, F. H. (2018). Argumentation theory: A pragma-dialectical perspective. Cham, Switzerland: Springer.
  81. van Eemeren, F. H., Garssen, B., Krabbe, E. C. W., Henkemans, A. F. S., Verheij, B., & Wagemans, J. H. M. (2014). Handbook of argumentation theory. Heidelberg: Springer.
  82. van Eemeren, F. H., & Grootendorst, R. (1992). Argumentation, communication, and fallacies: A pragma-dialectical perspective. Mahwah, NJ: Erlbaum.
  83. Walton, D., Reed, C., & Macagno, F. (2008). Argumentation schemes. Cambridge: Cambridge University Press.
  84. Wissinger, D. R., & De La Paz, S. (2016). Effects of critical discussions on middle school students’ written historical arguments. Journal of Educational Psychology, 108, 43–59.
  85. van Weijen, D., Rijlaarsdam, G., & van den Bergh, H. (2018). Source use and argumentation behavior in L1 and L2 writing: A within-writer comparison. Reading and Writing: An Interdisciplinary Journal. https://doi.org/10.1007/s11145-018-9842-9.

Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. School of Education, University of Delaware, Newark, USA
  2. Mary Lou Fulton Teachers College, Arizona State University, Tempe, USA