If the 2020–2022 pandemic taught us anything, it was the interconnectedness of people across the globe. The pandemic caused major tears in the fabric that connects our lives in the social, economic, and political sectors. Higher education worldwide was not immune to the serious disruptions caused by COVID-19, the disease driving the pandemic. Universities and colleges cancelled classes and discontinued programs, and some closed their doors forever (Reis & Grady, 2020). The pandemic forced faculty and students to look deep inside their typical teaching and learning practices to find survival strategies.

Despite disruptions, universities need to fulfill the societal expectations for doctoral program graduates to be highly qualified and curious people who can address complex and multi-faceted problems. Doctoral program graduates add to the intellectual and creative capacity needed to allay our fears and lead us out of the numerous and pervasive economic, educational, political, social, and health care challenges revealed by the pandemic. EdD doctoral students are being prepared to address complex problems in their real world of practice. Because faculty are teaching the next generation of real-world problem-solvers, what they teach and how they teach become the foundation for the research-informed practices of future educational leaders (Golde, 2006).

In the current educational context, faculty continue to be responsible for refining doctoral students' research skills and fostering the development of educational leaders who address enduring problems of practice and contribute creative and research-grounded solutions in universities, schools, or other educational settings. Identifying practices that ensure doctoral student completion is essential to improving doctoral programs that invite, support, and sustain future educational leaders. Yet, even before the pandemic, a lingering problem in higher education worldwide was the number of doctoral students who enter but do not complete their studies (Lindsay, 2015). In the United States, more than 50% of PhD doctoral students do not complete their degrees (Church, 2009). In other countries, such as Australia and the United Kingdom, the pattern of PhD non-completion is also evident, with non-completion rates of around 35–45% for full-time students (Park, 2005). Unfortunately, most doctoral retention statistics focus on PhD programs and do not include professional practice doctorates like the EdD. Given the pervasive impact of the pandemic on health, families, and even research funding (de Wit & Altbach, 2021), doctoral student completion rates are likely to be further affected. However, according to Flaherty (2021), “It’s almost certain that COVID-19 played a role in how many students were able to finish their Ph.D.s… but it’s unclear from the data [Davies, 2020] how big a role” (p. 1). Yet, our 2002–2010 review of our own doctoral program in a U.S. urban comprehensive university shows our EdD students following a similar pattern to PhD students, with an average non-completion rate of 45%.

Is doctoral student non-completion a problem worth addressing? Some say low retention and completion rates are acceptable because the programs are weeding out students who are unprepared for the rigors of graduate work (Smallwood, 2004). Yet, others say that non-completion in U.S. doctoral education merits not only student but institutional attention (Lovitts, 2001; Park, 2005). With potential feelings of disappointment and even failure, the psychological cost for students who drop out of doctoral programs is high (Lovitts, 2001; Smallwood, 2004). Because minority students tend to leave their programs at a higher rate than White students (Smallwood, 2004), the psychological cost for the minority student community is an additional concern for our core social justice values. Research seems to indicate that when faculty are sensitive and responsive to student needs in their teaching and interactions, students are more engaged and, therefore, more likely to be successful (Amrein-Beardsley et al., 2012; Grant et al., 2014; Lee & Boud, 2003; Wellington, 2010; Zambo et al., 2014). In fact, a substantial body of research has demonstrated that faculty practices can have a powerful effect on undergraduate student success (Delmas & Childs, 2021; Tight, 2020; Umbach & Wawrzynski, 2005).

Some researchers sought to disentangle the factors that underlie doctoral students' lack of program completion and looked for ways to support students through to completion (Ahern & Manathunga, 2004; Bastalich et al., 2014; Lindsay, 2015; Zambo et al., 2014). Mentoring offered critical support (Caffarella & Barnett, 2000; Hilliard, 2012; Kamler, 2008; Mullen, 2001; Simpson & Matsuda, 2008; Thein & Beach, 2010). Writing groups fostered much-needed peer support (Aitchison, 2009; Aitchison & Lee, 2006). Hilliard (2012) zeroed in on advisor quality:

…it is important for the advisor to continue to practice positive professional relationships and provide relevant academic support to candidates. The advisor should work closely with other faculty members and listen to the voices of candidates to ensure candidates’ success. (p. 7)

Another program practice, a cohort experience upon entry and throughout an EdD program, increased student retention and success (Friesen & Jacobsen, 2021; Zambo et al., 2013).

Many doctoral programs prepare candidates to earn either a PhD or an education doctorate (EdD). Many EdD students do not fit the traditional image of PhD graduate students (McAlpine et al., 2009). Typical PhD students attend the university full time, spend their time teaching, assisting in classes, or doing research in a laboratory, and are fully immersed in a university setting. In contrast, most EdD students seek to complete a degree while working full time in a professional setting and can remain unaware of, and out of touch with, university campus facilities and student services (Taylor, 2007). Friesen and Jacobsen (2021) offer a more extensive discussion and comprehensive comparison of the EdD and PhD.

Preparation for a doctoral program most often begins with a master’s degree; yet master’s degree programs in education vary. Entering EdD doctoral students have very disparate master’s degree experiences, especially with academic writing and research. Some arrive grounded in the language and expectations surrounding research because they wrote a master’s thesis. Others complete a master’s program that offers in-depth understandings of the nature and practice of research as well as authentic research experiences, a strong foundation for an EdD degree (Brown et al., 2021; Jacobsen et al., 2018b). Still other master’s students may write and publish an article based on their research. For these three groups of master’s students, their background knowledge may bode well for a smooth transition to doctoral writing and research expectations. However, other EdD students may have earned their master’s degree alongside a teaching credential, and, therefore, lack essential research knowledge and experience. In addition, as working professionals, many EdD students may be returning to the university later in life. For some, the socialization practices of writing, studying, and relationship-building among students in graduate school may be unknown (Sverdlik et al., 2018). International EdD students may be full time students like typical PhD students and yet can be at a disadvantage because academic literacies for reading research and writing are in a second language. To combat attrition, university faculty need to design programs that account for this variance in master’s degree experience.

In recent years, faculty from many U.S. doctoral programs have banded together to design EdD programs that better meet student needs, seek to increase student retention, and strengthen the potentially significant contribution of EdD graduates to research-informed educational practice. Through a cross-institutional network, the Carnegie Project on the Education Doctorate (CPED, 2021) offers convenings and resources for faculty restructuring doctoral programs with the working professional in mind. In its underlying philosophy, CPED views education doctoral students as strong educational leaders who improve their schools and communities through research-informed practices. In turn, EdD recipients can apply their research skills in their professional practice. Thus, CPED encourages faculty to aim their EdD programs at developing scholarly practitioners. The program goals are to teach doctoral students the skills, dispositions, and orientations to go forth into the world as problem-solvers in their professional practice settings. CPED EdD programs are designed to increase student retention and build solid research skills and practices without sacrificing the rigor of PhD programs (Shulman et al., 2006).

Our research was situated in an education doctoral program at an urban comprehensive university in the Western United States. Following interactions with CPED colleagues, we modified our program to meet the needs of our specific students. Our revised student learning outcomes included key elements from CPED: building a professional knowledge base, enhancing collaborative skills, and applying research-informed knowledge to real-world problems in the workplace. In our own seminar, we sought to develop these CPED outcomes by encouraging interactions within a student community of practice (Wenger, 1998) where students reflected on and applied their rich practical and professional backgrounds to real-world problems as scholarly practitioners (McClintock, 2004; Teeuwsen et al., 2014; Zambo et al., 2013). Thus, we expected our students to be scholars and to apply that scholarly foundation to assess and address problems of practice in their professional settings.

Theoretical and Conceptual Frameworks

Our study focused on describing and explaining the experiences of two groups: our education doctoral students and us, the instructors. We used situated learning theory (Lave & Wenger, 1991) to frame our research. Within our faculty group, we used a conceptual framework: the Scholarship of Teaching and Learning (SoTL) (Felten, 2013). Referencing the SoTL framework in our investigation enhanced our research credibility (Billot et al., 2017). To share ideas and work together as faculty under the SoTL umbrella, we also used the practices of mutual engagement, joint enterprise, and shared repertoire (Wenger, 1998), key components of a community of practice (Tierney et al., 2020).

Situated Learning and Communities of Practice

When describing situated learning theory, Lave and Wenger (1991) explained that “learning is not merely situated in practice—as if it were some independently reifiable process that just happened to be located somewhere; learning is an integral part of generative social practice in the lived-world” (p. 35). In other words, social interaction and shared experiences lead to learning within a specific context. Situated learning experiences contrast with classroom experiences in which knowledge remains abstract and without context. In our program, we situated learning within the doctoral seminar to engage our students with experiences to build academic writing and research knowledge. As instructors, our learning was situated in the doctoral student seminar and based on student responses to activities and experiences related to academic writing and research.

Lave and Wenger (1991) described the generative social practice, situated learning, as “legitimate peripheral participation” (p. 29)—a process by which newcomers become members of a “community of practice” (p. 29). Over time and through active participation within the community, members acquire more knowledge and experience; the newcomers or novice learners move from the periphery of the community to the center of the learning community alongside the old-timers or expert learners. Peripheral participation is not just any participation but participation in the legitimate or authentic activities that signify the community’s unique qualities. In our case, we included assignments and classroom activities that mirrored the typical work and expectations of an academic community, such as submission of an Institutional Review Board application and presenting a research poster. Key features of a community of practice included mutual engagement, joint enterprise, and shared repertoire (Wenger, 1998).

Our goal was for students to experience working in a community of practice where they could tap their experiences as leaders and working professionals and, through legitimate participation, move toward the identity of scholarly practitioners and meet personal and professional needs (Caskey et al., 2020; Foot et al., 2014; Olson & Clark, 2009).

Because writing and research are essential throughout any doctoral program, we hoped that our students would succeed in the short run by completing the high-stakes comprehensive examination core paper within the first year and, in the long run, by completing the dissertation within the four-year program plan.

Scholarship of Teaching and Learning

To observe and evaluate how our instructional practices supported student learning in a community of practice, we formed our own faculty community of practice. In his SoTL framework, Felten (2013) argued that “[F]or scholarly inquiry into student learning to be recognized as significant intellectual work in the academy, we (the community of practitioners) need to articulate our shared norms, our common principles of good practice of inquiry into student learning” (p. 122). We identified connections between Felten’s principles of SoTL practice and our instructional context (see Table 1).

Table 1 Application of SoTL Principles of Practice to Instructional Context

Our faculty community of practice included Wenger’s (1998) key indicators: mutual engagement, joint enterprise, and shared repertoire. Our mutual engagement entailed collaborating in planning, teaching, assessing, and reflecting on our teaching. We shared a common passion for teaching but had little experience in collaborating to teach the same class. With our overall shared goal of student learning, we were risk-takers and willing to try new methods. As professors, we had extensive experience as university faculty, yet little experience working with a cohort of doctoral students over a year. Our joint enterprise required us to respond to situations as they arose as well as be consistent in our observation and reflection practices. From student formative assessments and our own reflections, we determined what was important, what to discuss, and what actions to take. Before each seminar, we met to develop the agenda. Afterwards, we took notes in our journals to reflect on what worked and what did not. Then we shared the notes with each other and used our observations and reflections to collaborate and develop the agenda and activities for the next seminar. We also rotated giving written feedback on student work every week. Our shared repertoire of practice brought coherence to our community of practice through our instructional routines, activities, and strategies. Using the SoTL framework and working in a community of practice, we took a stance as investigators examining student responses to our instructional practices, not as evaluators judging the merits of our teaching.

Based on the SoTL framework, we had four assumptions about our process. First, data is valued and used regularly: we collected and analyzed formative assessments as information, not evaluation. Second, reflection improves practice: we not only talked about our teaching; we wrote and shared reflections from our journals about our teaching. Third, teaching is a work-in-progress: we could never be perfect as teachers; the best stance was to view teaching as a journey, not a destination, and to learn from that journey to foster student learning. Finally, collaboration strengthens practice: we were present in all the seminar classes; when one of us was not teaching, the other was observing and taking notes. Our collaboration was based on what we did and how we did it. In Fig. 1, we illustrated the components of our faculty community of practice as applied to our SoTL project.

Fig. 1 Application of SoTL to Our Faculty Community of Practice

We had two overarching research questions:

  1. How did doctoral students respond to the seminar experiences within a community of practice?

  2. Using a Scholarship of Teaching and Learning lens, what did we learn from our students about our teaching practices in our own community of practice?

Our study’s purposes were to (a) describe and explain how 12 EdD students responded to seminar experiences, and (b) apply a Scholarship of Teaching and Learning conceptual framework to our instructional practices to improve our teaching and impact student learning.

Method

In this section, we articulate our use of a qualitative case study method: describing the context, participants, and seminar features; identifying the data sources; and reporting the methods of data analysis.

Case Study

We used a qualitative case study design because we were interested in investigating how our doctoral program seminar features helped our students be successful. In a case study, researchers investigate “a contemporary phenomenon (the ‘case’) in depth and within its real-world context…” (Yin, 2009, p. 16). A case study is developed within a specific bounded system (Creswell, 2013; Merriam, 2009). Thus, a case study method was appropriate for discovering and interpreting our EdD students’ and our own learning experiences within that bounded system, the doctoral seminar.

Context

Our educational doctoral program was in a Northwestern United States urban comprehensive university with an enrollment of approximately 28,600 students. Following a three-year pilot, our revised program admitted a cohort of 24 students to the 72-credit program composed of courses, seminars, internships, and milestones (see Table 2). During the first year of our four-year doctoral program, which met in an evening format during fall, winter, and spring quarters, the full cohort took the three-course educational theory sequence. During the second year, the full cohort took the three-course research sequence as a foundation for developing their dissertation proposal by the end of that year.

The program also divided the full cohort into two smaller seminars to develop learning communities that met over three terms in each of the first two years of the program. This investigation focused on our experience as instructors with the smaller learning community, the doctoral seminar. In the second year we also supervised the students in a workplace internship where they piloted their ideas for their dissertation research. At the end of the second year, we wanted our students to be prepared to work with an advisor and defend the dissertation proposal during the third year and, finally, in the fourth year, to gather data and defend their dissertation.

Table 2 EdD Program of Study with Milestones by Year

Participants: Students

Our participants, 12 EdD students, were in the small learning community seminar. Most were female (9), spoke English as their first language (9), and were currently working professionals (9). About one-third of our students (4) were non-native speakers of English; of these, three were international students and former working professionals. The other eight held positions as adjunct faculty in education (3), teachers (2), a school administrator (1), an educational consultant (1), and a higher education administrator (1). Thus, many were currently grappling with the complex problems of practice in their professional settings. Their experience with academic writing and conducting research varied. Seven of our 12 students wrote a thesis for their master’s degree, while five completed a teacher work sample for a teaching credential and master’s degree.

Participants: Faculty

We, two female, White, and English-speaking professors, taught the doctoral student seminars. Prior to co-teaching in the doctoral program, we had taught in the same department for 14 years; we drew on extensive experience as public-school teachers. We were adept at developing curriculum, comfortable in the classroom, and experienced in teaching university classes for more than 30 combined years. We had served on student doctoral committees as advisors and committee members. Yet, we were new to co-planning curriculum and co-teaching a year-long doctoral student seminar.

Doctoral Seminar

We designed our doctoral seminar with two communities of practice in mind—the student community of practice and the faculty community of practice. To describe this seminar and the resulting communities of practice, we focus on two questions: How did we teach? What did students learn?

How Did We Teach? To guide students’ development of academic writing skills and knowledge of research methods, we used the cognitive apprenticeship instructional method (Collins, 2006) with an emphasis on modeling, coaching, and scaffolding. We applied this method often, as illustrated by the following example: how to develop an argument for a manuscript. First, we modeled how we used the one-page argument template (Graff & Birkenstein, 2018; Stevens, 2019) on our own manuscripts. Then, we coached the students as they used the template to frame the argument for their own research problem of practice for their comprehensive paper. After they developed their argument, we scaffolded the refinement of their argument to fit within their comprehensive paper using faculty and peer feedback within their small writing groups. We followed this instructional pattern of modeling, coaching, and scaffolding for teaching students how to do focused freewrites (Stevens & Cooper, 2009), write key sentences, develop an abstract, write an Institutional Review Board (IRB) proposal, set writing goals, and work in writing groups (Stevens, 2019). A fuller description of our research related to the use of the cognitive apprenticeship appeared in another manuscript (Caskey & Stevens, 2021).

The writing challenge for students was to develop their research and writing skills and, for the comprehensive exam paper, to identify a problem of practice and analyze it through the three theoretical perspectives taught in the program courses: learning, leadership, and program/policy. When students submitted successive drafts, we gave feedback and directed their attention to strategies and rhetorical structures that undergird academic writing (Stevens, 2019). For each draft, we used the track changes and comment features in Microsoft Word and uploaded students’ drafts with our embedded feedback to their folder in the course management system. We created writing groups—smaller communities of practice in which our students set writing goals to be accountable for progress and to experience the shift of roles as readers (reviewers) and contributors (writers) (Guerin et al., 2015). We allocated at least 30 minutes of class time for their writing groups and expected groups to meet weekly for an hour outside of class. Their goals were to integrate faculty and peer feedback on three authentic assignments: the IRB proposal, the mini-research project, and the comprehensive paper.

What Did Students Learn? We wanted our students to learn how to be better thinkers and writers as well as become more familiar and comfortable with conducting research. We found that other doctoral programs included content about research ethics and expectations of IRB approval for student projects (Jacobsen et al., 2018a) as well as opportunities for students to design action-oriented research experiences (Murakami-Ramalho et al., 2013). We created low-stakes assignments that helped students become conversant with research terminology and procedures—without the high-stakes expectations of a dissertation proposal. For example, our doctoral students designed a mini-research project about their selected topic but had only one to three research participants. The research design paper included a title, an abstract, a two- to three-page introduction with a concise argument and a clear purpose, a brief two- to three-page literature review, and methods identifying participants and data collection tools. Next, after an introduction to the ethics of conducting research and analysis of one of our own IRB proposals, students developed a proposal for their mini-research project and submitted it to the university’s IRB. Once the university approved their project, they collected the data. During the seminars, they worked on data analysis, interpretation, displays, and summaries. Using our research poster template, they shared their posters with the full cohort of doctoral students in a mock-conference format. Although their proposal was a full manuscript, the results were presented only as a poster. We documented the impact of these experiences on students’ identity as researchers in a prior publication (Caskey et al., 2020).

Data Collection

Our study used two data sets: completion of seminar assignments and responses to a series of anonymous formative assessments. Completion of key seminar assignments would demonstrate students’ engagement and success. The series of anonymous formative assessments would tell us, the instructors, what teaching methods and situated learning experiences impacted their learning. To conduct our study, we secured approval from our university’s IRB.

Key Seminar Assignments

We examined student completion of the IRB proposal (with university approval), the mini-research project, and successive drafts of the comprehensive paper, a program milestone that all students needed to pass for continuation into the program’s second year.

Formative Assessments

Because we were curious about student learning and improving our practice, we used multiple formative assessments over time to learn how students responded to our assignments and teaching methods. In Table 3, we listed the three anonymous formative assessments along with collection dates. In February, we administered the first formative assessment—a Plus-Wishes chart. Students listed positive aspects of the program in the left column (Plus) and aspects that needed to change in the right column (Wishes). The chart included a narrative section for students to share their thoughts about their experiences in the doctoral seminar. Because we situated their learning within a community of practice, we gave students time to complete this open-ended chart in the seminar.

In March, we conducted our second formative assessment, the Experiences Questionnaire, which had two questions: “What two experiences stood out for you?” and “Please describe those experiences.” In this second assessment, we wanted students to distill their experiences into the most significant ones and invited their suggestions for improvement. Again, we provided students with time to complete this questionnaire during the seminar.

The final formative assessment was the Critical Incident Questionnaire (CIQ) (Brookfield, 2017), which we administered once a month for three months: April, May, June. Brookfield (2017) argued the CIQ could help faculty understand students’ perspectives and experiences and gain a more nuanced understanding of learning experiences. Because we were aware of the broader brushstrokes of what seemed to be working in our seminars, we used the CIQ to seek a clearer understanding about how specific assignments and activities worked within their student community of practice.

Table 3 Formative Assessments of Doctoral Student Seminar Experiences

Data Analysis

First, we tabulated the completion of the three seminar assignments: the IRB proposal, the mini-research project, and successive drafts of the comprehensive paper. Collecting and analyzing student responses to our teaching was part of our SoTL work, where data collected from students are used to improve teaching practice, not to evaluate whether we are “good teachers.” We analyzed the formative assessment data each month in February, March, April, and May. We collaborated on the formative assessment analysis using a constant comparative method to identify patterns during an open-ended coding process (Merriam, 2009). After reading and re-reading the data, we compared data segments “to determine similarities and differences” (Merriam, 2009, p. 30) and assigned data to tentative categories that informed our instructional decisions. After reflecting on the data each month, we adjusted our instruction. Finally, we reviewed the clustered responses across the data to identify themes and select representative verbatim quotes for this paper.

Findings

During our two years of teaching the doctoral seminar, we studied our EdD students’ development as researchers and writers as well as our instructional methods. First, we examined the development of their identity as researchers as measured by the pre-post administration of the draw-a-researcher test (Caskey et al., 2020), which we based on the classic draw-a-scientist test (Finson, 2002, 2009; Foutz et al., 2015). We learned that our embedded writing support model contributed to the development of a researcher identity. Second, we conducted an end-of-year multi-level survey to zero in on our teaching method of modeling, coaching, and scaffolding in the cognitive apprenticeship model (Caskey & Stevens, 2021).

For this case study, we examined student assignment completion and formative assessments to investigate both our EdD students’ doctoral seminar experiences and our own as instructors. These two data sources connected specifically to the two purposes of this case study.

Purpose I: Describing and Explaining Students’ Responses to Seminar Assignments

To begin, we summarize the findings from three seminar assignments: IRB proposal, mini-research project, and comprehensive paper. Student learning was situated within the doctoral seminar. All 12 students completed and submitted an IRB proposal and received university approval before conducting their mini-research project. First, they acquired the skill and confidence to complete the IRB proposal. Within this student community of practice, they shared ideas and engaged in the mutual practice of writing an IRB proposal. Second, our EdD students designed, conducted, and presented a mini-research project. All 12 EdD students excelled at initiating and completing these two authentic, low-stakes assignments.

All students also turned in and received feedback on successive drafts of the comprehensive paper throughout the first year. They learned to incorporate extensive instructor feedback and respond to peer feedback within the safe space of the seminar. Again, student learning was situated within the doctoral seminar. This student community of practice allowed them to live the process of building a writing practice (Guerin et al., 2015) as evidenced by all 12 successfully completing and defending the comprehensive paper on time. Success on the seminar assignments seemed to bode well for student retention and program completion in this first year of the program. Although we could not attribute their final success in the program to the year-long seminar, 11 of 12 EdD students (92%) completed their dissertations and doctorate within the four-year timeline.

Purpose II: Improving Our Teaching and Student Learning Using a SoTL Perspective

In this section, we delineate the findings derived from our analysis of the three formative assessments: Plus-Wishes Chart, Experiences Questionnaire, and Critical Incident Questionnaires.

Plus-Wishes Chart

In February of the first year, we used an open-ended Plus-Wishes Chart to assess what was working in the program (plus) and what needed to change (wishes). This first assessment served as a mid-year assessment, allowing us to make informed instructional decisions based on student feedback. Students listed aspects of the program on the plus side as well as others on the wishes side. In total, they listed 60 different program aspects: 47 responses on the plus side and 13 on the wishes side.

Of 47 plus responses, 21 related to academic writing instruction and support, 13 referenced advising and interactions with the instructors, and nine focused on opportunities for interactions within the learning community. Representative plus responses included:

  • Clear explanations; breaking things down in component parts; detailed explanations; skill-building activities (e.g., reading academic articles).

  • Individual meetings with advisors.

  • Community-building activities; structure of the whole session.

Regarding the wishes side, four responses listed the need for structural changes—features outside the purview of our instruction. Three responses noted wanting more opportunities for peer feedback on their academic writing. The remaining responses did not relate to the content of the program (e.g., parking, laptops). Representative wishes included:

  • Change the schedule of the seminars to weekdays.

  • More time for peer review/feedback on writing.

In the narrative section, we invited our students to comment on their pluses and wishes. Their comments included:

  • I consider this class the “glue” that holds the core classes together. I like the fact that this class is so well-structured because it helps us understand the big picture.

  • I feel very lucky to join (have) this seminar…where I am and what I need to do….

  • Taken as a whole, the leadership seminar has been a very positive one. The goals for this first term have been clearly defined, and the activities appropriately scaffolded to reach those goals.

  • Overall, the weekend classes, energizing rather than tiring. I like the work session nature of the class in which we get to get a good start on projects and get immediate feedback before going off on our own.

  • This seminar is invaluable for the success of doctoral students. I couldn’t imagine navigating the program without it.

Experiences Questionnaire

In March, we administered the second formative assessment, the Experiences Questionnaire, on which the 12 students identified and described two experiences that “stood out for them.” Students identified writing (n = 8) and the authentic research task of writing an IRB proposal (n = 6) as significant. Five responses related to classroom discussion activities. We identified three themes that contributed to their learning: writing, research, and class discussions (see Table 4).

Table 4 Themes Derived from Experiences Questionnaire Responses

In the narrative section, typical responses included:

  • Going through IRB process forced me to focus on my topic in a tighter way and clearing up parts that were confusing. It was also great experience for my dissertation and gave me ideas on how to conduct my study.

  • I really like the feedback from the instructors and how to proceed on the next draft. Your clarification on what to do next and arranging the structure of the paper helped tremendously. Now it is just a matter of revision and tweeking the parts.

  • Core paper analysis helped me in writing my core paper and what structure I should use, reference list.

In the suggestions section, representative comments were:

  • very helpful class.

  • more frequent peer feedback would be nice. I like having many different eyes look at my paper so I can incorporate how other people understand my topic. I like to see what others are working on and how they are going about it.

  • I don’t have any suggestions, very useful class, exceeded my expectations.

Across these themes, we found that our students highlighted specific learning experiences, which had transpired or were fostered within the seminar.

Critical Incident Questionnaire

The third formative assessment was the Critical Incident Questionnaire (CIQ). Because we administered it monthly, we had three sets of CIQ questionnaires. Each month, we analyzed the student responses and adjusted our instructional methods (see Table 5). The questions in the CIQ were:

  1. At what moment in the seminar did you feel most engaged with what was happening?

  2. At what moment in the seminar were you most distanced from what was happening?

  3. What action that anyone (teacher or student) took during the seminar did you find most affirming or helpful?

  4. What action that anyone took during the seminar did you find most puzzling or confusing?

  5. What about the seminar surprised you the most?

In each of the preceding CIQ questions, we italicized the key words to help guide our interpretation of students’ responses; these words were not italicized when we gave the CIQs to the students. In Table 5, we noted parenthetically the CIQ question to which each response corresponded.

Table 5 Analysis of Students’ CIQ Responses and Our Instructional Decisions

Regarding writing, students noted comprehensive paper development, peer sharing of comprehensive paper, pre-writing activities, and writing scaffolds. Repeatedly, they referenced experiences within the seminar space—their community of practice. They listed research activities including the mini-research project, data analysis, and faculty research presentations. Further analysis of responses to CIQ questions across the three datasets led us to categorize the findings into four clusters: instructional processes, content, personal preferences, and program structure. Overall, student responses were 115 (81%) for instructional processes, 11 (8%) for content, 9 (6%) for personal preferences, and 7 (6%) for program structure. The CIQ was a rich source of formative feedback.

Themes from Formative Assessments

After analysis of the three assessments (Plus-Wishes Chart, Experiences Questionnaire, CIQ) administered during winter and spring terms, we identified four themes.

Theme 1: Provide Authentic, Low-Stakes Practice on Key Components of the Dissertation

Across formative assessments, students consistently responded positively to the assignments related to components of the dissertation: writing an IRB proposal, receiving IRB approval, and presenting a poster session about their research.

Theme 2: Focus on Building a Toolkit of Writing and Research Strategies

Students learned a set of strategies that provided practice in identifying the rhetorical infrastructure of academic writing and research (Stevens, 2019). Within their community of practice, we helped them build a toolkit of these strategies that could be applied to other writing tasks. For example, the strategies of freewriting and writing a purpose statement could be used for developing conference proposals, writing grants, or composing a newsletter article. We agreed with Rai and Lillis (2013), who argued for a more explicit connection between writing expectations across academic and professional settings to strengthen practitioners’ professional practice writing and communication skills. Across all formative assessments, students seemed to appreciate having a set of ‘go-to’ strategies to address research and writing expectations.

Theme 3: Create a Community of Practice

Our EdD students repeatedly shared the importance of their community of practice. The seminar structure created the space and time needed for legitimate peripheral participation and community-building. They experienced mutual engagement, joint enterprise, and shared repertoire within the larger seminar community as well as the smaller writing groups. Their mutual engagement required interaction, negotiation, and sustained relations in which their own competence and the competence of others emerged (Wenger, 1998). They gave and received peer feedback on written drafts and exchanged ideas about unfamiliar or complex topics. Across the seminar, our EdD students participated in the joint enterprise of traversing a doctoral program path. Together, they held one another mutually accountable and worked collaboratively to improve their research skills and academic writing practice. Their shared repertoire of practice helped to bring “coherence to the melody of activities” (Wenger, 1998, p. 82) and ways of doing the work. Together, they acquired and applied strategies, skills, and knowledge to aid in academic writing and conducting research.

We learned how important it was for our students to give and receive feedback from their peers. Because of some disagreement about the value of peer feedback (Man et al., 2018), we were initially hesitant to incorporate opportunities for extensive peer feedback. However, students mentioned the desire for more opportunities within their communities of practice at several junctures in the formative assessments. Thus, we created many more situated learning opportunities for peer feedback.

Limitations

We acknowledge the limitations of our case study focused on EdD students’ and instructors’ experiences within the doctoral seminars. First, our sample size was small. To mitigate this limitation, we would need to gather data across multiple cohorts. Because cohorts only start every two years, it would be necessary to work with other institutions. Second, the formative assessments measured the students’ perceptions—in this case, their development as writers and researchers. We could include alternative data collection methods (e.g., observation) to address this limitation. Third, the use of multiple formative assessments over several months could be viewed as a drawback. However, from a SoTL perspective, the number of assessments gathered over time was informative and led to strengthening our instructional practices as we taught the yearlong seminar. Fourth, completion of seminar assignments (i.e., comprehensive paper drafts, IRB proposal) provided only a snapshot of the quality of student work. The use of evaluative measures (e.g., rubrics) could produce detailed information about students’ abilities as writers and researchers. Fifth, this research is situated in a United States EdD program in which nearly half of our students did not complete a research-focused master’s degree. Our findings might not apply to PhD programs or doctoral programs in other countries.

Conclusion

According to the annual Survey of Earned Doctorates, declines in student enrollment and graduation reflect the pandemic’s effect on doctoral student retention (Flaherty, 2021; National Center for Science and Engineering Statistics & National Science Foundation, 2021). This trend can have a lasting impact on scholarly endeavors and research-informed practice. While these are PhD data, we assume that EdD data follow a similar trend. To address the tears in our social, economic, and political fabric due to the pandemic, we need more EdD doctoral students to become educational leaders charged with making critical decisions that affect students, families, and overall community life. EdD programs occupy a much-needed role in contributing to research on practice and producing professionals whose practice is deeply informed by research (Amrein-Beardsley et al., 2012; Friesen & Jacobsen, 2021; Kumar & Dawson, 2013; Shulman et al., 2006; Taylor, 2007). Our study provides research-based strategies for structuring doctoral programs to build a foundation for students as researchers and writers that seems to put them on a solid path toward graduation and educational leadership.

What are our recommendations for building a program with a solid foundation for education doctoral students? First, develop assignments that match the kind of work students are expected to do for their dissertation. Incorporate scaled-down, low-stakes versions of authentic tasks, such as a mini-research project and poster that align with the university’s IRB processes. By creating these types of assignments, faculty build student confidence, deepen students’ familiarity with academic expectations, and extend their content knowledge. Because writing is one of the most challenging aspects of graduate work (Caskey & Stevens, 2021; Cotterall, 2011; Sverdlik et al., 2018), faculty should take steps to support students in practicing worthwhile writing and research strategies and imagining themselves as competent writers and researchers.

Second, create a student community of practice. Give students the time to meet in writing groups, be held accountable, and share their work. Not only do experienced faculty need to model how students can grapple with key elements of writing and research, but students also need to work with their peers. Students can practice and view themselves as a community of scholarly practitioners who engage in academic conversations (Huff, 1999) over the feasibility of research and its application to complex problems. Although many EdD programs use a cohort model (Bista & Cox, 2014), we found that to get full benefit from the cohort model, it is important to foster the development of a community of practice as well.

Third, gather formative assessment data while teaching because these data can be pivotal for improving practice and fostering timely student learning. The SoTL framework (Felten, 2013) reinforces our stance that teaching is a work in progress, and the best way to make improvements is to gather data from students about what and how they are learning. The time and effort of formative data analysis informs and strengthens a faculty community of practice in which faculty are mutually engaged in teaching together as well as developing a joint enterprise. A shared repertoire of instructional practices can make a difference for students. The series of formative assessments can help to document, adjust, and refine instructional practices in real time.

Finally, several aspects of our work could benefit from further research. First, we need to more closely examine how student communities of practice work. What are students doing during the allocated time? What are they learning about academic writing and research? Second, we need more comprehensive data on EdD student completion rates to better evaluate the effectiveness and impact of programs. Although Kumar and Dawson (2013) assessed the post-program impact of the EdD, more research comparing EdD programs on post-dissertation completion rates and leadership impact would be valuable. Because the EdD makes a significant contribution to practice, to research on practice, and to practice informing research, identifying quality EdD programs would strengthen those outcomes.