
“…the first one was understanding my responsibility is to always look at the data, from K through high school; even with juniors and seniors. There may not be state assessment, but there are a lot of things that we need to be looking at. Looking at all of that data driven instruction and that philosophy, my biggest push was for my teachers to understand and integrate that as being a valuable tool for themselves. Not because the principal said, I need to look at my data, but to fully understand the strengths and weaknesses of their curriculum map, of their instruction, of their materials, of surveys, of observation notes, and really use that data to improve what they’re doing in the classroom. And they did it, and could see some direct results from it…” -Principal

As noted in Chap. 1, education policies in the U.S. and elsewhere have recently reflected a trend toward “scientific” evidence-based practice and data-driven decisions. The rise in the use of scientific evidence for educational policymaking rests on two common beliefs about knowledge: one is the belief that school knowledge is universal, and the other is the belief that empirical evidence or data is the primary indicator of knowledge and learning (Ylimaki et al., 2019). Policy documents under both Republican and Democratic administrations, dating back to the No Child Left Behind Act (2002) and the subsequent grant program, Race to the Top (McGuinn, 2012), have reflected the importance of data or numeric evidence from externalized evaluations to guide school decisions. Internationally, multinational organizations such as the Organization for Economic Co-operation and Development (OECD) and the World Bank have also made evidence-based policymaking a priority, both in their own work as influential research and policy organizations and for their members (Wiseman, 2010).

Yet as Biesta (2010) cautions us, this particular use of evidence threatens to replace professional judgment and the wider democratic deliberation about the aims, ends, and content of education. Instead, Biesta calls for value-based education as an alternative to evidence-based education. Here evidence still plays a role, but that role is subordinate to the values that constitute practices as educational practices. Thus, as demonstrated in Fig. 5.1, in our school development project we recognized that evidence-based values and humanistic values are in tension but can be held in balance through reflection and pedagogical activity in education.

Fig. 5.1 Conceptual model for the school development project: evidence-based values (left) and humanistic values (right), held in balance through reflection by team leaders and members. (Ylimaki et al., 2019)

In AZiLDR, institutes and other meetings provided school team participants and district leaders with structured (discursive) spaces for dialogue and reflection about a range of data as evidence within and between levels. In turn, participants applied a balanced perspective on data, with numerical evidence subordinated to educational values, using all of it as a source of reflection and growth.

Grounded in education theory, we define data as information that educators, school teams, and other agencies use to inform professional judgment and influence. Data included student achievement tests, benchmark assessments in classrooms, and surveys of teaching-leading-studying practices in the school. Data literacy then means the capacity to understand, reflect upon, and make professional judgments about a range of data. As Ryan (2011) concluded in his explanation of Dewey and Bentley’s concern about the future of humankind:

Education is still fixed on rote memorization and standardized tests rather than the synoptic problem-solving that worked so well in Dewey’s Chicago [lab] school....We can’t work together until we begin to see together—not some preconceived what, some universal good, but a common how that is experimental, inclusive, and pluralistic (p. 76, emphasis original).

We considered the philosophy of transaction that Dewey and Bentley sketch in Knowing and the Known to be an invitation, left for us to shape and refine in the social as well as the natural sciences. In the next sections, we present our application of theory and research as well as case examples.

Application

Our application featured a broad definition of data, encompassing student achievement data, attendance data, behavioral data, survey data, and observational data. As indicated in Chap. 4, teams began the process by taking the Bennett Survey of Leadership Capacity (Bennett et al., 2013) and reflecting on what it revealed about a variety of topics, including Leadership Characteristics and Practices, Curriculum and Instruction, Achievement (both current and previous levels), School Capacity, and Leadership Tensions and Dilemmas. This allowed teams to identify differences in perception between the principal and the rest of the team/staff. Teams then identified strengths and gaps in order to begin to set some goals for development.

Another survey that provided data for reflection was the Culture Survey, also discussed in Chap. 3. Again, teams were able to use that data as a source of reflection in order to plan activities for staff participation. Additionally, student achievement data was examined by the teams early in the process. However, we quickly discovered that not everyone understood the state achievement data, how to interpret it, or what use it was to them. Therefore, we spent time building data literacy with team members. We began by looking at six assumptions about data (Love et al., 2008) and asking teams to have candid discussions about each statement. These statements can be found in Activity Box 5.1.

Activity Box 5.1: Assumption Card Stack and Shuffle (Lipton & Wellman, 2011)

Divide into groups of three to five. Distribute one full set of cards to each group, and deal the cards out to the members of the group. One individual begins the round by reading a card aloud. The entire group then discusses the prompt, using the following mediating questions as a guide:

  • What is the thinking behind this assumption?

  • What are some inferences that can be made from it?

  • What might be some alternative interpretations?

  • To what degree is this assumption generalizable or context specific?

  • If ______________ were true, would this assumption still hold?

(Wellman & Lipton, 2004)

Each individual in turn reads a card, and the team discusses the prompt until all cards have been discussed or time is exhausted. This activity can take up to an hour, depending on the group.

The prompts we used for the activity are as follows (Love et al., 2008, pp. 4−7):

Assumption 1: Making significant progress in improving student learning and closing achievement gaps is a moral responsibility and a real possibility in a relatively short amount of time – two to five years. It is not children’s poverty or race or ethnic background that stands in the way of achievement; it is school practices and policies and the beliefs that underlie them that pose the biggest obstacles.

Assumption 2: Data have no meaning. Meaning is imposed through interpretation. Frames of reference, the way we see the world, influence the meaning we derive from data. Effective data users become aware of and critically examine their frames of reference and assumptions (Wellman & Lipton, 2004, pp. ix−xi). Conversely, data themselves can also be a catalyst to questioning assumptions and changing practices based on new ways of thinking.

Assumption 3: Collaborative inquiry – a process where teachers construct their understanding of student-learning problems and invent and test out solutions together through rigorous and frequent use of data and reflective dialogue – unleashes the resourcefulness and creativity to continuously improve instruction and student learning.

Assumption 4: A school culture characterized by collective responsibility for student learning, commitment to equity, and trust is the foundation for collaborative inquiry. In the absence of such a culture, schools may be unable to respond effectively to the data they have.

Assumption 5: Using data itself does not improve teaching. Improved teaching comes about when teachers implement sound teaching practices grounded in cultural proficiency – understanding and respect for their students’ cultures – and a thorough understanding of the subject matter and how to teach it, including understanding student thinking and ways of making content accessible to all students.

Assumption 6: Every member of a collaborative school community can act as a leader, dramatically impacting the quality of relationships, the school culture, and student learning.

Activity Box 5.2: Sample Walk-Through Protocol with Cues (Sunnyside USD, 2015)

Rubric Indicators with Mathematics “Look Fors”: Walk-Through Form

Goals for Mathematics Walkthroughs

  • Calibrate through formative observations of mathematics instruction in classrooms across the district

  • Build capacity for principals to provide feedback and lead mathematics improvement

Indicator: Mathematics Look Fors

Instruction 1: EEI

  • Alignment of instruction to Common Core standards

  • Shifts in instruction: fluency, concept development, application

Instruction 2: Engagement

  • Use of mathematical practices

  • Use of math talk moves

Instruction 3: Differentiation

  • Student partners/groupings

  • Menus

  • Vocabulary support (anchor charts)

  • Scaffolding prerequisite knowledge into grade level content instruction

Instruction 4: Checks for Understanding

  • Journal prompts

  • Use of white boards

  • Questioning

  • Chunking (10/2)

  • Review of skills check data with students

Management 3: Learning Environment

  • Organization of tools, computers, and materials

  • Productive and cooperative environment (e.g., cooperative learning roles, student-initiated and self-help processes such as “3 before me,” reference charts)

Guiding Questions for Classroom Debrief

  • What did you see, including examples and evidence from the indicators?

  • How does this relate to our goals for mathematics instruction (content focus/pacing for grade level, shifts in instruction, mathematical practices)?

  • What would be the priority for feedback and next steps (strength and stretch)?

  • How would you, as an instructional leader, coach the teacher to reinforce positive practices?

  • How would you, as an instructional leader, coach the teacher to grow in their use of math practices?

Guiding Questions for Summary

  • What trends did we see across the classrooms?

  • What would be the priority for next steps for the site (strength and stretch)?

  • How would you use the trends across the classrooms to inform professional learning in your building? How would you communicate that with staff?

  • Who will host the next visit (specific date and time frame)?

Teams then explored the differences between summative and formative data, looking closely at the uses and value of common formative assessments. They were provided with resources to assist them in examining a variety of formats. Teams were then expected to create a sample formative assessment for their sites and were given feedback by other teams on what they saw and how the assessment might be refined.

Another aspect of working with data as a source of reflection was inherent in the work we did with teams around Professional Learning Communities (DuFour & DuFour, 2011). The purpose of implementing professional learning communities (PLCs) is to examine and reflect on data in order to align instruction across content areas and grade levels, to provide for reteaching of concepts to students who did not grasp the material during initial instruction, and to support teachers in their own professional learning. Reflection on the data provided the impetus for action taken by the PLCs.

Finally, during the school development process, we provided information about the use of data gathered from classroom walk-through observations. We considered this concept from several angles. First, we talked about what kind of data to collect during walk-throughs and how it could best be used. For example, if teachers were asked to implement new instructional strategies in mathematics, the walk-through data collected and analyzed should be specific to that goal. Second, we asked teams to create a walk-through protocol to meet their own individual school needs, defining the specific “look fors”. An illustration of the goals, “look fors”, and questions for group reflection (as created by one district) is included in Activity Box 5.2. The last piece of this process was to walk through classrooms at a variety of campuses together, working with team members to understand what they were seeing during the observation times, and then reflecting together about the observations and their implications. It was apparent that teams grew in their understanding of how to collect data as well as what the data meant.

Lessons Learned

The following cases illustrate some of the ways in which two different schools used data for reflection in order to move forward in the continuous school development process.

Case A: Sylvester Middle School

Sylvester Middle School is located south of the metropolitan area of Tucson and is characterized as somewhat suburban, yet somewhat rural. The school serves about 700 students in grades 6 through 8; the student population is approximately 50% Hispanic and 50% Anglo, with only about 10–20 students identified as English Language Learners. The principal, Susan Sussex, indicated that it is a growing school in a growing district, with many students moving into the area from out of state. For this reason, she suggested, the school is having an identity crisis, trying to decide who they are as a school and what it means to be a student at Sylvester Middle School.

The initial data that this leadership team looked at in order to see where they were came from the Bennett Survey of Leadership Capacity (Bennett et al., 2013), administered to all staff as well as the principal. The principal explained how they used that information.

“And we decided that we would share all the data we have, all the staff, the principal survey as well as the other surveys and just give them time during the faculty meeting to just look it over, read it, look at trends, patterns, and it was very interesting to me that there were a lot of similarities in how I rated myself and how my staff rated me. There may have been a few areas where I rated myself lower than they rated me or maybe I rated myself higher than they did me, but for the most part I felt like it was a good approximation. Our perceptions are similar.”

She went on to discuss areas of concern that came out of this data examination and reflection.

“I think looking at the trusting culture among the staff - that was a huge area - that they don’t really trust; and collective efficacy was bad - they jump out at us…You know maybe a little bit of that, and then maybe just a sense of collaborative decision-making and shared leadership…it is certainly an area that we need to focus more on - using our teacher leaders, our curriculum team leaders, somewhat like department chairs in the middle school. How can we utilize them more?…Shared leaders on campus?…and then from their communication with their teams and their administration with communication, how do we solicit more important opinions and identifying ways for me to reach out for those opinions.”

Teachers from the leadership team expanded on the principal’s reflections, stating:

“One of the things that came up was some of the teachers were reading the results and they were shocked at why some of the people said it this way in some of the written responses, and why would some of them be so negative. They were surprised. Sharing is a positive thing; if you just got negative results it sounds more horrible.”

Trust in the administration emerged as an issue at this school, but by being open about the data, sharing the results of the surveys, both positive and negative, and reflecting on what the data were indicating, staff began the process of coming together to work toward a common goal.

Case B: Mary Miller Elementary School

Mary Miller Elementary School is located in a medium-sized urban school district in southern Arizona, serving students in kindergarten through sixth grade. The first elementary school in the district, it is known for serving students whose parents and grandparents also attended the school.

The state testing results released in 2019 indicate that 31% of students at the school are proficient in English/Language Arts and 26% are proficient in Mathematics. Acknowledging that these test results are just one data point from which to judge school success, the school states on its website:

“The [State] Test measures how well our students are performing in English language arts and math. [State Test] scores are just one of several measures, including report card grades, classroom performance, and feedback from teachers, that can be used to measure your child’s academic progress.”

The principal, Cathy Ignacio, was asked about things that were meaningful from the School Development trainings. Upon reflection, she offered:

“Another thing that we implemented, I think with Institute One, is we talked about how we look at the data, and that traditionally we’ve been all focused on the ground level, but we need to start taking more focus on where it was before it was that … I really enjoyed that session, because we took it back right away, and we did the activity with our staff, and talked about what we needed to really be looking at.”

She continued,

“We did data talks every week, like three grade levels at a time, on Wednesdays, and so that would either become goal setting, or we’d look at math or reading, and we’d look at strategies in the areas they were not doing as well, and what we could do for the reteaching, and what the focus was going to be. We changed our reading, as far as our comprehension questions, tailoring those to the story so that they focused more on the skills we were seeing. We were still on goals, and so that was helpful.”

One use of data for reflection was highlighted in the School Assistance Team process. Cathy explained:

“Our school assistance team, some people call it TAT, we call it SAT…one of the things we’ve constantly struggled with is making sure what that referral process is, what teachers need to do on there, and how they document, how they do interventions…we had it nicely set up so when you come to meet with us, you need to bring this data, this data, this data. You need to talk about what’s been going on, but we’re going to come back with interventions. It’s not like just because you come here we’re going to do a child study. We had I think around over 30 referrals this year, but we really only went ahead with a handful of kids to really go through the child study process because we put the interventions in place, and it was just clearer. We did some inservices with teachers on things to know about it, what you’re doing, what you need to do. It probably needs to be repeated each year so we remember, but what a difference. It was clarified, it was communicated, and it was successful with the strategies…and there’s been this change in achievement, that all of a sudden it clicked.”

Cathy and her team recognized that reflection on data was vital to the process of continuous school development and worked to embed these practices at many different levels.

Final Thoughts

Schools are rich in data, but understanding how to use the data for reflection and continuous development is often lacking. In some cases, data is interpreted solely as test scores from summative state tests; in other instances, data is seen as a rich source of material upon which to reflect and to inform decision-making. The case studies presented above illustrate only small pieces of the process, but they show that in order to use data appropriately, a culture of trust and collaboration needs to be in place. Teachers are often fearful about sharing student achievement or behavioral data because they are afraid of being judged. Thus, a trusting, collegial culture must be established in order to examine and reflect on data openly, honestly, and collaboratively.