Child & Youth Care Forum

Volume 46, Issue 3, pp 285–305

Simple Interactions: Piloting a Strengths-Based and Interaction-Based Professional Development Intervention for Out-of-School Time Programs

  • Thomas Akiva
  • Junlei Li
  • Kelly M. Martin
  • Christy Galletta Horner
  • Anne R. McNamara
Original Paper

DOI: 10.1007/s10566-016-9375-9


Abstract

Background

Adult–child relational interactions constitute an essential component of out-of-school-time programs, and training staff to interact effectively with children is key to improving program quality. Efficient staff training that fits the limited time available to out-of-school time staff is particularly needed.

Objective

This pilot study introduces Simple Interactions (SI), an innovative, strengths-based, and interaction-based professional development approach. Rather than attempting to teach generic competencies or targeting areas of weakness for improvement, SI is designed to help program staff build from their strengths.

Methods

In two cohorts over the course of 10 months, ten afterschool programs (N = 70 staff) participated in a pilot of SI. Program staff watched short video clips of themselves working with children and used the intuitive, 1-page SI Tool to guide discussion of adult–child interactions; specifically, connection (affective intune-ness), reciprocity (balanced roles of engagement), participation (involving all children), and progression (incremental challenge).

Results

Results suggest that participants valued the professional development process, the strengths-based approach, and the use of self-video despite initial apprehension, and reported perceived improvements in their professional learning communities. Pre-post videos of Cohort 2 staff (n = 20), coded blind to time point (pre or post), indicate significant and substantive improvements in staff–child connection, reciprocity, and participation.

Conclusion

These results support the use of this simple, practical, and potentially effective model of supporting quality improvement for and by local staff.

Keywords

Out-of-school time programs · Afterschool · Professional development · Intervention · Mixed-methods

Introduction

Stakeholders have debated the role that out-of-school time (OST) programs should play in communities since their emergence over 100 years ago: whether their purpose is prevention, protection, enrichment, academic remediation, or self-expression and play (Halpern 2003). This debate reflects a great diversity of program types, activities, and goals, as well as the issues of societal importance at different points in history. However, regardless of the goals of a given OST program, the relationships between adults and children or youth in these settings are of central importance. Out-of-school time programs are prime settings within a community for adult–child relationships to develop, and such relationships are identified as key factors for program effectiveness in numerous studies, reviews, and assessments (e.g., Durlak et al. 2010; Eccles and Gootman 2002; Hirsch et al. 2011; Yohalem et al. 2009). The basic building blocks of positive child–adult relationships are the daily and minute-by-minute interactions between children and staff that take place in OST programs.

The quality of child–adult interactions can be inconsistent in many OST program settings (Smith et al. 2010), potentially limiting the positive effects of participation in such programs. However, interaction quality is malleable (i.e., sensitive to training) as demonstrated by successful quality improvement initiatives (Smith et al. 2012). Targeting relational interactions is likely a productive area of focus for quality improvement efforts in OST programs, as it has been for other developmental contexts (Allen et al. 2011). As the time available for professional development tends to be low in OST contexts (Quinn 2012), professional development should be efficient to help staff make the most of these interactions.

Two types of professional development are currently common in the out-of-school time field (see Fig. 1, sections A and B). The first is by far the most common and may be called the General Training Approach. This approach involves providing opportunities for youth workers to attend professional development workshops in topic areas deemed relevant to the profession. Trainers sometimes jokingly refer to this as "spray and pray", as the goal is to provide one-time or short-term training content to whoever will listen, often without follow-up support to integrate knowledge into practice. Workshops of this type appear at national or regional conferences (usually in the form of one-time sessions lasting 1–2 h), at OST worksites, and through regional quality improvement initiatives. General training can sometimes improve program quality (Fukkink and Lont 2007). However, effects can be limited and short-lived, especially for the short-term trainings often employed in OST.
Fig. 1

Approaches to OST professional development

The second training approach, quality improvement systems (QIS), common in fields like early childhood education, has gained prominence in OST in the past two decades with the rise of communitywide approaches to improvement (Browne 2015). In both OST and early childhood education, effective QIS interventions tend to incorporate ongoing coaching and needs assessments (usually in the form of quality ratings) that drive improvement activities. QISs can be as effective as, or more effective than, general training. Experimental studies have shown that staff who engage in QIS interventions may significantly improve their observable program quality (Smith et al. 2012). This approach is relatively efficient. For example, the Youth Program Quality Intervention was estimated to take approximately 30 h per person, with effect sizes of .55 on instructional practices (Smith et al. 2012); whereas, in Fukkink and Lont's (2007) meta-analysis of experimental studies of caregiver trainings—which fit the General Training Approach—the average time spent was 59 h for similar effect sizes. The drawback of QISs is that they involve long lists of standards that can seem overwhelming and impractical in resource-strapped settings. In addition, studies have found a lack of predictive relationships between many of these complex standards and ratings and subsequent child outcomes, at least in early education and elementary school settings (Mashburn et al. 2008; Sabol et al. 2013). Such systems tend to operate on top-down assumptions, as external experts developed the measurement tools and system administrators defined quality by these measurements. Youth workers may be involved in improving quality, but not in actively defining what quality should look like in any given setting.

The bottom of Fig. 1 depicts a third type of training—the strengths-based approach—which grows from, shares, and complements some of the features of QIS but offers important contrasts as well. This approach begins by identifying existing strengths in program or staff practices rather than problematic areas. The orientation of training shifts from prescribing best practices to staff to identifying with staff the effective practices already occurring at a site. Then, with facilitation, the staff may begin to consider amplifying or "growing" these practices. Like the QIS approach, the strengths-based approach commits to longer-term, continuous engagement with staff in more of a coaching, facilitative role rather than the "stand and deliver (content)" model of the General Training Approach. Like QIS, the strengths-based approach focuses on supporting quality improvement within local contexts rather than relying on general prescriptions. Unlike QIS, which relies on comprehensive and often complex definitions and measurements of quality, the strengths-based approach engages and relies on the more intuitive assessments and judgments of the staff and site leaders themselves.

One strand of strengths-based approaches to positive change relies on the power and wisdom of the target community itself. This community-driven, strengths-based approach has parallels to positive deviance processes pioneered in the public health field (Marsh et al. 2004). Instead of "parachuting" new solutions into local settings and attempting to motivate wholesale change by edict or short-term incentive, this approach begins by identifying innovations or exemplary practices that already exist (and persist) in a local setting, albeit often in isolated or little-known instances, sometimes unbeknownst to the practitioners themselves. The aim is then to help these practices scale up using small peer groups who can discuss, understand, and disseminate the practices. Such an approach has been found to work across a variety of arenas in public health (e.g., HIV/AIDS prevention, Friedman et al. 2006; malnutrition in developing countries, Sternin et al. 1997). The approach is based on the assumption that within any community, even those with very limited resources, there are cases of exemplary practice. These practices, relying on no more resources than are available to most practitioners in the same setting, are potentially scalable in a local setting. Therefore, the job of professional development can be to identify and amplify these examples.

Participatory and/or action research is another method of professional development that is often strengths-based. An example is the Afterschool Matters Practitioner Fellowship program, in which youth workers (and sometimes teachers) engage in one- or two-year inquiry projects aimed at improving or investigating self-identified program features or staff practices (Hill et al. 2009; Walker and Walker 2012). Action-research-based professional development programs like the Practitioner Fellowship can be effective, even transformational, for participants; however, they require substantial commitment—usually a year or more of intensive involvement.

The strengths-based approach also manifests in Appreciative Inquiry, an organizational improvement process, popular in corporations and non-profits, based on systematic questioning and self-evaluation (Bushe 2011; Hammond 2013). Appreciative Inquiry aims to promote change by uncovering and enhancing the strengths within an organization, based on the idea that when a particular behavior or idea is highlighted, it will be repeated (Cooperrider and Whitney 2005). The strengths-based approach to OST professional development may be effective because it highlights staff members' adult–child interaction experiences in order to encourage staff to repeat their own and their colleagues' successes (Bushe 2011; Hammond 2013).

The strengths-based approach fits well within the OST practice field. Positive Youth Development, a predominant theory in OST, is based on the idea that all youth have strengths and focusing on youths’ inner resources, rather than their deficits, promotes developmental outcomes (Lerner et al. 2011). For example, Benson (2008) theorized that all youth have developmental assets to be harnessed and supported and Lerner (2009) argued that a positive environment can enhance the strengths of a youth to promote positive youth development. However, this approach is not generally extended to adults who work with youth. Rather, a deficit-based approach is typically used in professional development to identify and attempt to “fix” program weaknesses. In contrast, a strengths-based approach highlights positive examples already occurring among staff in order to increase these instances of productive interactions.

In this report we introduce Simple Interactions (SI), a new strengths-based professional development program for youth workers. In addition to requiring a shorter time commitment than other available trainings, this particular implementation of a strengths-based program has two key features that will be described in the sections that follow: it makes use of video recordings of everyday practice to support reflective practice, and it specifically targets staff–child interactions.

Reflective Practice Using Authentic Video

Research on andragogy, an established adult learning theory, suggests that training is most effective when it is relevant to participants, when adults’ experiences are the foundation for learning, and when it is task-oriented (Amobi and Irwin 2009; Hattie 2009; Holton et al. 2009; Kaasila and Lauriala 2010; Knowles 1973). The use of video to capture authentic practice in professional development aligns with these principles and has been found to be an effective training tool across multiple disciplines, including health care, early childhood, social work, and education (Borko et al. 2008; Scherer et al. 2003; van Vonderen et al. 2010).

In a strengths-based approach, watching strategically selected video clips of everyday practice provides tangible examples of afterschool staffs’ experiences working with children (Borko et al. 2008). Video clips provide an “artifact of practice” that allow educators to take a step back and notice details that they might otherwise miss in the moment of direct service (Borko et al. 2008, p. 418; Seidel et al. 2011). Examining relevant instances of child–adult interactions may influence the beliefs and practices of afterschool staff by providing a space for reflection and analysis that ultimately may increase confidence and inspire change in professionals’ behavior (Fisher and Wood 2012).

Watching video of oneself or proximal peers (as opposed to videos of strangers demonstrating "best practice") seems to be especially effective, as it can be directly applied to staff members' everyday experiences with children (Amobi and Irwin 2009; Kaasila and Lauriala 2010; Seidel et al. 2011). Teachers participating in professional development programs that utilize self-video report that watching videos was the "most valuable aspect of their participation in the program" (Borko et al. 2008, p. 434). Research on microteaching, a practice in which educators analyze videos of themselves teaching small-group mini-lessons, highlights the success of using self-videos (Hattie 2009, p. 112). A synthesis of four meta-analyses of 402 studies showed that microteaching had a strong effect on teachers (d = .88) that did not decrease significantly over time (Hattie 2009). Watching self-videos allows educators to call upon their own experiences, activating prior teaching knowledge and potentially increasing motivation and engagement (Seidel et al. 2011).

Finally, the use of inexpensive camcorders makes a strengths-based professional development approach accessible and sustainable in out-of-school time programs. Recent advances in video technology put economical, sufficient-quality recording devices within reach of ordinary child and youth programs. Staff members can use these hand-held cameras (or even their own smartphones) to collect video clips of their colleagues, independent of an outside trainer.

A Learning Process Focused on Simple Interactions

Simple Interactions (SI) combines a strengths-based approach and the use of video into a simple but sustainable process for professional development, specifically focused on adult–child interactions. The approach and its associated learning tool (Li 2014) were developed based on a theoretical framework for understanding "developmental relationships" across settings (Li and Julian 2012). Li and colleagues adapted the approach to suit a variety of field settings, including orphanages, childcare centers, classrooms, and out-of-school time programs. In spite of the diverse contexts, a fairly consistent adult learning process emerged in these early explorations. The learning process begins with the collection of short video clips of staff interacting with children during a typical day at their after-school program, irrespective of the particular activities they are engaged in—whether teaching, playing, or transitioning. One visit with two people collecting video usually yields ample footage for a training workshop—typically 5–10 short clips per staff member. The next step in the process involves selecting clips that show successful interactions—at least one clip for each participating staff member. These clips then become the raw material for learning in professional development workshops, usually lasting just over an hour. In a group, staff participants watch a given clip and are led through a strengths-based discussion. They are asked to note and discuss the positive aspects of interaction displayed in the clips, using a simple one-page tool to guide the conversation. This tool (discussed below) also serves as the guide during the clip-selection step, ensuring that the videos viewed during the workshop provide relevant strengths-based content to discuss. After the first workshop, program staff or directors may collect the video clips to be used in subsequent workshops. The overall intervention consists of a few such cycles of video collection, selection, and reflection.

The SI Tool (SIT; see Fig. 2), along with its associated professional development workshops, highlights four dimensions of quality interaction. Two of the dimensions—connection and reciprocity—are based on Bronfenbrenner's (1979) theoretical framework of "developmental dyads." Connection is marked by mutual social and emotional intune-ness, generally positive, between the adult and children. Reciprocity refers to the balanced interaction between adult and children, with neither dominating, exhibiting an overall "serve and return" type of exchange (National Scientific Council on the Developing Child 2004). A third dimension, opportunity for progression, is modeled after Vygotsky's (1978) description of the Zone of Proximal Development and is observed as the availability of appropriate challenge and corresponding scaffolding that makes learning and growth possible. Across developmental contexts, strong empirical evidence suggests that the presence of these universal constructs leads to program quality and positive developmental outcomes (reviewed and summarized in Li and Julian 2012). In addition to these three dimensions, we added participation—inclusion of members of the group in joint activities—to draw attention to the practical importance of "spreading" high quality interactions to all children who might benefit, not just those who are most able and most reciprocal. These four dimensions are found in most quality assessments of settings (Yohalem et al. 2009). What is distinct about the SI tool is that it focuses only on these dimensions, and the extremely simplified tool is maximally accessible to a wide range of staff.
Fig. 2

Level 5 indicators from the Simple Interactions Tool. Note This figure contains only the “Z” or level 5 indicators for illustrative purposes. The complete Simple Interactions Tool (SIT) is freely available at [website]. The tool is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License

The goal of the SI workshop is to facilitate staff reflection and conversation around these dimensions in relation to their actual practice. Early in the workshop, the facilitator merely asks the staff what they see and notice in the video clips, without any specific framing or prompting. Because these clips involve local staff within a local staff community, the comments tend to be complimentary, with staff noting what they saw and liked. The facilitator may prompt the staff to articulate why they noticed or liked certain details in the interaction, which allows staff to articulate in their own terms what they perceive to be important in child–staff interactions. Often, these comments naturally relate to the dimensions of the tool. The important distinction between this quality improvement discussion and traditional quality assessment is that staff are not asked to become "assessors" who can reliably break down complex interactions into discrete components (e.g., smile, tone of voice). Rather, staff are asked to share an intuitive interpretation of the "whole" interaction based on their own perspectives and experiences. In this as well as in subsequent discussions, staff are not required or expected to arrive at a "reliable" conclusion regarding any particular interaction.

Building on staff's existing understanding of what constitutes positive child–adult interactions, the facilitator guides participating staff through the use of the SIT, which uses four sets of simple diagrams representing typical modes of interaction along the four dimensions. While reflecting on additional footage, staff may individually determine which mode(s) they noticed in the clip, and through pair-and-share, small-group, and whole-group discussions, staff describe their reasons for noticing one or another mode of interaction. The facilitator makes it clear that the goal of this discussion is neither to judge any particular staff member nor to arrive at one correct answer, but to allow each staff member to see a particular interaction from others' perspectives. This level of discussion again focuses on positive, strengths-based elements, but more rigorously pushes the conversation from "we think this practice works here" to "here is why we think this practice works here." The approach does not assume that an outside facilitator knows which practices are effective and feasible in a local setting, but aims to help local staff become more intentional about the practices they are already using. The process may eventually prompt and support staff to discuss current weaknesses in practice and programming, but the primary focus from the start is on what might be working already.

The Current Study

This study evaluates the pilot of the SI professional development intervention. To organize analyses we used Kirkpatrick's (1998) four-level training evaluation model, which has been the dominant approach in corporate training for many decades (Nickols 2005). More specifically, we used an application of the Kirkpatrick model to the OST professional development field recommended by the Harvard Family Research Project (Bouffard and Little 2004). This model provides a suitable framework for addressing the multiple levels involved in professional development. In this study, we address the first three levels (reaction, learning, and behavior) of this model. The fourth level, "results", typically refers to financial success; though we might reframe it as child learning gains, it is beyond the scope of this study.

Our primary research questions are shown in Table 1 with their related data sources. Aligned with these research questions, we propose three hypotheses. Based on previous studies of the use of self-videos in professional development, we hypothesize that (1) participants will react positively to the SI training and rate it as enjoyable and useful to their practice. We also expect that during PD workshops staff will learn about adult–child interactions and thereby shift their beliefs about their abilities to deliver high quality interaction. Specifically, we hypothesize that (2) youth work relational self-efficacy—or perceptions of one’s own effectiveness working with children and youth—will increase. Finally, based on the aims of the intervention and our experience in pre-piloting, staff should improve their interaction practices with children. Specifically, we expect to (3) see improvements in the areas targeted by the professional development (connection, reciprocity, progression, and participation) and increases in the ratio of time that staff spend in the program directly interacting with children.
Table 1

Research questions and measures aligned with Kirkpatrick's model

Level: Reaction
Research question: 1. Do participants find the SI training enjoyable and useful?
Data sources: single-wave questionnaire (session evaluations; post-intervention evaluations); focus groups/interviews

Level: Learning
Research question: 2. Do staff beliefs change?
Data sources: pre/post questionnaire (Youth Work Relational Efficacy); focus groups/interviews

Level: Behavior
Research question: 3. Does participation in SI improve the quality of adult–child interactions in youth programs? For whom is it most effective?
Data sources: pre/post video clips (SIT Total Score; time interacting)

Methods

Sample

Beginning in September of 2013, the SI team provided professional development training in two cohorts to 70 individuals working with children and youth in after school program settings. Of those who participated, 49 were adult staff members and 21 were teen participants in leadership roles at the programs (these youth participated in only some forms of data collection, as noted below). The 10 study sites were affiliated with five organizations in one large U.S. city in the Midwest. All were located in areas with high levels of poverty, and the 5 sites in the second cohort were located in urban subsidized housing communities. Recruitment occurred at the sites (starting in August for cohort 1 and November for cohort 2); the research staff invited all staff members and youth leaders to participate. We obtained informed consent from all participants, and the Institutional Review Board at the [NAME OF UNIVERSITY] approved this research.

Participating adult staff members ranged in age from 19 to 70, with a mean age of 35 years (SD = 12.6). Highest education levels reported were 4 % a graduate degree, 27 % a bachelor's degree, 11 % a professional certificate, 20 % an associate's degree, 22 % some college credits, and 16 % a high school diploma or GED. Of those who chose to report race/ethnicity (about 94 %), 78 % identified as Black or African American, 18 % as White or Caucasian, and 4 % as multiracial; one participant also identified as Hispanic. Many staff members lived in or near the communities in which the program sites were located, with an average commute of 3.7 miles. The majority (55 %) reported working at the program site full time (30–40 h per week), 38 % part time (15–29 h per week), and 4 % occasionally. No staff members reported working on a volunteer basis. Thirty percent reported having held their position for 5 or more years, while 20 % reported having taken on their role within the past year; the other half had spent between 1 and 5 years in their position. About 40 % reported previous work experience in a similar youth work position.

Procedures

The pilot occurred in two cohorts over the course of 10 months. Each Cohort 1 site (n = 46) participated in two SI professional development workshops, and each Cohort 2 site (n = 24) participated in three workshops. All data were collected at the program sites. Video data (separate from the video used in the intervention) were collected prior to and after the set of workshops. Adult staff completed questionnaires prior to their first workshop as well as following their final workshop. In addition, participants completed short evaluations after each workshop session.

Workshop facilitators included the study PI, the project manager, and a trainer. Facilitators prepared for this role via consultation, iterative co-creation of workshop agendas, and cycles of practice, observation, feedback, and reflection during the pre-pilot phase; this ultimately led to the creation of a “how-to” manual for SI facilitation. Separate workshops were held at each of the program sites so that staff attended with their peers. Facilitators led workshops in pairs or were assisted by graduate student members of the research team. Prior to each workshop, short video clips of staff members interacting with children at the program were recorded for use in the workshops. Responsibility for collecting these clips was gradually passed from research team to program staff.

Each SI workshop typically lasted 60–90 min and began with an opening activity related to interactions. We opened workshops with individual and group reflection prompts (e.g., "How do you know when you've had a good interaction with a child?"). The groups then watched several short (1–3 min) video clips of themselves and their peers, followed by open discussion. Facilitators encouraged the staff member(s) appearing in the video to speak first and talk about what they remembered about the interaction shown. Then, others were invited to describe the interactions they saw ("What did you notice?"), and facilitators attempted to maintain a focus on the strengths present in the videos. Leaders gradually introduced the SIT as a conversation guide. Each workshop ended with a short closing activity (e.g., tossing a ball and sharing a "take-away" learning). Program staff who attended all workshops spent approximately 4.5 h engaged in the intervention and received a $30 pre-paid credit card (as a token related to the research, not for attending the training). The lead author takes responsibility for the integrity of the data and analyses in this study.

Measures

Video Clips

Research team members collected 5-min video clips for staff members in Cohort 2 (n = 20), the cohort for which the training procedures were finalized. We video recorded each staff member once before the intervention and once after. Trained coders who were blind to time point assigned scores for each of the SI domains as well as the time (in seconds) that the adult in the clip spent interacting with at least one youth. The intraclass correlation coefficient (using a two-way mixed consistency model) for the SIT Total score was acceptable, with an ICC(1,4) of .70. Interrater reliability for individual items was expectedly lower, with moderate reliability for three items (Connection ICC[1,4] = .55, Reciprocity ICC[1,4] = .56, Participation ICC[1,4] = .68) and poor reliability for Progression (ICC[3,8] = .32). These numbers suggest that the SIT Total score can be considered reliable but that the individual item scores, particularly for Progression, should be interpreted with caution (Landis and Koch 1977; Portney and Watkins 2000; Shrout and Fleiss 1979). The SIT demonstrated good internal consistency (α = .80).
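For readers unfamiliar with these reliability coefficients, the average-measures consistency ICC of the kind cited here (Shrout and Fleiss 1979) can be computed from a two-way ANOVA decomposition. Below is a minimal sketch in Python, assuming a fully crossed design in which every coder rates every clip; the data, coder count, and function name are hypothetical illustrations, not the study's actual analysis (which used SPSS).

```python
import numpy as np

def icc_consistency_avg(ratings):
    """Average-measures consistency ICC from a two-way ANOVA
    decomposition (Shrout & Fleiss 1979), for a fully crossed
    design: ratings is an (n_targets, k_raters) array."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)            # per-clip means
    col_means = ratings.mean(axis=0)            # per-coder means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols     # residual SS
    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / ms_rows

# Hypothetical data: 8 clips, each scored on a 1-5 scale by 4 coders.
rng = np.random.default_rng(0)
true_quality = rng.integers(1, 6, size=(8, 1)).astype(float)
ratings = true_quality + rng.normal(0, 0.5, size=(8, 4))
print(round(icc_consistency_avg(ratings), 2))
```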

Questionnaires

Adults in both pilot cohorts completed paper-and-pencil questionnaires. The pre-post questionnaires took 10–15 min to complete, and session evaluations were typically completed in 5 min or less. All questionnaires were completed at the program sites where the trainings were conducted. We created the measures described below because we were unable to find existing scales that addressed the specific aims of this research.

Demographic Characteristics

Participants self-reported their race/ethnicity, age, miles from residence to program site, work status (full time/part time/occasional), years of previous youth work experience, and highest education level obtained.

Session Evaluations

This five-item Likert-type scale (1–5, strongly disagree to strongly agree) was designed to assess attitudes toward individual PD sessions upon their completion (e.g., usefulness, enjoyment). A sample item is "This workshop was of high quality." Exploratory factor analysis with these items suggested a single factor, and reliability was acceptable (α = .77).

Post-Intervention Evaluation

After participation in the intervention, 13 total items on a Likert scale (1–5, strongly disagree to strongly agree) were used to assess attitudes about the SI process. Exploratory factor analysis suggested two subscales: Process (α = .93; e.g., "Attending trainings was enjoyable") and Results (α = .85; e.g., "I think about things differently after participating in SI."). Additionally, 7 items on a Likert scale (1–5) addressed attitude toward the Simple Interactions Tool (e.g., "[The Simple Interactions Tool] captures what is important about interactions between adults and youth."). Exploratory factor analysis suggested one factor, with excellent internal consistency (α = .92).
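The internal-consistency coefficients reported throughout this section have a simple closed form: Cronbach's alpha equals k/(k−1) times one minus the ratio of summed item variances to the variance of scale totals. A minimal sketch follows, using hypothetical response data rather than the study's questionnaires (the study's analyses were run in SPSS).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / total variance)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)   # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical responses: 38 participants x 13 Likert items (1-5).
rng = np.random.default_rng(1)
base = rng.integers(1, 6, size=(38, 1))         # each person's typical response
noise = rng.integers(-1, 2, size=(38, 13))      # item-level wobble
responses = np.clip(base + noise, 1, 5).astype(float)
print(round(cronbach_alpha(responses), 2))
```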

Youth Work Relational Self-Efficacy

Seven Likert-type items on a 1–5 scale asked participants to rate their own level of ability (not well at all, fairly well, moderately well, very well, extremely well) in response to questions like "[how well can you] interact respectfully with children and youth?" (α = .89). These items were administered before and after implementation of the intervention.

Focus Groups and Interviews

Leader Interview/Focus Group Protocol

Leaders (e.g., program directors, site supervisors; n = 7) participated in either interviews or focus groups to provide feedback about the intervention. This included four individual interviews and one focus group with three participants. Protocols were semi-structured and included the topics of the role of the SI approach in working toward program goals, the alignment of the SI domains with beliefs about practice, and directors' own reactions as well as their perceptions of staff members' reactions to the workshops. Questions included, for example: "What changes, if any, have you noticed in staff?" and "Has the intervention changed or deepened your thinking about the types of adult–youth interactions that are beneficial for youth?"

Staff Focus Group Protocol

A subsample of staff members voluntarily attended focus groups to discuss topics parallel to those explored with the directors (n = 26 across 7 focus groups). The number of participants in each focus group ranged from 2 to 7 staff members. Additional emphasis was placed on discussing the staff members' experiences in the SI workshops (e.g., "What was it like to see yourself on video?") and applications to practice (e.g., "What changes, if any, have you noticed in your thinking and/or your practice? In others' practice?").

Analysis Plan

We took a sequential, explanatory, mixed-methods approach to understanding participants' SI professional development experience (see Creswell et al. 2003). This approach began with and emphasized quantitative methods, then included qualitative analyses of focus groups and interviews to deepen our understanding of participants' reactions and learning processes (i.e., "QUAN → qual" in Morse 1991). The aim of the qualitative analyses was to triangulate and complement the quantitative results (Greene et al. 1989), for example, by capturing participants' explanations of mechanisms of change.

To gauge how much participants enjoyed and valued their SI experience (reaction), we examined descriptive statistics for Session Evaluations (for each workshop), SI Evaluations (for the whole intervention), and an evaluation of the SIT. We also analyzed relevant portions of focus group and individual interview transcripts (described below). For learning, we used paired-samples t-tests to compare baseline and post-intervention youth work relational self-efficacy and examined qualitative data. We examined behavior change in two ways, with the individual as the unit of analysis. First, a multivariate repeated measures analysis of variance in which the four SI items (connection, reciprocity, progression, and participation) were within-subject factors allowed us to examine overall pre-to-post changes as well as differences for individual SI items with a single test. We also computed the effect size of this difference with Hedges' g, using an adjustment for small samples recommended in the What Works Clearinghouse Procedures and Standards Handbook 3.0 (2014). Second, we conducted paired-samples t-tests to compare time spent interacting at pretest and posttest. We performed all quantitative analyses using IBM SPSS version 22.
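As a worked sketch of the effect-size and difference tests just described: the small-sample correction multiplies the standardized mean difference by a factor close to 1 − 3/(4N − 9), which is one common form of the WWC adjustment. The snippet below illustrates the computation on hypothetical pre/post scores; it is not the study's actual analysis code (which used SPSS 22).

```python
import numpy as np
from scipy import stats

def hedges_g(pre, post):
    """Standardized pre-post mean difference with a small-sample
    correction factor, omega = 1 - 3/(4N - 9) (one common form)."""
    n1, n2 = len(pre), len(post)
    sd_pooled = np.sqrt(((n1 - 1) * pre.var(ddof=1) +
                         (n2 - 1) * post.var(ddof=1)) / (n1 + n2 - 2))
    omega = 1 - 3 / (4 * (n1 + n2) - 9)
    return omega * (post.mean() - pre.mean()) / sd_pooled

# Hypothetical SIT scores for n = 20 staff at pretest and posttest.
rng = np.random.default_rng(2)
pre = np.clip(rng.normal(2.85, 0.75, 20), 1, 5)
post = np.clip(pre + rng.normal(0.7, 0.5, 20), 1, 5)

print("g =", round(hedges_g(pre, post), 2))
t, p = stats.ttest_rel(post, pre)               # paired-samples t-test
print(f"t = {t:.2f}, p = {p:.3f}")
```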

For qualitative analyses, we coded and analyzed focus group and interview transcripts using Dedoose web-based analysis software. Three members of the research team engaged in iterative cycles of coding (Saldana 2009). Each team member first conducted independent open coding, and then the team collaboratively developed and refined a codebook. Examples of codes include behavior change (discussion of changes in behavior related to SI), the tool (discussion about the SIT), video (discussion about being video recorded or observing video recordings of self/others), and assessment of SI (statements about the SI process of an evaluative nature). The team established sufficient reliability using Dedoose's test function; initial Pooled Kappa scores (De Vries et al. 2008) were .61 and .74. The team refined several code definitions, performed an additional reliability test, and achieved acceptable Pooled Kappa scores of .88 and .86 (Cohen 1960; Landis and Koch 1977). Upon achieving this level of reliability, the three coders divided the transcripts and engaged in focused coding.
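Pooled kappa aggregates agreement across an entire code set before forming the kappa ratio. The sketch below assumes one common reading of De Vries et al. (2008)—averaging observed and chance agreement across binary code-application matrices—and uses hypothetical double-coded data rather than the study's transcripts; it is an illustration, not the Dedoose implementation.

```python
import numpy as np

def pooled_kappa(coder_a, coder_b):
    """Pooled kappa over binary code applications: observed and
    chance agreement are averaged across codes before the kappa
    ratio is formed. coder_a, coder_b: (n_excerpts, n_codes) 0/1."""
    po, pe = [], []
    for j in range(coder_a.shape[1]):
        a, b = coder_a[:, j], coder_b[:, j]
        po.append((a == b).mean())                   # observed agreement
        pa, pb = a.mean(), b.mean()                  # marginal "applied" rates
        pe.append(pa * pb + (1 - pa) * (1 - pb))     # chance agreement
    po_bar, pe_bar = np.mean(po), np.mean(pe)
    return (po_bar - pe_bar) / (1 - pe_bar)

# Hypothetical double-coded data: 50 excerpts x 4 codes.
rng = np.random.default_rng(3)
a = rng.integers(0, 2, size=(50, 4))
b = np.where(rng.random((50, 4)) < 0.9, a, 1 - a)    # ~90% raw agreement
print(round(pooled_kappa(a, b), 2))
```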

Results

Reaction

Ratings of the individual workshop sessions and of the entire SI process indicated that participants enjoyed their involvement in SI and found it to be a useful and worthwhile investment of time. For example, 93 % agreed or strongly agreed with the statement “I would recommend SI to a colleague” and 88 % with the statement “Participating in SI was useful to me”. Reactions to the SIT were also favorable (see Table 2). In addition, the process and results subscales from the post-intervention evaluation indicated that, on average, participants agreed that the SI process was useful and that they learned as a result of the workshops.
Table 2

Unadjusted measures before and after SI

Reaction (single administration)
 Session Evaluations
  Workshop #1: n = 39, M = 4.75 (SD = .33)
  Workshop #2: n = 41, M = 4.68 (SD = .37)
  Workshop #3: n = 19, M = 4.63 (SD = .39)
 SI evaluation, process: n = 39, M = 4.37 (SD = .56)
 SI evaluation, results: n = 38, M = 3.82 (SD = .61)
 SIT evaluation: n = 18, M = 4.35 (SD = .53)

Learning
 Youth work relational self-efficacy: before SI n = 28, M = 4.42 (SD = .42); after SI n = 28, M = 4.40 (SD = .49); F = .01, p = .95

Behavior
 SIT (coded video)
  Total score: before SI n = 20, M = 2.85 (SD = .75); after SI n = 20, M = 3.54 (SD = .41); F = 4.62, p = .02
  Connection: before SI n = 20, M = 3.30 (SD = .92); after SI n = 20, M = 4.05 (SD = .71); F = 5.38, p = .03
  Reciprocity: before SI n = 20, M = 2.88 (SD = 1.05); after SI n = 20, M = 3.75 (SD = .64); F = 7.43, p = .01
  Participation: before SI n = 20, M = 3.10 (SD = .88); after SI n = 20, M = 4.23 (SD = .57); F = 16.82, p = .001
  Progression: before SI n = 20, M = 2.13 (SD = .93); after SI n = 20, M = 2.15 (SD = .76); F = .73, p = .40
 Time interacting: before SI n = 20, 470 min; after SI n = 20, 477 min

Note: For youth work relational self-efficacy and each item of the SIT, the F and p values presented are from repeated measures ANOVAs. For the Total Score, the difference test presented is from a multivariate repeated measures ANOVA with items as within-subject factors.

Analysis of interviews and focus groups with leaders revealed that they found SI to be an appealing and effective approach to professional development and that they planned to continue using SI in the future. Leaders commonly reported the belief that targeting relationships is foundational and supportive of multiple goals. For example, when asked whether placing a strong focus on improving adult–youth interactions (i.e., participating in SI) competed with other goals (e.g., support for academic or religious development), leaders said things like:

In all of those things that we do with kid there’s interaction…so no I think it definitely doesn’t compete. I think it complements what we do because again we go about our day, our week, doing what we do with kids. And I don’t think we often get an opportunity to step back and look at what we’re doing.

Another leader echoed this perspective that interaction is pervasive in youth work and emphasized the foundational role that adult–youth relationships play in supporting the full gamut of program goals, saying:

The relationship has to be in place in order to meet [other] goals. If you’re not showing respect to [children], and they feel like you don’t like them, or if you’re always talking down to them, if they are ignoring you, if it’s one-sided, if it’s not reciprocal, then we can’t really help them with their academics or with their social skills.

Similar reflections about the importance of developmental interactions were expressed across interviews and focus groups, and not a single leader indicated that they felt that the SI focus on relationships detracted from their ability to focus on other program goals.

Many staff members experienced initial apprehension when faced with the idea of watching videos of themselves, but then valued the experiences and found the videos to be effective tools for reflection. Some participants reported that they “still felt nervous” or “weird” about watching themselves on video, but others explained that their comfort level increased after the first experience; for example, “it was comfortable the second time because we knew what to expect.” However, even those participants who expressed some level of ongoing nervousness emphasized that the experience was positive and beneficial.

Finally, participants often commented on the positive focus they perceived in the workshops; for example, one staff member explained the importance of the strengths-based approach:

I think when you guys first came in, you said you were looking for like positive interactions and to remind us I think before we watched the video to say these are good things, you don’t need to defend yourself. Automatically that’s what you think. First I look like crap on video, I sound weird [laughing], and then what did I screw up in this clip? To have the reminder this is good, find the good, don’t focus on what you could have done differently.

This strengths-based aspect of the structure, in the context of discussions of video, was generally well received by participants.

Learning

Youth work relational self-efficacy did not significantly change from baseline (4.42) to post (4.40); t[27] = .24, p = .815 (see Table 2). The variable, however, exhibits negative skew and low variance, potentially limiting its sensitivity to change. In contrast, during focus groups and individual interviews, staff members generally reported that they understood and could apply the SI domain concepts to practice. During focus groups, participants said things like:
Interviewee 2: Well, I think…we already had it. That just made us realize it-

Interviewee 1: What these are.

Interviewee 2: Yeah.

Interviewer: Gotcha. Explain what you mean by that.

Interviewee 2: The little stuff we do, like helping kids with homework and stuff like that. I mean, it's already stuff we was doing, but we never knew-

Interviewee 1: Knew these terms.

Interviewee 2: Yeah. The terms for it.

Unlike with the other three SI domains, however, some participants described having trouble understanding and applying progression. Some staff members found the pictorial representations confusing; one noted, "I had to figure it out…I took [the figure to mean that] these hands are helping them and [the facilitators] had to break it down for me." Being able to identify progression in short video clips was also a challenge for some participants. One said, "You really couldn't see progression in like one to two videos. If you would see it, you'd have the whole year to see where a kid first started until like his middle point until they finish a project."

Focus group participants described two major areas through which learning may have occurred; the first was through viewing and discussing the video clips. For example, one participant described her experience viewing her own clip during a workshop:

I really liked that we get to watch it and they ask us about it, because just watching it, you’re like okay, you see it. You take it in. And then when they ask you questions, it’s like you sort of piece it together and really think about what you were doing, why you did it, and how you can improve and things like that. So it was really helpful for me.

The second theme we identified was the perceived importance of tapping into and/or strengthening professional learning communities. For example, one participant said “it actually helps you like—when you learn something together, then you’re in together” and another commented that:

The great thing about this is that we feed off each other so seeing the teachers do what they did in their classrooms—because we don’t have time to actually go in and observe each other’s classrooms—but to have the camera in there and actually look at what’s going on and each person’s classroom you start to get a better understanding…

This staff member and others described how the videos provided a novel way to benefit from observing and discussing practice with their peers. For example, one participant explained:

I would say it’s really different, because I mean we watch them anyway at work and stuff, but I mean, getting their response to like what they were doing and having them explain it helped me see it differently in a way because it’s like the way they see it is different than the way I would see it. They explained like why they did certain things and stuff like that.

Rather than being redundant with daily interactions with colleagues at work, therefore, participants described ways that watching and talking about the videos during SI workshops afforded valuable opportunities for collaborative reflection and learning that they did not otherwise typically have.

Behavior

The results of the repeated measures MANOVA showed statistically significant improvement in the SIT Total scale, with a raw pre-to-post difference of .69 (see Fig. 3). This difference has a large effect size of 1.14 (Hedges' g). We also saw significant gains in three out of four domains of developmental interactions—all but progression—with the following differences: connection, .75; reciprocity, .88; participation, 1.13; progression, .03 (ns). On the 5-point response scale of the tool, scores moved from about 3 to 4 in connection, reciprocity, and participation; progression scores started and stayed at about 2. By program site (five sites with 3–5 staff per program), four programs' Total SIT scores increased (range .41–.98) and one decreased slightly (−.31). By individual staff member, two staff had scores that went down more than .5; seven had scores that stayed roughly the same (+ or −.5), and exactly half (n = 10) had scores that increased more than .5. We detected no significant change in the proportion of time spent interacting (73 % of the time at baseline vs. 80 % at post).
Fig. 3

Observed SIT scores before and after the SI intervention: estimated marginal means

Discussion

Findings from this pilot evaluation of the SI professional development program suggest that staff found it enjoyable and useful (reaction) and that participating corresponded with substantial improvements in staff–child interaction (behavior), although the research design did not allow for causal inference. The large effect size (Hedges' g = 1.14) compares favorably with other caregiver professional development studies; Fukkink and Lont's (2007) meta-analysis found that the aggregation of results from 15 studies yielded a medium effect size of g = .45.

We were not able to identify quantitatively the mechanisms through which behavior change may have occurred (learning), but it did not appear to operate through youth work relational self-efficacy. The lack of change in relational self-efficacy may reflect response-shift bias, in which individuals' ideas about a construct change after they learn more about that construct (Howard 1980). In this study, participants' ratings of their youth work relational self-efficacy may have started at a high level prior to the intervention and then decreased (or remained constant) after workshop discussions of criteria with which to assess their own relationships with children and youth. Improvements also did not appear to involve increasing the proportion of time spent interacting but rather improving the quality of interactions. Qualitative data suggest that participants tend to feel initial apprehension about seeing themselves on video but that this quickly fades, and staff come to see the video as integral and useful. Staff also appreciated the strengths-based aspect and saw it not as overly positive but rather as a way to create a safe space for critical discussion. Finally, staff indicated that participating improved their professional learning community.

The simplicity and efficiency of the SI approach should not be overlooked. Staff attended at most three 75- to 90-min workshops, spending a total of approximately 4.5 h engaged in this intervention. If the observed gains in SIT scores were due to the intervention, the effects were far larger than those of comparable interventions, which require considerably more time and cost. Part of this efficiency is achieved by focusing on a single practice—relational interaction. The intervention may also be efficient because this practice is intuitive: relational interaction is a basic aspect of human life that participants have encountered throughout their lives, in youth work and beyond. It may be that SI does not teach anything new but rather awakens and strengthens practices that youth workers already possess.

The use of short video clips is a particularly promising feature of SI. This aligns with previous research suggesting that watching videos, especially of oneself, can be an effective professional development tool (Amobi and Irwin 2009; Borko et al. 2008; Hattie 2009; Kaasila and Lauriala 2010). Participants noted that, while uncomfortable at first, seeing themselves on screen helped them consider their practice in a way that was targeted, relevant, and engaging. The novelty of watching very short clips (rather than a full classroom lesson, as is common in teacher education) makes for "bite-sized", targeted discussions. The availability of inexpensive handheld cameras also makes it feasible for OST sites to continue the SI process of collecting and viewing clips with colleagues after the formal intervention has ended.

Participants described positive changes in their interactions with colleagues. They noted that SI improved the conversations they had with staff and supervisors—a mechanism we did not anticipate and therefore did not collect data to investigate in more detail. Research indicates that the professional learning community may be a “key factor” in the professional development of educators (Kaasila and Lauriala 2010, p. 1). The collaborative structure of SI allowed staff to observe one another, connecting them in ways that were previously difficult, and sometimes impossible. During workshops, staff were able to watch the positive practice of their peers and often sought to emulate these in their own interactions with children.

Perhaps due to the use of video and the professional learning community supported by the SI program, results from this study indicate that the strengths-based and interactions-based approach may, in fact, lead to behavior change; we saw significant positive changes in the quality of afterschool staff's connection, reciprocity, and participation. As prior research suggests, highlighting the strengths of a particular behavior may amplify good practices (Cooperrider and Whitney 2005; Marsh et al. 2004). Throughout SI, staff viewed clips of themselves interacting with children and then discussed the positive actions that occurred. Connection, reciprocity, and participation may be especially relevant to afterschool staff, who often devote a large portion of their time to hanging out or playing with children (Mahoney et al. 2007). Thus, it is possible that much of the workshop time was spent discussing the three domains of interaction that improved the most.

Participants did not show improvement in progression; this item also had lower inter-rater reliability and lower scores than the other dimensions at both pre and post. As shown in Fig. 2, progression is likely the most complex idea in the SIT—indeed, the image itself is more complex than the other three images. These findings have encouraged us to consider how to better introduce this concept to afterschool staff in the future, since they may not have the experience or the vocabulary (e.g., "scaffolding") to grasp it as easily as professionals, such as teachers, who have been trained in this topic. For example, a 90-min training session could focus specifically on progression, with clips that highlight this item.

The findings also highlight the potential of using SI to support not only individual staff growth but also organizational and program development. Regardless of how effective staff–child interactions may be, certain aspects of such interactions may be enabled or constrained by the resources and limits (materials, design, or policy) inherent in the infrastructure of the organization or program. A reasonable and logical progression of the work (not within the scope of this study) would be a joint discussion by local staff and leadership about why something like progression did not show improvement the way other dimensions did, and what actions may be worth experimenting with, both at the staff level and at the organizational level.

Although the scope of this study was appropriate for a pilot of a new intervention, the research has significant limitations. Without a control group, it is impossible to determine whether the observed changes were indeed due to participation in the intervention. Relatedly, the small, relatively homogeneous sample limits generalizability. In addition, the current study had relatively weak measures available for addressing Level 2 ("learning") of the Kirkpatrick model. The relational self-efficacy measure provided a proxy for this aspect, but self-efficacy is quite different from learning. Overall, further research is needed to more rigorously investigate the potential impact and mechanisms of SI in OST settings.

Conclusion

SI represents a way of doing professional development that features important differences from existing models—both traditional training and quality improvement systems. A large number of existing professional development programs are explicitly or implicitly focused on identifying "deficits" or "low quality" targets for intervention. Many others rely on prescribing "best practice" ideas to staff who are presumed to lack familiarity or fluency with such practices. In SI, by contrast, the use of short video clips of staff–child interactions with a strengths-based protocol allows conversations about local practices to occur in a positive, practical, peer-driven way that does not rely on top-down or outside-in notions of best practice.

SI does, however, provide a simplified, theory-based, and practice-informed framework and tool (Fig. 2), based on broad dimensions of effective adult–child interactions, to guide the reflective process. The framework is not prescriptive or domain-specific; that is, it does not dictate what a staff member should do in a particular situation. Rather, the framework can be used to facilitate self-assessment by the staff (e.g., "We have pretty strong 'connections' with our youths") and to point to principled directions for improvement (e.g., "We have an opportunity to increase 'reciprocity' in this program activity"). These discussions and reflections become the means by which staff intuition may grow into intention. In short, SI empowers staff to identify local strengths, amplify them, and extend them to address remaining challenges. This fundamental shift from top-down to bottom-up quality improvement suggests an innovative way of doing professional development that may have far-reaching implications. The promising results from this pilot study suggest that the SI professional development program may provide a practical and powerful strategy for improving the interactions that OST professionals have with children and youth.

The SI approach is grounded in developmental theory and evidence (Li and Julian 2012) and is similar to but simpler to carry out than other promising interactions-based quality interventions (Allen et al. 2011). In practice, SI has been implemented across a diverse range of low-resource developmental settings, including urban schools, childcare, community programs for special needs children, and residential institutions (e.g., orphanages). This pilot study represents the first effort to systematically investigate this professional development approach.

Video-based approaches to professional development are gaining in popularity, particularly in K-12 classrooms. These approaches tend to involve video lesson studies amongst teacher peer groups (Stigler and Hiebert 1999), coaching of teaching footage, often web-mediated (e.g., My Teaching Partner; Allen et al. 2011), or video-based illustrations of best practices (e.g., teachingchannel.org). The approach piloted in this study is similar to those in that we also find authentic practice videos to be useful raw materials for learning and quality improvement. Our approach differs from these efforts in the unit of analysis. Instead of dissecting a full lesson or illustrating repeatable “teacher moves”, the unit of analysis in our work consists of moments of staff–child interactions. We sharpen the focus both for feasibility purposes and based on the understanding that good “lessons”, “moves”, or program activities build from a foundation of high quality interactions. Whereas lessons or moves may not readily transfer across settings, understanding the underlying dynamics of interaction could support staff growth from intuition to intention.

This research and evaluation work not only has implications for OST programs but can also serve as a model for parallel evaluations, in other developmental contexts, of a simple, practical, and potentially effective model of supporting quality improvement for and by local staff.

Acknowledgments

This research was supported by grants from The Grable Foundation (132R10) and The Heinz Endowments (E1386).

Compliance with Ethical Standards

Conflict of interest

All authors declare that they have no conflicts of interest.

Ethical Approval

All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. University of Pittsburgh, Pittsburgh, USA
  2. Saint Vincent College, Latrobe, USA
