Participants
The present study took place in public grade schools in the greater Montreal area, in the province of Quebec (Canada). In keeping with a universal approach, the How-to Parenting Program was offered to all parents of the recruited grade schools.
Assessments were made by participating parents, their participating child and the child’s teachers. Teacher reports were collected to test the generalization of the program’s impact (children’s improved mental health at school) and to gather reports from blind informants, thereby reducing the social desirability attached to parent reports. The inclusion criterion for parents was having at least one child, aged between 5 and 12 years old, attending a participating grade school. The inclusion criterion for teachers was currently teaching a child whose parent took part in the study and consented to the teacher’s participation. The inclusion criteria for children were being 8 years or older and having parental consent. The exclusion criterion for all was the inability to communicate in French.
For parents who had more than one child attending grade school, we guided them in identifying their “targeted” (i.e., participating) child. To avoid any bias that could be introduced by letting parents choose the targeted child themselves, we asked parents to select a child who was 8 years or older. If parents had more than one child aged 8 or older, we asked them to select the child closest to 9 years of age. Similarly, if none of their children was 8 or older, parents were asked to select the child closest to 9 years of age.
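This selection rule amounts to a simple decision procedure; the sketch below is an illustrative implementation only (the function and variable names are ours and are not part of the protocol).

```python
def select_targeted_child(children_ages):
    """Illustrative sketch of the targeted-child selection rule.

    children_ages: ages (in years) of the parent's grade-school children.
    Returns the age of the child to be targeted.
    """
    # Prefer children aged 8 or older, in line with the child inclusion criterion.
    eligible = [age for age in children_ages if age >= 8]
    pool = eligible if eligible else children_ages
    # Within the relevant pool, select the child whose age is closest to 9.
    return min(pool, key=lambda age: abs(age - 9))

# Example: with children aged 6, 8 and 11, the 8-year-old is targeted.
print(select_targeted_child([6, 8, 11]))  # 8
```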
Intervention
How-to Parenting Program’s general format
Seven weekly sessions took place at children’s grade schools, from 7 to 9:30 p.m. The French version of the “How to talk so kids will listen & listen so kids will talk” workshop was offered by two trained group facilitators (this version is manualized; its verbatim script is based on the English audio format [44]). The workshop closely matches the book: the first six sessions cover the first six chapters and the last session is a general, integrative overview. Parents learn an average of five skills per week during the six topical sessions. A common feature across the various communication skills is the use of an informational (vs. evaluative) style that does not target the child’s character. Indeed, whether praise is given or a problem is described, parents are invited to focus on the task (e.g., “I see books back on their shelves and toys in their box!”), refraining from alluding to the child’s character or worth (e.g., “You are such a good girl”).
During the first session, the “rules of conduct” (e.g., confidentiality) are presented, and parents are invited to introduce themselves briefly by talking about what surprised them in their parenting role. This introduction is meant to address parents’ motivation by eliciting their wishes and expectations, which in turn predicts successful behavioral change.
Through sessions 1 to 6, a total of 30 skills are taught. Every session except the first begins with a discussion of the previous week’s homework: facilitators devote up to 30 min to welcoming parents and listening to their accounts of implementing the new skills, whether those attempts seemed successful or not. Next, the main theme is introduced with a perspective-taking exercise. Parents are first placed in “children’s shoes” by listening to typical comments/requests that children often hear, and they are then encouraged to describe how they feel. The presentation of alternative communication skills follows, illustrated in comic strips. The rest of the session is composed of various exercises allowing parents to practice each skill. Most exercises are role-playing activities, often conducted in dyads; other exercises take place in subgroups, and still others are conducted individually. Each involves note-taking in a workbook. In general, parents describe how they feel in a scenario when playing the role of a given child or parent. Group members then share their experiences, and a structured discussion addresses participants’ reactions and questions. Before leaving, the homework is introduced by facilitators, who stress the importance of giving the skills a try at home with their own child or children.
The 7th session is a structured discussion to review, discuss and integrate the recently learned parenting skills. During that overview, parents think of a challenging situation with their child and all participants are invited to suggest how their newly acquired skills could be useful. At the end, facilitators give each participant a colored summary sheet as well as a stack of small illustrated cards that summarize all skills (created for the present RCT). They also acknowledge parents’ efforts and accomplishments in their discovery and early mastery of many new skills and stress the importance of cultivating a patient, compassionate attitude toward themselves.
Material
All parents had their own workbook to complete exercises during the program’s sessions and for their weekly homework. They also had a copy of the book [45] to complete the assigned readings. The participant workbook was provided free of charge, but parents were asked to make a $25 (CAN) deposit for the book. This amount was returned at the end of the program, unless parents wished to keep their book. We lent the book without a deposit to parents who indicated that this expense was difficult for them.
Adherence
A large number of facilitators received training, as a large pool was needed for this study (up to eight facilitators available per condition, per year). In line with the inclusive stance adopted by the program’s authors, there was no “required qualification” to become a facilitator. Some facilitators were graduate students in psychology; others were parents and/or adults involved in education or a related domain.
Facilitators’ training
Given that group facilitators could not rely on audio or video recordings of the French version, they received a 3-day training to promote adherence. This training was provided by a mental health professional with extensive experience offering the How-to Parenting Program. In addition to being exposed to the program’s content, facilitators also learned about the process of facilitating it. Topics included avoiding acting as an “expert” and using the program’s communication skills oneself when facilitating the program. Facilitators were also encouraged to convey unconditional regard, be empathic and foster self-compassion. Finally, the training also addressed some of the particularities associated with facilitating a group within an RCT, such as content fidelity. These particularities included having facilitators’ own voices recorded during all sessions, following the workshop material as closely as possible, and refraining from integrating ideas, exercises, or opinions from other sources.
Supervision meetings took place before and after each wave, during which facilitators were offered and shared a wide range of useful information, compiled in a “facilitator’s guide” that was updated yearly. This dynamic guide comprised both practical (e.g., material provided) and process-oriented guidelines (e.g., avoid trying to convince a parent who appears skeptical). Each team was composed of a more experienced and a less experienced facilitator, who shared their experience and questions after each session. Individual supervision was also available if needed, offered by one of the principal investigators, who is also a licensed psychologist.
Adherence monitoring
The five aspects of program integrity were assessed, as it is essential to evaluate whether the intervention was offered completely (content fidelity) and adequately (process fidelity [46]). After each session, facilitators rated the percentage of material that was covered using a brief weekly online questionnaire. In addition, all sessions were audiotaped so that two independent coders could verify content fidelity [46, 47]; specifically, they assessed whether each activity was completed or not using a checklist [42, 46]. At the end of the 7-session program, facilitators rated each parent’s general involvement and enthusiasm to measure responsiveness [47]. Process fidelity was assessed by parents, who rated their group facilitators’ empathy, enthusiasm and preparedness [46].
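As an illustration only, content fidelity for a given session could be summarized as the proportion of planned activities checked as completed; the sketch below uses hypothetical activity names, not items from the actual checklist.

```python
def content_fidelity(checklist):
    """Proportion of planned activities marked as completed for one session.

    checklist: dict mapping each planned activity (str) to True/False.
    """
    return sum(checklist.values()) / len(checklist)

# Hypothetical session with five planned activities, four of which were completed.
session = {"homework review": True, "perspective-taking exercise": True,
           "comic strips": True, "role-play in dyads": True,
           "workbook exercise": False}
print(f"{content_fidelity(session):.0%}")  # 80%
```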
Exposure
To assess participants’ exposure to the program, group facilitators took attendance on-site using a list of participants. They then entered this information into their brief post-session online questionnaire.
Differentiation
While some participants in the experimental condition may not have fully engaged in the program, some parents in the wait-list condition might have gained access to some of the program’s content. Indeed, contamination was possible because we randomly assigned participants within each participating school. This procedure was chosen over a cluster randomized trial to avoid conflating our experimental manipulation with schools’ characteristics (e.g., size, socio-economic status (SES), educational philosophy) and to remove this unexplained between-school variability from the between-group main effect, thereby increasing statistical power.
Because all parents in the study heard about the program during the information meeting (see Recruitment section below), some parents assigned to the wait-list condition may have decided to buy and read the book (and try some skills). However, they could not take part in the program, which presumably fosters increased learning through group participation (e.g., weekly sessions with facilitators and other parents, exercises, homework, discussion and modeling). To control for this potential confound, we asked parents in the wait-list condition whether they had bought and read the book. Finally, to ensure differentiation between the How-to Parenting Program and other interventions, as well as between the experimental and control conditions, parents in both conditions were asked to document any intervention and/or therapeutic activity used for their family.
Outcomes
Most parenting program studies emphasize E problems because they are disruptive, but I problems are also an important source of suffering and need to be prevented. Moreover, assessing the level of children’s strengths and well-being allows for a more complete account of their mental health. The primary outcome is children’s I and E problems, as assessed by their parents. Secondary outcome measures include other indicators of children’s mental health: teachers rated children’s E and I problems and socio-emotional strengths, and children rated their own subjective well-being. Secondary outcomes also include the three dimensions of optimal parenting (as perceived by children, in addition to parental reports) and parents’ own mental health (indicators of both symptoms and well-being).
Participant timeline
All parents and children completed their questionnaires before randomization (pre-intervention; T1), 1 week after the seven-week program (post-intervention; T2), and again at 6-month (T3) and 1-year (T4) follow-ups to assess change over time (see Table 2). Teachers each completed two questionnaires, since children’s teachers at the beginning of the study (T1 and T2, in February and April) were not the same teachers as during the last part of the study (T3 and T4, in October and April of the following school year).
Table 2 Schedule of Enrolment, Intervention and Assessments of the How-to Parenting Program RCT
Sample size
The pilot study suggests that medium effects can be expected (i.e., Cohen’s d around .5) [48], and the primary focus of the present study pertains to cross-level interactions. Hox [49] suggests that the power to detect these effects depends more strongly on the number of groups than on the total sample size. Tabachnick and Fidell [50] further suggest that sufficient power for cross-level effects is obtained when the number of groups is 20 or larger, whereas Paterson and Goldstein [51] recommend having at least 25 groups. Following these recommendations, our goal was to have 32 level-3 units (parenting groups), 256 level-2 units (parents), and 1024 level-1 units (time points). Sufficient power was expected because this number of parenting groups is above the recommended thresholds and allows for recruitment shortfalls.
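The planned numbers follow directly from the recruitment targets (4 waves of about 64 parents forming 8 groups each, assessed at 4 time points); the short sketch below simply makes this arithmetic and the group-number thresholds explicit.

```python
# Planned design, derived from the recruitment targets of this protocol.
waves = 4
groups_per_wave = 8
parents_per_group = 64 // groups_per_wave  # about 8 parents per group
time_points = 4                            # T1 to T4

groups = waves * groups_per_wave           # level-3 units: 32 parenting groups
parents = groups * parents_per_group       # level-2 units: 256 parents
observations = parents * time_points       # level-1 units: 1024 time points

# Recommended minimums for cross-level effects: >= 20 groups (Tabachnick & Fidell)
# or >= 25 groups (Paterson & Goldstein); the planned 32 groups exceeds both.
assert groups >= 25
print(groups, parents, observations)       # 32 256 1024
```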
Recruitment
The goal was to conduct the study within four grade schools per year, over 4 waves (recruiting about 64 parents to form 8 groups at each wave; see Flowchart in Fig. 1). The RCT began after obtaining ethics approval and funding. We first sought approval from three school boards, a prerequisite for soliciting school principals. At the beginning of each wave (September), we then mailed information to school principals, who could contact the research coordinator for further details if they were interested in implementing and evaluating the program. There was no inclusion or exclusion criterion for school recruitment; all schools were invited. Given that we did not target specific types of schools or SES neighborhoods, school participation first depended on school principals’ interest. When a school principal was interested in our study, information was given to all families by sending an information flyer home in children’s schoolbags, in December. Parents then communicated their interest in the program by returning the flyer’s response section (reply slip). We asked teachers to refrain from recommending the parenting program to specific parents, to preserve its universal and voluntary nature. Recruitment thus also depended on each school’s general level of parental interest, since the next recruitment stage (information meeting) could take place only in schools in which a large number of parents returned their reply slip.
If both parents of the same family expressed the wish to take part in the parenting program, we allowed them to do so (when there was enough space in a group), although data from the second, “duplicate” parent would not be used in the statistical analyses. The participating parent was identified by randomly choosing one of their sealed envelopes. Whether one or two parents participated in the program was coded, to examine whether this factor influences the program’s efficacy.
Consent and allocation
Information meetings for parents were held in schools in January. One of the principal investigators met with interested parents to provide them with information about the parenting program and its assessment. Parents were thus informed about the How-to Parenting Program, the random assignment, and the voluntary nature of their participation as well as that of their participating child and his/her teachers. Parental consent forms were filled out at the end of that information meeting. The consent form comprised three distinct sections, allowing parents to consent (or not) separately to (a) their child’s participation, (b) their child’s teachers’ participation and (c) their own.
Random assignment of families was carried out within each school, after parents’ T1 questionnaires were collected. The research coordinator extensively shuffled the sealed anonymous envelopes containing the T1 questionnaires before randomly assigning them to one of the two conditions. Next, parents received a phone call informing them of the group to which they were assigned (either the group beginning the following spring or the group beginning in the spring of the following school year; see Fig. 1 and Table 2).
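A minimal sketch of this within-school 1:1 allocation, assuming the shuffled envelopes are simply split in half (the data structures are hypothetical; the actual procedure was carried out with physical sealed envelopes):

```python
import random

def randomize_within_school(envelope_ids, seed=None):
    """Shuffle one school's sealed T1 envelopes and split them 1:1
    between the experimental and wait-list conditions."""
    rng = random.Random(seed)
    ids = list(envelope_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"experimental": ids[:half], "wait_list": ids[half:]}

# Example: 16 anonymous envelopes from one participating school.
allocation = randomize_within_school(range(1, 17), seed=42)
print(len(allocation["experimental"]), len(allocation["wait_list"]))  # 8 8
```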
Blinding
Since parents knew to which condition they were assigned, their children may also have been aware of it. However, all research assistants (RAs) collecting child reports were blind to the intervention conditions, in accordance with the PROBE (Prospective Randomized Open, Blinded End-point) methodology, to reduce assessment bias. Moreover, all teachers were asked to refrain from trying to find out whether a given pupil’s parent was taking part in the parenting program.
Data collection methods
Parents
At the end of each information meeting, parents who had decided to take part in the study filled out a T1 paper-pencil questionnaire on site (see Instruments section below), after completing their consent form. When filling out their T1 questionnaire, parents indicated whether they preferred to receive subsequent questionnaires in paper-pencil or online format. We thus either sent a paper-pencil version of the T2, T3 and T4 questionnaires by mail or provided parents with a link via email.
We collected all parent-reports (PR) at T1, prior to randomization. We also aimed to collect all of the T1 child-reports (CR) and T1 teacher-reports (TR) before the first session of experimental groups. We coded whether any of the CR or TR were collected after that, to verify whether including them influences the obtained results.
Teachers
The research coordinator met with participating schools’ teachers during one of their scheduled meetings to briefly provide them with key information about the study. Teachers learned about the overall procedure and about what their possible participation would entail, namely filling out a questionnaire about one or more of their pupils on two occasions (either at T1 and T2, or at T3 and T4). Since children move to the next grade the following year, the new teachers were also contacted and asked to fill out the third (T3) and last (T4) TRs. All teachers for whom parental consent was obtained received their own consent form and paper-pencil questionnaires in their school mailbox.
Children
Within each school, an RA met with participating children (individually or in small groups of up to four children) in an available quiet room (e.g., the school library), during a time that did not include any test or special activity. The RAs first informed children that their parent had agreed to participate in a study, without mentioning the parent’s participation in the parenting program. They then invited the children to fill out a questionnaire but specified that, even though their parent had given them permission to participate, they could decide for themselves whether they wished to do so. All participating children thus gave their verbal assent. Children completed the paper-pencil questionnaires on their own, but the RA remained available to answer questions about the questionnaire and the study, if needed.
Group facilitators
Group facilitators audiotaped the sessions, answered a short questionnaire at the end of each session about the material covered, and monitored parents’ attendance. We also collected information about facilitators’ experience (in years), their age and sex, and whether they had children of their own to control for these factors, if needed. Each facilitator signed an informed consent form before providing this information.
Distinguishing the program’s implementation from its evaluation
To reduce assessment bias, we first ensured that parents made a clear distinction between the parenting program (which we called “the workshop”) and its evaluation (which we called “the study”). Second, we explained that compensation was contingent upon questionnaire completion, not on program participation. We also made this distinction salient by assigning different tasks to different members of our team: the research coordinator and RAs (rather than facilitators) took care of all research communications and procedures (i.e., questionnaires, consent forms, compensation) to foster role clarity. Group facilitators were asked to avoid talking about the study and to refer parents to the research coordinator if questions about the study arose.
Instruments
Primary outcome
At each assessment time (pre-intervention, 1-week post-intervention, 6-month and 1-year follow-ups), children’s mental health was assessed with different questionnaires completed by three different informants (i.e., the children themselves, their parent and their teachers). First, parents were asked to evaluate their child’s mental health using the two subscales (I and E problems) of the Child Behavior Checklist (CBCL) [52], a common outcome in trials assessing parenting programs. The CBCL is one of the most widely used validated instruments to assess children’s mental health. The E syndrome (Cronbach alphas T1/T2 = .88/.85 in our pilot study) reflects rule-breaking and aggressive behavior, whereas the I syndrome (Cronbach alphas T1/T2 = .81/.78 in our pilot study) reflects problems of anxiety/depression, withdrawal/depression and somatic complaints.
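For reference, the internal consistency coefficients reported throughout this section are Cronbach’s alphas, which for a scale of $k$ items take the standard form

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),
\]

where $\sigma^{2}_{Y_i}$ is the variance of item $i$ and $\sigma^{2}_{X}$ is the variance of the total scale score.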
Secondary outcomes
Complementary measures of child mental health
Children were asked to evaluate their own well-being using measures of positive affect, life satisfaction and self-esteem. Children’s positive affect was assessed with an adapted scale [53] based on the Positive and Negative Affect Schedule (PANAS) for children [54, 55] and used in our pilot study [41]. This subscale includes ten positive emotion items, chosen for their simplicity, and showed good internal consistency in our pilot study (Cronbach alphas T1/T2 = .86/.88). Children’s self-esteem was measured with Rosenberg’s Self-Esteem Scale [56, 57], one of the most widely used measures of children’s global self-esteem. It assesses the extent to which children have a positive attitude toward themselves, and shows good construct and convergent validity [57] as well as good internal consistency (Cronbach alphas T1/T2 = .71/.83 in our pilot study). Finally, items about children’s life satisfaction were drawn from the French version of the Satisfaction with Life Scale [58, 59], a subscale that demonstrated good internal consistency (Cronbach alphas T1/T2 = .89/.93) in our pilot study.
Children’s mental health problems and socio-emotional strengths were also assessed by their current school teacher, who was asked to fill out the Teacher-Child Rating Scale (TCRS) [60]. The problem subscales of the TCRS assess I (shy-anxious) and E (acting-out) problems, whereas the socio-emotional subscales tap frustration tolerance, task orientation and social skills, which are important self-regulatory skills.
Parenting
The three dimensions of an optimal parenting style (structure, affiliation and AS) were assessed at each assessment time by both parents and children, using scales drawn from well-validated parenting questionnaires, translated into French using back-translation and adapted for children when needed.
Parents completed the Laxness subscale of the Parenting Scale [61] to rate how they generally behave toward their children, using bipolar items whose poles are anchored with a structured and a permissive stance. This subscale has been associated with observations of laxness (r = .61) and has been shown to identify mothers who have difficulties handling their children. The internal consistency of our French version was good (Cronbach alphas T1/T2 = .75/.72) in our pilot study.
Six items of the Laxness subscale of the Parenting Scale [61] were adapted to measure children’s perception of the extent to which their participating parent is permissive or setting limits. The internal consistency was close to satisfactory for this scale (Cronbach alphas T1/T2 = .57/.56) in the pilot study.
Ten items of the Care subscale of the Parental Bonding Instrument [62] were translated to measure parental care and involvement (vs. rejection). This instrument has been positively related to an observational measure of parental care [62]. The internal consistency of our French version was good (Cronbach alphas T1/T2 = .79/.77) in the pilot study.
The Care subscale of the Parental Bonding Instrument [62] was adapted to gather children’s perception of their participating parent’s care and involvement, contrasted with indifference and rejection, and its internal consistency was also good (Cronbach alphas T1/T2 = .76/.70).
The Autonomy-Supportive Parenting Skill Scale was designed within our pilot study [41]. Twelve autonomy-supportive skills taught in the How-to Parenting Program are contrasted with various controlling strategies parents typically use. Parents rated bipolar items, where one pole is anchored with an autonomy-supportive response and the other with a controlling reaction. The internal consistency was acceptable at T1 and good at T2 (Cronbach alphas T1/T2 = .64/.81) in the pilot study.
Parents also completed the Parental Attitude Scale [63] to rate their attitude toward AS and psychological control. This scale has predictive validity and has been associated with observational measures of autonomy-supportive and controlling behaviors [63]. In the pilot study, the French version showed good internal consistency (Cronbach alphas T1/T2 = .76/.73).
Children completed an adapted version of the Perceived Parental Autonomy Support Scale [31] to assess their perception of the extent to which their parent uses autonomy-supportive and controlling strategies. This scale has a sound factor structure, demonstrates convergent validity, and predicts psychological adjustment. In the pilot study, its internal consistency was good (Cronbach alphas T1/T2 = .70/.78).
Parental mental health
Parents were asked to assess their own mental health at each assessment time. Symptoms of anxiety and depression were measured with the General Health Questionnaire [64], while negative affect and guilt were measured with the negative affect subscale of the PANAS and the guilt subscale of the PANAS-X [54, 65]. Parents also assessed positive indicators of their mental health, by rating their positive affect (with the PANAS [54]), life satisfaction (with the Satisfaction with Life Scale [58, 59]), perceived competence (with the Competence subscale of the Basic Need Satisfaction in Relationships Scale [66]) and self-esteem (with Rosenberg’s Self-Esteem Scale [56]).
Potential covariates
At pre-intervention, parents answered demographic questions regarding their age, gender, education level, family income, marital status, first language, ethnicity and number of children. Parents also indicated their child’s age and gender and evaluated their child’s temperament (Children’s Behavior Questionnaire [67]). The measures used in the present study are summarized in an additional table file [see Additional file 1].
Retention promotion
A $20 (CAN) compensation was offered to parents each time they completed a questionnaire. After each questionnaire, participating teachers received $10 (CAN) and children received a $10 (CAN) gift certificate from a popular bookstore. Group facilitators were not compensated for filling out their own short questionnaires but were paid for their work.
Statistical methods
We will adopt an intent-to-treat approach [68], as it increases external and internal validity. Our data will also be hierarchical in nature: pre-intervention, post-intervention, 6-month and 1-year follow-up measures will be nested within parents, who are in turn nested within parenting groups. Accordingly, multivariate hierarchical linear modeling (HLM) analyses will be conducted to test our hypotheses; these analyses have the advantage of estimating error terms while taking into account the nested nature of the data, and they allow for missing data without decreasing power. Analyses will thus include all participants who completed at least one assessment, regardless of the number of sessions they attended (for the experimental groups).
To evaluate change over time, the four assessment points (pre-intervention, post-intervention, 6-month follow-up, and 1-year follow-up) will be treated as repeated measures to estimate change between consecutive time points [69]. This approach was chosen because it allows rates of change to differ across time.
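As an illustration only (the exact parameterization may differ), such a model can be written as a three-level specification in which repeated measures (level 1) are nested within parents (level 2), who are nested within parenting groups (level 3), with the program’s effect appearing as a cross-level Condition × Time interaction; using a single time slope for simplicity:

\[
\begin{aligned}
\text{Level 1:}\quad & Y_{tij} = \pi_{0ij} + \pi_{1ij}\,\text{Time}_{tij} + e_{tij} \\
\text{Level 2:}\quad & \pi_{0ij} = \beta_{00j} + r_{0ij}, \qquad \pi_{1ij} = \beta_{10j} + r_{1ij} \\
\text{Level 3:}\quad & \beta_{00j} = \gamma_{000} + \gamma_{001}\,\text{Condition}_{j} + u_{00j}, \qquad
\beta_{10j} = \gamma_{100} + \gamma_{101}\,\text{Condition}_{j} + u_{10j}
\end{aligned}
\]

where $\gamma_{101}$ captures the Condition × Time cross-level interaction of interest. In the planned analyses, time will instead be represented by contrasts between consecutive assessments, so that rates of change can differ across intervals.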
Preliminary analyses
Although we randomly assigned participants to one of the two conditions, some sample characteristics may still be unequally distributed across conditions. We will thus compare the experimental and control groups on baseline variables that may directly or indirectly impact the effect of the intervention (e.g., children’s age and familial SES) to investigate their statistical equivalence. If an important imbalance is found between conditions, we will control for the variable(s) involved in later analyses. We will also examine the percentage of variables that are unequally distributed between the two conditions to judge the success of the randomization.
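As one common (illustrative) way to quantify such imbalance, the standardized mean difference between conditions on a baseline variable can be computed as

\[
d = \frac{\bar{x}_{\text{exp}} - \bar{x}_{\text{ctrl}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_{\text{exp}}-1)\,s^{2}_{\text{exp}} + (n_{\text{ctrl}}-1)\,s^{2}_{\text{ctrl}}}{n_{\text{exp}}+n_{\text{ctrl}}-2}},
\]

with values of $|d|$ close to zero indicating adequate balance.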
Primary analyses
The effect of the How-to Parenting Program will be evidenced by significant interactions between rates of change and experimental condition. These interactions should reveal that child mental health improves over time for participants in the experimental condition but not for participants in the control condition. Based on our pilot study, we expect the program to be effective in decreasing parent-reported I and E problems.
Secondary analyses
Also based on our pilot study, we expect the program to reduce teacher-reported I and E problems and to foster greater strengths and well-being [41, 43]. We will also test the impact of the program on parenting and on parents’ mental health, as well as conduct mediation analyses of the putative impact of the intervention; specifically, we expect that improved parenting, as a consequence of the program, will lead to improved child mental health. Finally, we will conduct moderation analyses to explore whether the program’s impact varies according to certain characteristics of children (e.g., age, gender), parents (e.g., age, SES), groups (e.g., content fidelity) or circumstances (e.g., whether one or two parents took part in the program; whether another child and/or family intervention was received).
Handling missing data
An important advantage of HLM is the use of estimation procedures that allow for missing data without decreasing power (e.g., full information maximum likelihood estimation). All participants with at least one assessment will thus be included in our analyses, because their missing data are estimated from the information provided by other participants. This procedure has been shown to yield unbiased coefficients whether data are missing at random or completely at random [70]. Nevertheless, participants with and without missing data will be compared to document the pattern of missing data. Although we planned a variety of procedures to maximize retention and increase the accuracy of our estimates (e.g., compensation, phone contacts), identifying the characteristics of participants who tend to drop out is useful for adjusting future retention strategies.