
Development, Feasibility, and Acceptability of a Nationally Relevant Parent Training to Improve Service Access During the Transition to Adulthood for Youth with ASD


Many youth with autism spectrum disorder (ASD) face challenges accessing needed services as they transition to adulthood. The present study describes the development, feasibility, and acceptability of a new intervention designed to teach parents of transition-aged youth with ASD about the adult service system and the most effective ways to access services and supports. As part of a randomized-controlled trial, the intervention—named ASSIST—was delivered to 91 participants in three states in the U.S. Results suggested that ASSIST is feasible and acceptable to participants. Though intended to be an in-person, group-based program, ASSIST was primarily delivered online due to COVID-19 restrictions. The results and discussion explore the trade-offs and implications of these different treatment delivery modalities in relation to ASSIST.

The transition to adulthood is a time of significant risk for youth with autism spectrum disorder (ASD). Employment and post-secondary education (PSE) are difficult to obtain (Shattuck et al., 2012; Taylor & Seltzer, 2011b, 2012), and are rarely maintained over time (Taylor & DaWalt, 2017; Taylor et al., 2015). Compared to youth in the general population, those with ASD are more socially isolated and have greater difficulty making friends (Bauminger & Kasari, 2000; Lyons et al., 2011; Petrina et al., 2014). Although many young people may struggle during the transition to adulthood, youth with ASD are at even greater risk across multiple domains of daily life (Shattuck et al., 2011, 2012). These risks lead to significant societal costs: when considering both service use and lost productivity, lifetime costs for one individual with ASD in the U.S. range from $1.43 to $2.44 million (Buescher et al., 2014).

Challenges during transition and beyond are partly caused by problems accessing adult services. The Individuals with Disabilities Education Act (IDEA) ensures that youth with disabilities (including ASD) are eligible for mandated services through the school system until the age of 22 years or until they exit high school. However, after high school exit, these youth encounter an adult service system that is inadequately funded, fragmented, and difficult to navigate. In 2017, nearly 500,000 adults with intellectual and developmental disabilities in the U.S. (including those with ASD) were on waiting lists to receive home and community-based services, a number over three times what it was in 2005 (138,000) (The Case for Inclusion 2020: Key Findings Report, 2020; The Case For Inclusion Report: 2019, 2019). In another analysis, fewer than 20% of individuals with intellectual and developmental disabilities were receiving long-term services and supports through their state developmental disability agency (Residential Information Systems Project, 2020).

However, lack of funding for services is only one challenge. Even when eligibility is granted, services are scattered across disconnected agencies. Families must contact separate departments for services related to employment, home and community, and PSE. Along with identifying, accessing, and speaking with various agencies, families must understand each agency’s eligibility scheme, funding mechanism, and programming. Parents must advocate so that the program meets the unique needs of their offspring. As many young adults with ASD require assistance in employment and activities of daily living (Ballaban-Gil, 1996; Billstedt et al., 2005; Eaves & Ho, 2008; Howlin et al., 2004), failure to access these needed services leads to serious ramifications, including higher rates of emotional and health problems, declines in levels of independence, and reduced engagement in vocational activities (Taylor & Hodapp, 2012; Taylor & Mailick, 2014; Taylor & Seltzer, 2011a).

Addressing Challenges to Service Access through Parental Advocacy

To help address some of the challenges that families encounter when trying to access services, the Volunteer Advocacy Program-Transition (VAP-T) was developed (Taylor et al., 2017a). The VAP-T is a 12-week, 30-h, group-based advocacy training to educate parents of transition-aged youth with ASD about the adult service system. It is directed by an experienced program facilitator with knowledge about group processes, person-centered planning, and adult service systems. The VAP-T is delivered in partnership with the local disability community; in most sessions, the program facilitator is aided by community content experts who present specifics of each topic. Experts include representatives from the Parent Training and Information Centers, The Arc, the University Centers for Excellence in Developmental Disabilities, various government agencies, attorneys, and parents of individuals with ASD. Each session consists of didactic instruction, family-sharing activities, case studies, and group discussions.

The VAP-T curriculum was developed with input from an Advisory Committee that included experts in ASD, adult disability service providers, disability advocates, parents of youth with ASD, and self-advocates with ASD. The curriculum reflects multiple domains, including person-centered thinking, secondary education, PSE, financial support, employment, Medicaid, future planning, medical services, and advocacy. For each domain, participants complete a section of a Letter of Intent (i.e., a planning tool for individuals with disabilities). By aligning the Letter of Intent with each domain, participants exit the VAP-T with a written document about the services their son or daughter needs (per each domain), and strategies to obtain such services. Each participant is given a binder of materials, which for each topic includes a PowerPoint presentation, relevant readings, tip sheet(s), and activities.

This intervention showed preliminary efficacy in a pilot randomized-controlled trial with 45 parents of transition-aged youth with ASD. Relative to a wait-list control group, parents who had taken the VAP-T experienced greater gains in their ability to advocate for adult services on behalf of their son/daughter with ASD, including increased knowledge about adult services, advocacy skills, and empowerment to use this information. Further, effect sizes for all these group differences were large (Taylor et al., 2017a). In addition to improving parents’ ability to advocate for adult services, there was preliminary evidence to suggest that participation in the VAP-T led to increased service access, PSE attendance, and community employment for youth with ASD (Taylor et al., 2017b).

Despite the promise of the VAP-T, its usefulness was limited because it reflected only the availability of, and eligibility criteria for, services in the state in which it was developed. This design decision was made because the adult services landscape differs from state to state. Thus, there was a need to develop an adult services advocacy program that could be applicable across states. To address this need, we adapted the VAP-T into ASSIST (Advocating for Supports to Improve Service Transitions), which uses a combination of standardized curriculum and individualized learning objectives to be nationally relevant.

Present Study

In this study, we described the development, feasibility, and acceptability of ASSIST, a new intervention that teaches parents of youth with ASD about the adult service system and the most effective ways to access those services. We delivered ASSIST six times, twice in each of three states in the U.S. (IL, TN, WI). Though the intervention was conceptualized and initially implemented as an in-person group-based program, COVID-19 restrictions on social gatherings necessitated conducting ASSIST sessions online. Two groups (out of six) were mid-way through the 12-week intervention series when social distancing guidelines were implemented in their local communities, and thus, they took some sessions in-person and some online. This allowed us the unanticipated opportunity to examine their perceptions of each mode of intervention delivery. We examined the following research aims:

  1. To describe the feasibility of ASSIST as evidenced by:

     a. Treatment fidelity

     b. Participant attendance

     c. Participant-reported ease of participation and barriers to participation.

  2. To examine acceptability of ASSIST via participant satisfaction, and for those who had both in-person and online implementation, how the type of implementation impacted their satisfaction.



Participants for this study were part of a multi-site randomized-controlled trial testing the effectiveness of ASSIST. Eligibility criteria for the study were as follows: (a) parent or legal guardian of a youth with ASD who was 16–26 years of age; (b) parent provided medical, psychological, and/or educational documentation of their youth’s ASD diagnosis; (c) parent was able to attend the 12-week group program at the intervention site on the day and time it was delivered (either Monday or Wednesday evening, depending on the site); and (d) youth had lifetime scores on the Social Communication Questionnaire (SCQ; Rutter et al., 2003)—a parent report screener for ASD—indicating the likelihood of an ASD diagnosis.

Ninety-one participants were assigned to the treatment group and are the focus of this report. They were evenly distributed across the study sites, reflecting three states in the U.S. (29 in IL, 33 in TN, 29 in WI). Demographic characteristics of the sample are presented in Table 1.

Table 1 Participant demographic information (n = 91)


After baseline data were collected via interview and questionnaire, participants were randomized, and the treatment group began the 12-week program. Attendance was taken at each ASSIST session. Participants who were unable to attend an ASSIST session had access to a password-protected website (along with the rest of the treatment group participants) on which they could watch the session recording. Website analytics were used to record which participants engaged with the online video content. Treatment fidelity was assessed by a study team member for each ASSIST session; for about one-quarter of the sessions (24%), a second team member assessed treatment fidelity so that inter-rater agreement could be determined.

After each group finished the ASSIST program, they were emailed the link to a summative evaluation form. We allowed these forms to be completed anonymously to facilitate honest evaluation and to ensure that participants’ responses were not influenced by relationships they may have developed with project staff during the program. About two-thirds (68%; n = 62) of participants completed the summative evaluation.

Intervention: ASSIST


Over the course of a year, the VAP-T was revised into a new program that would be relevant across the United States, called ASSIST (Advocating for Supports to Improve Service Transitions). The intervention revision team consisted of experts in ASD and adult services, representatives from adult disability service agencies and advocacy groups, and parents of adults with ASD. Participation of disability advocacy groups and parents was particularly important to the development of ASSIST, as their participation ensured that the issues covered were of most relevance to individuals and families, and that the program was developed in a way that facilitated sustainability by community organizations. Team members resided in three states (IL, TN, WI) and thus could speak to the landscape of service systems across multiple states. During weekly videoconferencing meetings and one in-person conference, this team: (1) examined the topics presented in the VAP-T to see if all important aspects of adult services were covered; (2) divided information about adult services and supports included in the VAP-T into what was relevant across the nation (e.g., definition of Medicaid Waiver; how to apply for Supplemental Security Income) and what would differ based on state and locality; and (3) determined a program structure that would provide standardization while still allowing enough flexibility to address the diverse services and supports available in different regions.

The intervention revision team determined that the best structure for ASSIST would consist of short introductory videos presenting nationally-relevant information followed by local experts who contextualized information about the topic to their geographic area, with ample time for questions and discussion throughout. The local experts would be provided learning objectives for their presentation, developed by the research team. This format provides standardization, while still allowing each site to personalize the program to their local context. For example, Medicaid waivers differ dramatically from state-to-state. Using this format, the introductory video contains important background about why the Medicaid waivers were developed, what they mean in the context of other long-term services and supports, and general information about eligibility and portability. The local expert, then, describes the specific waiver programs that are available in their state and state-specific regulations.

Program Description

The resulting intervention is a 12-week program, with each weekly session lasting two hours (24 h total). An optional 13th week on secondary transition planning can be delivered to participants if relevant (i.e., if youth are still in high school at the time of ASSIST delivery). ASSIST is led by a program facilitator with experience in adult services/supports and group processes.

For this project, ASSIST was delivered using an “Initial Implementation” model (Fixsen et al., 2005, 2013); the program facilitators were members of local community disability agencies or organizations, with support provided, as needed, by the university team. Before the program started, all program facilitators were provided with an ASSIST treatment manual that described the steps to prepare for and deliver the program. The university team met with the facilitators to discuss the implementation of the program and answer questions. Program facilitators identified and invited the local experts with support from the university team. Local expert invitation emails provided information about the program, the audience, and the learning objectives that they were asked to cover during their presentation. During the ASSIST sessions, university team members provided technical support to the program facilitators, and coaches provided feedback to the facilitators after the sessions. As program facilitators became more familiar with the ASSIST program, coaching was gradually phased out, but technical support remained.

The topic for each ASSIST session is presented in Table 2 and the schedule for a representative ASSIST session is presented in Fig. 1. Each ASSIST session opens with a group greeting and introductions (Week 1) or reflections on the previous week’s session (Weeks 2–12) as well as an icebreaker question/activity that pertains to the topic of the day. Then, the introductory video is shown. The introductory video for each session presents information about the topic at hand that is relevant regardless of a person’s locale. The content is presented by two “hosts” who are parents of young adults with ASD. In the videos, the hosts have a conversation about the topic, asking and answering questions to present key information. Each introductory video also includes the story of a family’s experience that is relevant to the topic.

Table 2 Treatment fidelity for each session by study site, and for local expert learning objectives
Fig. 1
figure 1

Example agenda and treatment fidelity form from the ASSIST session, “Housing and Technology”

After the video is shown, there is time for group participants to ask questions. Then, the local expert for that week is introduced by the program facilitator, and they give a presentation about the local aspects of the topic at hand, guided by the learning objectives for that topic developed by the research team. Learning objectives for the introductory videos and local expert presentations for each session are presented in Supplementary Table 1.

The last part of the session is reserved for further questions and discussion, and, at the end of the session, the “tip sheet” for that session (i.e., one-page summary with key points including national and state-specific resources) is covered. As in the VAP-T, participants are asked to complete the section of a Letter of Intent that corresponds to each topic; the applicable section is presented at the very end of the session. After the session has finished, participants are asked to complete an evaluation of that session; information from that evaluation is used to determine whether there are questions that need to be addressed at the next session and to provide feedback to program facilitators and local experts.

All participants are given a binder of materials that contains handouts of the information presented in each session for reference, biographies and contact information for the local experts, tip sheets and a Letter of Intent.

In-Person Versus Online Implementation

Because our pilot work suggested that the group experience was an important part of the VAP-T (Taylor et al., 2017a), ASSIST was developed to be an in-person, group-based intervention. The first two ASSIST groups were mid-way through the program (completing 4 and 3 weeks, respectively) when COVID-19 restrictions on group gatherings were implemented in the local areas. After a pause of approximately two months, the programs were resumed using secure video-conferencing platforms. This provided us with the unique opportunity to query participants in these two groups about their experiences taking ASSIST in-person versus online, allowing us to make small refinements to the procedures with the remaining groups to approximate, as closely as possible, the in-person group experience. The remaining 4 treatment groups (1 in IL, 1 in TN, and 2 in WI) were delivered the entire ASSIST series via synchronous video-conference, as restrictions on group gatherings were in place throughout the entire time that the treatment group was receiving the program.


Feasibility: Treatment Fidelity

The treatment fidelity form for a representative ASSIST session is presented in Fig. 1. The form for each session assessed both the fidelity to the components of ASSIST (e.g., greeting group members; showing the video; introducing the expert speaker) as well as the learning objectives that were expected to be covered by the local expert in that session (see Supplemental Table 1 for local expert learning objectives). Fidelity to each item on the form was assessed on a binary scale (yes/no); though this did not allow for rating of items that were “somewhat” covered, the simpler scale facilitated uniformity in coding across sites.

Feasibility: Participant Attendance

Feasibility of parent participation was measured primarily through attendance. Attendance was taken for each ASSIST session, and attendance data was tabulated for each participant. Data were also gathered on whether a participant who was absent from an ASSIST session accessed that session on the password protected website. For each session, participants were given a code of: attended the session (2); absent but watched the recorded session (1); or did not attend nor access the recorded content (0).
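The attendance coding scheme above reduces to a simple per-session mapping and tabulation. The following is an illustrative Python sketch of that scheme (our own hypothetical code with assumed function names, not the study's actual analysis scripts):

```python
# Hypothetical illustration of the 0/1/2 attendance coding described above
# (not the study's actual code).

def code_session(attended_live: bool, watched_recording: bool) -> int:
    """Code one session: 2 = attended live; 1 = absent but watched the
    recording; 0 = did not attend and did not access the recording."""
    if attended_live:
        return 2
    if watched_recording:
        return 1
    return 0

def percent_sessions_attended(codes: list) -> float:
    """Percent of sessions 'attended' (live or via recording),
    i.e., any nonzero code counts as attendance."""
    attended = sum(1 for c in codes if c > 0)
    return 100 * attended / len(codes)

# Example: a participant who missed two of 12 sessions and
# caught up on one of them via the recording.
codes = [2, 2, 1, 2, 0, 2, 2, 2, 2, 2, 2, 2]
print(round(percent_sessions_attended(codes), 1))  # → 91.7
```

In this scheme, a participant is counted as having "attended" 11 of the 12 sessions, matching how live and recorded attendance were combined in the analyses.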

Feasibility: Ease of Participation and Barriers to Participation

Information about feasibility of parent participation was also collected in the summative evaluation. In terms of ease of participation, parents were asked whether each of the aspects of ASSIST implementation “worked for me” or “did not work for me”: session length (2 h); session day (Monday or Wednesday); session time (evenings); and number of sessions (12 sessions). If any aspects of ASSIST “did not work for [them],” participants were asked what would have worked better.

Participants were also asked about barriers to participation with the following question, “If you were unable to attend one or more ASSIST session(s), what circumstances made attendance challenging? Please select all that apply.” They could select from the following: work conflict; vacation; medical issue (non-COVID related); family emergency; no respite care; weather or other relevant emergency (e.g., power outage); COVID-related circumstances; technology issues (e.g., trouble with internet connection); and/or other.


Acceptability: Participant Satisfaction

The summative evaluation included ten items that assessed participant satisfaction. Satisfaction items are presented in Table 3, and reflect satisfaction with the ASSIST content, with participants’ ability to attend and participate, and with the group experience. Agreement with each item was rated on a five-point Likert scale ranging from 1 = strongly disagree to 5 = strongly agree.

Table 3 Descriptive statistics for items measuring participant satisfaction with ASSIST (n = 61)

In addition, the two groups who took some ASSIST sessions in-person and some online (due to COVID-19) were given the same set of statements (see Table 3) and asked to compare their online experience to their in-person one. Response options to the items for this set of questions were 1 = less than in-person; 2 = same as in-person; 3 = more than in-person.

Finally, in the summative evaluation, all participants were asked to rank-order each of the following models of ASSIST delivery by their preference: in-person; online (live); or recorded.

Data Analysis

To examine treatment fidelity (Aim 1), the percentage of learning objectives met for each session was calculated by study site (IL, TN, WI) and overall (combining across sites). Percent agreement was calculated for the sessions in which two research team members rated treatment fidelity. We were particularly interested in whether the local experts for each session addressed the learning objectives in their presentation, given the individualization that we expected across states and speakers; thus, we separately examined the percentage of local expert learning objectives that were covered in each session.
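Both quantities described here are simple proportions over the binary (yes/no) fidelity ratings. A minimal illustrative sketch, under the assumption that each session's fidelity form is stored as a list of booleans (our own hypothetical code, not the study's), might look like:

```python
# Hypothetical sketch of the Aim 1 fidelity computations (assumed code,
# not from the study). Each session's fidelity form is a list of binary
# yes/no item ratings (True = item covered).

def fidelity_percent(ratings):
    """Percent of fidelity-form items rated 'yes' for one session."""
    return 100 * sum(ratings) / len(ratings)

def percent_agreement(rater1, rater2):
    """Percent of items on which two independent raters gave the
    same rating, for sessions rated by both raters."""
    assert len(rater1) == len(rater2), "raters must score the same items"
    matches = sum(a == b for a, b in zip(rater1, rater2))
    return 100 * matches / len(rater1)

# Example: 9 of 10 items covered; the two raters disagree on one item.
session = [True] * 9 + [False]
print(fidelity_percent(session))                # → 90.0
print(percent_agreement(session, [True] * 10))  # → 90.0
```

These per-session percentages can then be averaged within each study site and overall, as described above.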

Aim 1 also examined the feasibility of parent participation in the ASSIST program. Attendance was examined by calculating the number of sessions each participant attended (either by joining the session live or by watching recordings of the sessions after the fact). Using the summative evaluations, we used frequencies to examine how feasible each of the components of ASSIST implementation was for the participants. We also examined barriers to participation by calculating frequencies of the circumstances endorsed by participants that made attendance challenging.

Aim 2 examined acceptability of ASSIST as measured by participant satisfaction. For those who had both in-person and online implementation, we also investigated how the type of implementation impacted their satisfaction. Mean scores for each satisfaction item in the summative evaluation were calculated, as well as the percentage who reported “agreeing” or “strongly agreeing” with each satisfaction statement. Participants who received part of the curriculum in-person and part online were also asked to compare their satisfaction across the two modalities (more, similarly, or less satisfied with web-conferencing versus in-person): frequencies of responses to each satisfaction item were summarized. Finally, for all participants, we summarized the percentage that chose in-person, online (live), or recorded as their preferred mode of ASSIST delivery.


Results

Aim 1: Feasibility

Treatment Fidelity

Treatment fidelity percentages for each session by study site are presented in Table 2. Treatment fidelity was good, at over 85% for nearly all sessions; two of the study sites each had one session below 85%, and the third study site had no sessions below 85%. Inter-rater agreement, calculated on 24% of sessions, was excellent at 94%.

Across sites, the percentage of local expert learning objectives met was over 95% for all sessions but two: Health Insurance Options and Employment 2 (see Table 2). In both cases, the lower percentages were driven by a local expert at a single study site, who presented information on their topic that did not align with the learning objectives. Local experts at other sites had high treatment fidelity scores for these sessions, suggesting that the local expert learning objectives remained relevant.


Participant Attendance

The percentage of sessions attended by participants is presented in Fig. 2. Across sites, 84.6% of the participants (n = 77) attended 75% or more of the sessions (either live or by watching the recording), with nearly 50% attending all 12 sessions. Attendance was similar across sites, with 83% of IL participants, 82% of TN participants, and 90% of WI participants attending at least 75% of sessions (see Supplemental Fig. 1 for attendance by study site). Less than 10% of participants (IL = 4, TN = 3, WI = 2) attended fewer than 50% of sessions.

Fig. 2
figure 2

Percent of sessions attended by participants. Attendance includes both attending sessions live or watching the recordings of the sessions afterwards

Follow-up analyses examined the proportion of time that participants “attended” ASSIST sessions by joining the sessions live versus accessing the recorded content on the secure website. Most attendance occurred live; only 15 of the 91 participants (16%) accessed the online content after the sessions. Most of the time (73%), participants entered the online portal to access recordings of sessions they had missed; the remaining 27% of the time, they re-watched sessions they had attended live.

Ease of Participation and Barriers to Participation

Of the 62 participants who provided responses to the summative evaluation, 100% reported that the session day (Monday or Wednesday depending on the site) “worked for [them].” Regarding the session time (held on evenings from 6 to 8 PM CST at all sites), 97% (n = 60) responded that it “worked for [them].” The two participants who reported that the session time did not work for them said that the sessions would have worked better had they happened in the morning (n = 1) or started an hour later (at 7 PM; n = 1). As for session length (two hours) and number of total sessions in the program (12 sessions), 92% (n = 57) of the participants said that each of these aspects “worked for [them].” Those who reported that session length and number of total sessions “did not work” for them (8% or n = 5 for each) indicated that shorter sessions of 1.5 h would have worked better for them, and that fewer than 12 sessions would have been more convenient.

Thirty-seven participants (59.7% of those who completed the summative evaluation) indicated some barriers to participation. The percentage of participants who indicated each barrier is presented in Fig. 3. The most common challenges to participation were conflicts with work and family emergencies. About 15% of families endorsed “other” barriers, which included a diverse set of responses such as having family members visit from out of town, family and home-life challenges, prior commitments, forgetting the session, or (in one case) skipping the session because it was a topic the participant had already learned about.

Fig. 3
figure 3

Percentage of participants who endorsed each circumstance that made attendance difficult (out of n = 62)

Thus, for the vast majority of participants, the ASSIST sessions were feasible, as evidenced both by high attendance and by participants’ reports that the session logistics worked for them.

Aim 2: Acceptability

Participant Satisfaction

Sixty-one participants provided valid satisfaction data. Mean scores and standard deviations for each satisfaction item, as well as the percentage of people who ranked each item as “agree” or “strongly agree” are presented in Table 3. In supplemental analyses, we tested for site differences in satisfaction items using analysis of variance; there were no statistically significant differences between study sites (see Supplemental Table 2) and thus results are reported in aggregate.

Participants reported being satisfied with all aspects of the ASSIST program, with mean scores for all items but two falling between “agree” (score of 4) and “strongly agree” (score of 5). Specifically, more than 90% of participants reported that they learned new information, would recommend ASSIST to other families, were satisfied with the sessions, and felt that their questions were answered. The least amount of agreement was for statements related to ability to concentrate during the sessions and feeling connected to other parents, with this last item receiving a mean score of 3.72. Though scores were lower on these items than some of the others, it is important to note that a score of 3 corresponded to the “neutral” response category. Thus, there were high levels of satisfaction with the ASSIST program, with some challenges in concentration and connecting with other parents.

Online Versus In-Person Delivery

For the two groups who attended part of the program in person (before COVID-19 restrictions were put in place; n = 20), we asked participants to compare their online and in-person experiences. Figure 4 shows the percentage of respondents who rated each satisfaction item with online implementation as “less than in-person”, “same as in-person”, or “more than in-person.” Participants noted many similarities in their experiences online versus in-person. Nearly all participants reported that they learned new information similarly and most felt that their questions were answered similarly across delivery modalities. About two-thirds reported that modality did not impact their ability to attend sessions, with the remaining 35% somewhat evenly split between having more issues and fewer issues attending sessions online versus in-person.

Fig. 4
figure 4

Percentage of those who took ASSIST both in-person and online, who endorsed each response (n = 20)

There was also some indication that full participation was more difficult online compared to in-person. About one-half of participants reported that, in the online implementation of ASSIST, they could join the discussion and connect with other parents “less than in-person.” Additionally, a substantial subset of parents (40–45%) felt that compared to in-person implementation, it was more challenging to ask questions, to feel that they were part of a group, and to concentrate during the online sessions.

Though participants who had both in-person and online ASSIST sessions reported some challenges in online participation, across the entire sample these potential challenges were offset by convenience. Of the participants who completed the summative evaluation, 58% (n = 36) reported that their preferred mode of delivery was online compared to participating in-person or watching recorded presentations. The remaining 42% (n = 26) reported their preferred mode of delivery to be in-person. No participants preferred recorded sessions above synchronous online or in-person sessions.

Having experienced ASSIST in-person seemed to impact—to some extent—preference for in-person versus online delivery. Of those who had taken some ASSIST sessions in person (n = 20), 60% indicated in-person as their preferred mode of delivery, compared to one-third (33%) of the 42 participants who only had ASSIST sessions online.


Discussion

This study presented data on the feasibility and acceptability of a novel intervention to improve parents’ ability to advocate for services and supports on behalf of their transition-aged son or daughter with ASD. This project extended extant research by delivering the intervention—ASSIST—through community agencies (with coaching) in three states in the U.S. Findings suggested that ASSIST can be delivered with fidelity, and that it is feasible and acceptable to participants.

The high rates of treatment fidelity by the local experts are particularly notable. Because adult services and supports differ in every state, it was necessary to build some degree of individualization into the curriculum so that participants at each study site could receive information relevant to their local context. Many guest speakers who presented to ASSIST participants were from local disability organizations, agencies, and advocacy groups, and had delivered many presentations on their assigned topic. Though we asked local experts to review and incorporate the learning objectives into their presentations, it was unclear to what extent this would happen during community implementation. With two notable exceptions, local experts addressed most or all of the learning objectives in their presentations. Thus, findings suggest that relying on local subject-matter experts from the community, and allowing content to be personalized to the local context, does not preclude ASSIST from being delivered in a standardized way across sites, with comparable information received by all research participants.

Findings also suggested that the format of ASSIST (weekday evenings, 2-h sessions, 12 weeks) was generally feasible for participants. Interestingly, even during a global pandemic, we saw high attendance at ASSIST sessions. There are a few potential reasons for the high rates of attendance. First, though we had expected COVID-19 to negatively impact attendance due to the additional stress that families were under (Manning et al., 2020; White et al., 2021), in reality it may have encouraged attendance. Because of social distancing restrictions, families were traveling less and had fewer evening activities than they would in typical times. Further, attending ASSIST sessions virtually from one’s home reduced barriers that would have accompanied in-person participation, such as traffic, a lack of reliable transportation, and childcare needs.

Though the high rates of attendance were likely driven, at least to some extent, by the convenience of accessing sessions from home, participants who had the opportunity to take ASSIST sessions both in-person (prior to COVID-19) and online (after COVID-19) identified some trade-offs to this convenience. In general, across the entire sample, participants were highly satisfied with many aspects of the online implementation of ASSIST. However, those who participated both in-person and online reported some challenges in accessing the benefits of the group format with the online delivery. Many found it harder to connect with other parents, to feel part of the group, and to join in the discussion. Their ability to learn new information, however, did not seem to be impacted by the different intervention delivery formats. This feedback is highly consistent with our pilot work, in which the VAP-T was delivered in-person but we offered opportunities to attend virtually or watch the sessions after the fact when in-person attendance was not possible. In that study, we found that attending a higher proportion of sessions in-person (versus accessing the sessions online) was not associated with the amount of knowledge about adult services that participants gained, but was related to families feeling more empowered to use the information that they learned (Taylor et al., 2017a). In the aftermath of COVID-19, many families are more comfortable with technology and it may be that, in many circumstances, online groups replace in-person groups (Ameis et al., 2020; Hacker et al., 2020; Nguyen et al., 2020). It will be important for studies to rigorously examine the benefits and drawbacks of different delivery models for programs such as ASSIST.

Finally, it is interesting to note that those who had taken ASSIST sessions in-person were twice as likely to choose in-person as their preferred method of intervention delivery, relative to those who had not taken any ASSIST sessions in-person. Though those who had taken ASSIST sessions in-person seemed to feel that “something was lost” from the group experience when moving online, this may not have been the experience of those who only attended online sessions. In both groups, however, there were many participants who preferred each type of intervention delivery. Future research should investigate which treatment modality works best for whom, in order to target the appropriate delivery modality to the participants most likely to benefit from it.


There are important limitations worth noting. First, most of the research participants were White non-Hispanic, and those with higher levels of education were over-represented in this sample. The lack of diversity in our sample is a limitation common to much of autism research (Jones & Mandell, 2020; West et al., 2016), and it may have been further exacerbated by the necessity of delivering ASSIST via web conferencing. Though we worked with families to procure the necessary technology when needed, ultimately families without reliable access to technology may have decided not to participate. Future research should consider cultural responsiveness and other adaptations to make ASSIST feasible and acceptable to under-represented minority groups or those with fewer socioeconomic resources, who are at greatest risk for disparities in service access during the transition to adulthood (Eilenberg et al., 2019; Shattuck et al., 2011).

Second, because this was the first test of the national ASSIST curriculum, we used an initial implementation approach in which coaches assisted community agencies in delivering the program. Though it was necessary, at the current stage of program development, to test whether the ASSIST curriculum could be implemented with fidelity, it is unclear to what extent the feasibility and acceptability outcomes apply to circumstances where ASSIST is fully implemented in the community (without monitoring and coaching from a research team). Future analyses will examine the extent to which coaching was required in the present initial implementation model, and future studies are needed to test full community implementation of ASSIST.

Third, about one-third of study participants did not fill out the summative evaluation. Though our decision to anonymize the evaluations likely encouraged candor in the feedback we received (which was our intent), it renders us unable to identify who did or did not complete the evaluation. It may be that those who had an unsatisfactory experience with ASSIST were less motivated to provide feedback on the program.

Finally, through the happenstance of COVID-19-related restrictions on social gatherings, we had the unanticipated opportunity to examine participant satisfaction and preference for ASSIST when delivered in-person versus online. More intentional investigations of feasibility, satisfaction, and efficacy of ASSIST delivered via different modalities will be necessary to determine which model of delivery is preferable and more effective.



Chung eun Lee is now at Chonnam National University. This research was supported by the National Institute of Mental Health (R01 MH116058, PI: Taylor), the Vanderbilt Kennedy Center for Excellence in Developmental Disabilities, and The Autism Project of Illinois, with core support from the National Institute of Child Health and Human Development (P50 HD103537, PI: Neul; U54 HD090256, PI: Chang) and the National Center for Advancing Translational Sciences (UL1 TR000445). We are grateful for the support of our community partners during the development and implementation of ASSIST including ASSIST group facilitators (Patty Boheme, Ashley Coulter, Jennifer Espinoza-Forlenza, Carrie Guiden, Loria Hubbard, Linda Totorelli) and the many local experts who lent their time and expertise to the program. We also would like to express our gratitude to the parent volunteers who served as video “hosts” (Linda Brown, Ann Curl, Janet Shouse, John Shouse), and the families who volunteered their time to participate in the project.

Author information




All authors made a significant contribution to the planning, data collection, and/or analytic design. All authors participated in drafting and/or revising the manuscript, and all approved the final version. JLT conceived of the study and participated in all stages of planning, data collection, analysis, and drafting of the manuscript. FP participated in planning, data collection, analysis, and drafting of the manuscript. MB and LSD participated in planning, data collection, contributing to the analytic plan, and interpreting results. CEL participated in planning, analysis, and drafting the manuscript. CR participated in planning and data collection.

Corresponding author

Correspondence to Julie Lounds Taylor.


Supplementary Information


Supplementary file1 (DOCX 30 kb)



Cite this article

Taylor, J.L., Pezzimenti, F., Burke, M.M. et al. Development, Feasibility, and Acceptability of a Nationally Relevant Parent Training to Improve Service Access During the Transition to Adulthood for Youth with ASD. J Autism Dev Disord 52, 2388–2399 (2022).



  • Autism spectrum disorder
  • Families
  • Intervention
  • Transition to adulthood
  • Services