Introduction to the Special Issue: Acceptance and Commitment Training in Applied Behavior Analysis

The editors of the special issue of Behavior Analysis in Practice on acceptance and commitment training (ACT) in applied behavior analysis solicited a mix of invited and open submissions devoted to ACT from a behavior analytic framework. Drs. Alyssa Wilson, Kate Kellum, and Marianne Jackson served as the special issue’s guest editors. All submitted articles were peer reviewed to determine publication acceptability. We are grateful to the reviewers who evaluated articles for the special issue, as well as to the journal’s editors, Dr. Jonathan Tarbox (previous editor) and Dr. Stephanie M. Peterson (current editor), for their continued support throughout the creation of this special issue.

Recent behavior analytic attention has focused on ACT, as evidenced by a surge of applied behavior analytic workshops, curriculum materials and protocols, and even podcast discussions on the topic. However, although interest has grown, there are few resources specifically designed for certified behavior analysts who wish to use ACT in their clinical practice. The special issue was designed to give applied behavior analysts an opportunity to understand, explore, and critique the model and to gain implementation guidance tailored to them. The articles in the special issue address ACT in applied behavior analysis across three domains: (1) conceptual and theoretical considerations, (2) empirical evidence showcasing the clinical utility of ACT across a range of settings and populations, and (3) practical implementation guidance for applied behavior analysts.

Conceptual and Theoretical Considerations

To further provide clear and cohesive theoretical and conceptual development of ACT, six articles provide unique conceptualizations and critiques of the model. Jonathan Tarbox, Tom Szabo, and Megan Aclan outline how they believe ACT aligns with applied behavior analysts’ scope of practice. The authors provide details on how the ACT model corresponds with the seven dimensions of applied behavior analysis (i.e., applied, analytic, behavioral, effective, technological, conceptually systematic, generalizable; Baer et al., 1987). It is important to note that, although the authors clearly distinguish ways that implementation of acceptance and commitment training is within the scope of practice of the behavior analytic profession, some behavior analysts are not yet convinced. For instance, Cihon et al. (2021) conducted a systematic review of ACT within behavior analysis and argued that empirical evidence from populations with whom behavior analysts are more likely to work is sparse. Victoria Suarez, Emma Moon, and Adel Najdowski draw similar conclusions from their own systematic review in the current special issue. However, Jonathan Tarbox, Tom Szabo, and Megan Aclan affirm that ACT aligns with the Behavior Analyst Certification Board® task list, including components related to functional assessment, motivating operations, use of derived relations (including stimulus equivalence, naming, and derived relational responding), goal setting, self-management, use of social validity, and single-subject designs. They provide an example of how to implement an ACT intervention that targets the emotional behavior of a parent to highlight distinctions between how a behavior analyst may use the intervention strategy and how a psychotherapist may use the same (or similar) strategy.

Emily Sandoz, Evelyn Gould, and Troy DuFrene provide a critical response to Tarbox et al. Sandoz and colleagues suggest that the examples provided by Tarbox et al. imply that the functions of verbal stimuli, verbal behavior, and relationships among behaviors (verbal and nonverbal) can be assumed from their literal meaning (i.e., indirect functional assessment), thus deemphasizing direct functional assessment. These authors suggest that direct functional assessment is not only possible but likely necessary to maximize the effectiveness of ACT as an intervention at the level of the individual. Sandoz et al. center their critique on the explicit use of direct functional assessments and point to avenues for future behavior analytic research on developing useful ways of incorporating functional assessments into clinical practice.

Three articles focus on the interconnection between relational frame theory (RFT; Barnes-Holmes et al., 2001) and ACT. For instance, Amanda Kelly and Michelle Kelly examine how behavior analysts should identify functional relations using RFT prior to using ACT. The authors provide a detailed background on how behavior analysts come to conceptualize private events and how private events influence relational repertoires according to RFT. From there, the authors provide an overview of how applied behavior analysts should incorporate functional assessment when using ACT in their practice to ensure they provide a function-based treatment. Meanwhile, Ruth Anne Rehfeldt and Ian Tyndall explore ways that relational framing, rule following, and experiential avoidance may be related to healthy and unhealthy lifestyle repertoires. In reviewing the literature on ACT and lifestyle repertoires, these authors propose that both individual- and cultural-level interventions are warranted and possible. The authors call for a reconceptualization of health care with strong participation from behavior analysts and conclude with suggestions for scaling up interventions likely to facilitate healthy lifestyle behaviors.

In contrast, Jordan Belisle and Mark Dixon propose an extension of RFT, arguing that an ACT intervention that emphasizes relational framing as a generalized relational response class may not necessarily be effective. The authors argue that relational patterns of responding, including avoidance repertoires, are likely to be highly resistant to change (and, subsequently, resistant to more “traditional” behavior analytic intervention strategies). The use of ACT on these resistant relational repertoires may therefore be a needed tool within the behavior analyst’s toolbox, particularly if other intervention strategies fail to produce robust behavior change. The authors propose a relational dynamical model as an alternative to “a static interpretation of relational frames.” The proposed model needs further empirical support and validation, and it sets the stage for future research to assist behavior analysts in determining which relational repertoires may (or may not) need to be subjected to ACT interventions.

Finally, Mitch Fryling and Linda Hayes highlight how interbehavioral psychology may assist applied behavior analysts with developing and understanding an ACT model without reliance on constructs. These authors suggest that interbehavioral psychology provides a framework from which to distinguish descriptive constructions and the events that are described by those constructions. Further, they argue that such distinctions are particularly important for behavior analysts, because terms frequently associated with ACT (e.g., descriptive constructions like “defusion” and “self-as-context”) may be confused with events of interest, thereby potentially removing focus from those events. In describing interbehavioral fields, with a focus on stimulus substitution and implicit responding, the authors discuss the implications of this interbehavioral perspective for ABA practitioners using ACT.

Practical Implementation Guidance

Three articles provide conceptual overviews and considerations of specific components within the ACT model. Michael Bordieri highlights the component of acceptance as a core treatment process within the ACT model, emphasizing behavior analytic intervention strategies and empirical support from studies that target “acceptance” as an alternative to engagement in experiential avoidance. The article provides behavior analysts with guidelines for implementing acceptance intervention strategies through demonstration of a hypothetical case study. Most important, Bordieri provides guidance for behavior analysts when using acceptance as a functional (rather than topographical) process and provides examples of how to link acceptance strategies with other components within the ACT model.

Olga Berkout and Dana Paliliunas each provide a conceptual overview of values as a core treatment process within the ACT model. Berkout showcases how behavior analysts can use values in their clinical practice, presents a range of commonly used values exercises, and describes how to use values alongside other components, in particular committed action and present-moment awareness. In addition, Berkout provides a clinical breakdown of using metaphors when targeting values in clinical practice and offers clear clinical guidelines and considerations for behavior analysts on developing and using metaphors.

Paliliunas also aligns her conceptualization of values with an RFT account of hierarchical, temporal, deictic, and causal frames that come to have augmentative reinforcing functions. This analysis provides insight into core mechanisms, in particular how relational repertoires and verbal establishing stimuli may influence the extent to which values-consistent repertoires are emitted (or avoided). Finally, Paliliunas provides a thorough overview of the future research needed to further the behavior analytic development of values-based intervention strategies.

Empirical Support of ACT

Five articles provide empirical evidence in support of behavior analytic use of ACT across a range of settings and populations. Victoria Suarez, Emma Moon, and Adel Najdowski conducted a systematic review of ACT components used with individuals with autism and developmental disabilities. A total of 30 studies (29 articles) were identified and analyzed across 12 different dependent measures. When analyzing global outcomes, the authors calculated the percentage of nonoverlapping data (PND; Scruggs et al., 1987). It is interesting that only eight studies resulted in PND scores above 90%, meaning that fewer than 10% of intervention-phase data points overlapped with the range of the baseline data. Further, five studies resulted in PND scores between 70% and 89%, suggesting that only 13 studies (43%) demonstrated effective results. These results highlight how behavior analysts have more work to do when it comes to determining the effectiveness of ACT components and complete treatment models when working with individuals with neurodevelopmental and developmental disorders.
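
For readers less familiar with the metric, PND is the proportion of intervention-phase data points that fall outside the range of the baseline data, that is, beyond the most extreme baseline value in the therapeutic direction (Scruggs et al., 1987). A minimal sketch of that calculation, written here in Python for illustration only (the function name and the example data are hypothetical, not drawn from Suarez et al.), might look like the following:

    def percentage_of_nonoverlapping_data(baseline, intervention, increase_expected=True):
        """Compute PND (Scruggs et al., 1987) for a single-subject data set.

        A point is "nonoverlapping" when it exceeds the highest baseline value
        (for interventions expected to increase behavior) or falls below the
        lowest baseline value (for interventions expected to decrease behavior).
        """
        if increase_expected:
            threshold = max(baseline)
            nonoverlapping = [x for x in intervention if x > threshold]
        else:
            threshold = min(baseline)
            nonoverlapping = [x for x in intervention if x < threshold]
        return 100 * len(nonoverlapping) / len(intervention)

    # Hypothetical on-task data: 9 of the 10 intervention points exceed the
    # highest baseline value (4), so PND = 90%.
    baseline = [2, 3, 2, 4, 3]
    intervention = [5, 6, 4, 7, 6, 8, 5, 7, 6, 9]
    print(percentage_of_nonoverlapping_data(baseline, intervention))  # prints 90.0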

Alyssa Wilson, Emily Dzugan, and Victoria Hutchinson used Dixon’s (2014) ACT curriculum with three male students in a special education school. Using a nonconcurrent multiple baseline with embedded reversal design, the authors compared the effects of individualized ACT exercises (derived from the curriculum) with treatment-control conditions on classroom behaviors. Results showed that during individualized ACT sessions, participants’ engagement in on-task behaviors improved compared to baseline and treatment-control conditions. Mark Dixon, Dana Paliliunas, Jennifer Weber, and Ayla Schmick showcase a naturalistic study on the usefulness of AIM (Dixon & Paliliunas, 2018) when implemented across a single public school (n = 318). Middle-school students received daily exposure to AIM lessons. At the end of the school year, students reported improved psychological flexibility (as measured by the Avoidance and Fusion Questionnaire for Youth [AFQ-Y]; Greco et al., 2008) and increased mindfulness (as measured by the Child and Adolescent Mindfulness Measure [CAMM]; Kuby et al., 2015). Further, state testing scores also increased following the school-wide implementation of AIM when compared to previous years. Although the study is not experimental, it does showcase how large-scale implementation of ACT may prove beneficial. It is clear, however, that additional research is needed to validate the utility of ACT when applied in school settings.

Theodore Issen, Jessica Hinman, and Mark Dixon provide some evidence of the effectiveness of a brief present-moment awareness intervention on paraprofessionals’ accuracy of data collection and staff-initiated interactions. Because both dependent variables are important for skill development and intervention for students and clients, we agree that research on interventions that target staff skill improvement is critical. Three experienced paraprofessionals participated in a multiple baseline across participants design in a day school providing services for students ages 5–21 years. Following a 10-min guided present-moment awareness activity, the researchers observed average increases in data accuracy and staff-initiated interactions. The authors conclude with suggestions for future research focused on ACT-based interventions for staff.

Callan Gilsenan, Zhihui Yi, Jessica Hinman, Becky Barron, and Mark Dixon examined the impact of targeting improvement of derived relational responding repertoires on participation in ACT sessions. The researchers directly measured idiographic maladaptive behaviors exhibited by three children and indirectly measured participation during sessions using the AIM curriculum (Dixon & Paliliunas, 2018) during baseline and intervention. Following intervention, maladaptive behaviors decreased for one of the three participants, and ratings of participation in activities increased for each participant. Results suggest that the effectiveness of, and even the ability to participate effectively in, ACT sessions might be augmented by improving relational responding repertoires. The authors warn that the “universal application of ACT” may lead to mixed results without attention to foundational repertoires. As with the conclusions drawn by other studies included in the special issue, additional research in this area is needed.

Future of ACT in Applied Behavior Analytic Practice

Behavior analysts are likely equipped to use ACT in their scope of practice, depending upon how ACT is considered and implemented. We hope that the current special issue spurs further debate and dialogue within the behavior analytic community about topics beyond “Can I use ACT if I am a behavior analyst?” The use of mid-level terms in the model is one area that behavior analysts should continue to discuss. Although some of the articles in the special issue unpack mid-level terms (e.g., see the articles by Michael Bordieri, Mitch Fryling and Linda Hayes, and Dana Paliliunas), no article pushed the boundaries, per se, on the clinical need for or utility of these terms when used by applied behavior analysts. We believe that these terms may impede the functional utility of the ACT model, in particular when used by applied behavior analysts. For instance, the words spoken by a client or a parent are not “real,” per se, in the same way that “defusion” and even “reinforcement” are not real. Rather, observational processes allow the term “defusion” or “reinforcement” to be used only after a specific behavior and environmental event (such as the delivery or presentation of a specific stimulus). It is only after those events and behaviors have been observed over time, under specific conditions or schedules, that we can consider “reinforcement” or “defusion” as terms that describe “real” environmental variables. Skinner (1945) argued that psychological terms can be used to identify manipulable intervening variables rather than hypothetical constructs. It is unclear whether ACT terms like “defusion” and “self-as-context” function as intervening variables. Behavior analysts may work more effectively by returning to their “roots”: that is, by agreeing to focus on observable events and to use function-based interventions. The extent to which such an approach includes the use of and reliance on mid-level terminology is yet to be determined.

The use of functional analyses within behavior analytic practice, and in particular within an ACT framework, was also a theme within submitted and accepted articles for the special issue (e.g., Jonathan Tarbox et al., Emily Sandoz et al., and Amanda Kelly and Michelle Kelly). We believe it is imperative that applied behavior analysts always look for functional relations in the environment before and during all clinical practice. It is this focus (i.e., identifying functional relationships between environmental stimuli and behaviors) that is the bread and butter of applied behavior analysis. What is less clear, however, is how behavior analysts conduct functional rather than topographical analyses when using ACT in their practice. Given the limitations of the research published to date, behavior analysts must continue to ask themselves, “How do I know what I am doing is working? How do I know if ACT is a functional intervention for this person in front of me?” Perhaps the simplest way to answer these questions is to use single-subject experimental designs and to rely on measurement of observable events. Behavior analysts must continue to select and directly measure observable behaviors, whether they use ACT or another intervention strategy altogether. Without this, there is no way to answer the previous questions about the effectiveness of the intervention. In addition, these changes must be analyzed using within-subject designs to make valid claims about the effects of ACT strategies on these behaviors.

Finally, more research is needed to determine when ACT should be selected as the treatment intervention. At present, it is not clear whether ACT should be the first treatment selected or whether ACT should be selected only after more traditional intervention strategies have failed. Further, it is not clear whether some clients may respond better to ACT interventions than others, or what prerequisite skills an individual may need in their repertoire for an ACT intervention to be effective. Behavior analysts should always consider the assessments selected to identify conditions under which a target (and alternative) behavior is more or less likely to occur. Such assessments may help the behavior analyst identify why ACT is the treatment of choice for any individual client. These questions should not be answered after the fact, but instead should be asked before the behavior analyst implements any ACT intervention. Additional research on these topics and components is sorely needed and will be exceptionally useful for behavior analysts. More research on strategies for teaching behavior analysts how to implement ACT is also needed.

We hope that this special issue sets the stage for more behavior analysts to empirically investigate the use of ACT strategies in clinical practice. The articles selected for the special issue provide an overview of the state of ACT within applied behavior analysis, and are both supportive and critical of the model. As a field, we still have more work to do before we consider ACT a hallmark of applied behavior analytic clinical practice. We look forward to seeing how the practice of ACT unfolds as a part of clinical practice and believe the special issue provides the first step towards motivating future behavior analysts to push the boundaries of the ACT model.

References

  1. Baer, D. M., Wolf, M. M., & Risley, T. R. (1987). Some still-current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 20(4), 313–327.

  2. Barnes-Holmes, Y., Hayes, S. C., Barnes-Holmes, D., & Roche, B. (2001). Relational frame theory: A post-Skinnerian account of human language and cognition. Advances in Child Development and Behavior, 28, 101–138.

  3. Cihon, J. H., Ferguson, J. L., Leaf, J. B., Milne, C. M., Leaf, R., & McEachin, J. (2021). Acceptance and commitment training: A review of the research. European Journal of Behavior Analysis, 1–21. 

  4. Dixon, M. R. (2014). ACT for children with autism and emotional challenges. Shawnee Scientific Press.

  5. Dixon, M. R., & Paliliunas, D. (2018). AIM: A behavior analytic curriculum for social-emotional development in children. Shawnee Scientific Press.

  6. Greco, L. A., Lambert, W., & Baer, R. A. (2008). Psychological inflexibility in childhood and adolescence: Development and evaluation of the Avoidance and Fusion Questionnaire for Youth. Psychological Assessment, 20(2), 93.

  7. Kuby, A. K., McLean, N., & Allen, K. (2015). Validation of the Child and Adolescent Mindfulness Measure (CAMM) with non-clinical adolescents. Mindfulness, 6(6), 1448–1455.

  8. Scruggs, T. E., Mastropieri, M. A., & Casto, G. (1987). The quantitative synthesis of single-subject research: Methodology and validation. Remedial & Special Education, 8(2), 24–33.

  9. Skinner, B. F. (1945). The operational analysis of psychological terms. Psychological Review, 52(5), 270–277. https://doi.org/10.1037/h0062535
