Challenges in the Implementation of Measurement Feedback Systems


This commentary on the articles published in the special section on the development and implementation of measurement feedback systems (MFSs) discusses three challenging themes in the process of MFS implementation: design and planning, organizational context, and sustainability and unintended consequences. It is argued that the implementation of MFSs is complex, but is an important step in improving outcomes in routine care for children and young persons.

In the past decade, the implementation of measurement feedback systems (MFSs), also referred to as progress feedback or (routine) outcome monitoring, has taken a leap worldwide. Several countries have mandated the use of MFSs as part of routine care (e.g. United Kingdom, Australia, Norway, the Netherlands), and in other countries large public or private initiatives exist (e.g. United States, Germany, Chile). By using an MFS, the client’s progress in treatment is tracked by frequent administration of standardized measures. The MFS supports the clinician in deciding to adapt treatment when insufficient progress has been made. There are several feedback systems available (e.g. OQ Measures, PCOMS, TOP), and by now MFSs have been introduced in a variety of settings (inpatient, outpatient, group and individual therapy), populations (e.g. youth, adults, elderly) and disorders (e.g. addiction, common mental disorders, eating disorders) (Bickman et al. 2011; Crits-Christoph et al. 2010; Kraus et al. 2005; Lambert et al. 2004; Miller et al. 2005; Probst et al. 2013; Simon et al. 2013). Some studies have found large effects of using MFSs (Shimokawa et al. 2010), but a recent review suggests that the effects can vary substantially over studies (Krägeloh et al. 2015). A potential explanation for this variation in effectiveness might be the way in which MFSs have been implemented (de Jong 2014). For example, two recent studies found that half of the clinicians did not use the feedback they were provided with (e.g. De Jong et al. 2012; Simon et al. 2012). As such, it is worth taking a closer look at the processes associated with the implementation of MFSs, as the articles in this special section aim to do. Dixon-Woods et al. 
(2012) analyzed evaluation reports from a large number of quality improvement programs in the UK, and identified three themes for implementation: (1) Design and planning of the improvement intervention; (2) Organizational and institutional contexts, professions and leadership; and (3) Beyond the intervention: sustainability, spread and unintended consequences. The articles in this special section will be discussed within the context of these themes.

Design and Planning

The first implementation theme is the design and planning of the introduction of the MFS. As Dixon-Woods et al. (2012) define it, this includes the process of convincing clinicians, staff and management that there is a problem (e.g. outcomes are not good enough) for which the implementation of the MFS is the solution (e.g. MFSs improve outcomes). Although none of the articles in this section mentions explicitly how this issue was addressed, in Gleacher et al. (2015) a clinician mentions that once (s)he saw the value of the MFS, (s)he really started to buy into it. Research has shown that clinicians can have quite negative attitudes towards MFSs (Walter et al. 1998), and that these attitudes predict active use of the MFS (De Jong et al. 2012). That makes clinicians’ attitudes an important target in the implementation process.

The design of the MFS is another important factor within this theme. MFS design has been discussed in detail in several of the articles in this section (Bruns et al. 2015; Nadeem et al. 2015; Steinfeld et al. 2015). Given the technical complexity of MFSs, and a group of users that is not necessarily “computer savvy”, the design of the MFS is extremely important. Bickman and colleagues report that problems with their MFS were mentioned by clinicians as the main barrier to implementation in their study (Bickman et al. 2014). An MFS needs to fit its users’ needs. Specifically regarding the technology aspects, Lyon et al. (2015) provide a framework for optimizing existing software packages to fit the needs of mental health care organizations, in which they put a strong emphasis on the involvement of future users. Given that the technical aspect is one of the factors that complicates implementing an MFS, compared to other implementation processes, the article by Lyon et al. is a valuable addition to the implementation literature.

Organizational Context

The second theme that Dixon-Woods and colleagues address is organizational and institutional contexts, professions, and leadership. This is an important theme in the articles in this special section. Both Nadeem et al. and Gleacher et al. conclude that higher organizational and leadership support improved implementation of the MFS. Interviews with clinicians in the article by Bickman et al. shed light on what type of support was particularly helpful: on-site implementation support, and having a local champion in using the MFS. These results are in line with the conclusion of Aarons et al. (2014) that leaders have a key role in promoting an implementation climate. Aarons et al. stress that alignment across multiple levels of leadership is especially important. A study by Torrey et al. (2012) found that implementation success was correlated with leadership devoted to redesigning the work flow and reinforcing implementation through measurement and feedback. Dixon-Woods et al. (2012) suggest that a ‘quieter’ leadership style might be more successful: fewer bombastic declarations and more working to facilitate collaboration.

Sustainability and Unintended Consequences

The third theme mentioned by Dixon-Woods et al. (2012) refers to sustainability and unintended consequences. In the studies in this special issue, with the exception of Steinfeld et al., the implementation of the MFSs was heavily supported by research teams. This may make sustainability a potential issue. Especially when MFSs are implemented as part of a specific project, there appears to be a risk that clinicians and managers lose interest at the project’s end, when they are faced with new, competing priorities (Dixon-Woods et al. 2012). As Lyon et al. point out, the adaptation of the MFS is an ongoing process, one that in my experience is often underestimated at the start by management, researchers and even the software engineers. However, it is possible to implement an MFS successfully without external support. For example, Steinfeld et al. (this issue) achieved impressive percentages of complete data, and a majority of their clinicians accessed the feedback before seeing the patient, all without external support. The key to their success seems to be keeping the process simple and taking one small step at a time.

The issue of unintended consequences is a complex one. Research on the efficacy of MFSs has predominantly taken place in adult populations. To my knowledge, only two randomized controlled studies have been conducted in youth mental health care (Bickman et al. 2011, 2014), resulting in a relatively narrow research base for the implementation of MFSs for children and young persons. Even in adult populations, in which MFSs have been studied more extensively (e.g. Davidson et al. 2014; Krägeloh et al. 2015; Shimokawa et al. 2010), we do not know much about potential unhelpful effects of feedback. A recent study suggested that providing feedback in a long-term inpatient and day patient psychotherapy setting was initially associated with an increase in symptoms in patients with borderline personality disorder and patients with personality disorder not otherwise specified, although these effects diminished after several weeks (De Jong et al., submitted). In youth mental health care, there may also be groups of patients for whom feedback is not helpful (e.g. developmental disorders, high severity clients). Additionally, new research suggests that feedback may also be unhelpful for therapists who are motivated by preventing failure (prevention focus) rather than by obtaining success (promotion focus; De Jong and De Goede 2015). More research is needed to study the potentially unwanted effects of MFSs.

A different class of potential unwanted effects has to do with forming partnerships with industry. Modern MFSs have become so technologically complex and costly that researchers often need to form partnerships with software developers in order to develop an MFS (as, for example, was done by Bruns et al. 2015). Often, software developers are co-investors in these projects. Sometimes scientists buy into an MFS financially as well, or benefit from the sales of a package or from giving training in the use of the MFS. These situations may lead to conflicts of interest, or to situations in which shared creative ownership of the MFS leads to problems in the continuation of a research line (as Bickman et al. mention in their article).


The articles in this special section teach us that the implementation of MFSs is a complex process in which many challenges need to be faced. They also show us good examples of successful implementation processes and creative solutions to the challenges faced, which will be extremely helpful to future implementers of MFSs. The implementation of MFSs in routine care has the potential to improve outcomes for large groups of future clients. This is especially important given that the effect sizes of interventions in routine care are much smaller than those found in clinical trials (Hansen et al. 2002; Weisz et al. 1995). Moreover, 21% of children and young people deteriorate significantly during their care episode (Warren et al. 2009). This means that the results we are getting in routine care are not good enough. The implementation of MFSs in routine care is an important step in improving outcomes, one that has the potential to affect quality of life for thousands of children and young persons worldwide.

It should be mentioned that so far, the research on the use of MFSs in youth mental health care has primarily taken place in the US. More research in other countries is necessary, especially given that care systems and the accessibility of care are quite different elsewhere. For instance, in many European countries, youth mental health care is either freely accessible or available for a relatively affordable co-pay. Additionally, the level of training for clinicians may differ substantially between countries, which may impact both outcomes and the implementation process for MFSs. It is encouraging to see that the implementation of MFSs in the US has taken flight, and hopefully the articles in this special section will inspire others to start thinking about implementing an MFS in their own setting.


  1. Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014). Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annual Review of Public Health, 35(1), 255–274. doi:10.1146/annurev-publhealth-032013-182447.

  2. Bickman, L., Douglas Kelley, S., Breda, C., De Andrade, A. R., & Riemer, M. (2011). Effects of routine feedback to clinicians on youth mental health outcomes: A randomized cluster design. Psychiatric Services, 62, 1423–1429. doi:10.1176/

  3. Bickman, L., Douglas, S. R., De Andrade, A. R. V., Tomlinson, M., Gleacher, A., Olin, S., & Hoagwood, K. (2014). Implementing a measurement feedback system: A tale of two sites. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0647-8.

  4. Bruns, E. J., Hyde, K. L., Sather, A., Hook, A. N., & Lyon, A. R. (2015). Applying user input to the design and testing of an electronic behavioral health information system for wraparound care coordination. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0658-5.

  5. Crits-Christoph, P., Ring-Kurtz, S., McClure, B., Temes, C., Kulaga, A., & Gallop, R. (2010). A randomized controlled study of a web-based performance improvement system for substance abuse treatment providers. Journal of Substance Abuse Treatment, 38(3), 251–262. doi:10.1016/j.jsat.2010.01.001.

  6. Davidson, K., Perry, A., & Bell, L. (2014). Would continuous feedback of patient’s clinical outcomes to practitioners improve NHS psychological therapy services? Critical analysis and assessment of quality of existing studies. Psychology and Psychotherapy: Theory, Research and Practice. doi:10.1111/papt.12032.

  7. de Jong, K. (2014). Deriving implementation strategies for outcome monitoring feedback from theory, research and practice. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-014-0589-6.

  8. De Jong, K., & De Goede, M. (2015). Why do some therapists not deal with outcome monitoring feedback? A feasibility study on the effect of regulatory focus and person–organization fit on attitude and outcome. Psychotherapy Research, 25(6), 661–668. doi:10.1080/10503307.2015.1076198.

  9. De Jong, K., van Sluis, P., Nugter, M. A., Heiser, W. J., & Spinhoven, P. (2012). Understanding the differential impact of outcome monitoring: Therapist variables that moderate feedback effects in a randomized clinical trial. Psychotherapy Research, 22(4), 464–474. doi:10.1080/10503307.2012.673023.

  10. De Jong, K., Segaar, J., Van Ingenhoven, T., Busschbach, J. V., & Timman, R. Adverse effects of feedback in inpatients and day patients with personality disorders. Unpublished manuscript.

  11. Dixon-Woods, M., McNicol, S., & Martin, G. (2012). Ten challenges in improving quality in healthcare: lessons from the Health Foundation’s programme evaluations and relevant literature. BMJ Quality & Safety. doi:10.1136/bmjqs-2011-000760.

  12. Gleacher, A. A., Olin, S. S., Nadeem, E., Pollock, M., Ringle, V., Bickman, L., et al. (2015). Implementing a measurement feedback system in community mental health clinics: A case study of multilevel barriers and facilitators. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0642-0.

  13. Hansen, N. B., Lambert, M. J., & Forman, E. M. (2002). The psychotherapy dose-response effect and its implications for treatment delivery services. Clinical Psychology: Science and Practice, 9(3), 329–343. doi:10.1093/clipsy.9.3.329.

  14. Krägeloh, C. U., Czuba, K. J., Billington, D. R., Kersten, P., & Siegert, R. J. (2015). Using feedback from patient-reported outcome measures in mental health services: A scoping study and typology. Psychiatric Services, 66(3), 224–241. doi:10.1176/

  15. Kraus, D. R., Seligman, D. A., & Jordan, J. R. (2005). Validation of a behavioral health treatment outcome and assessment tool designed for naturalistic settings: The Treatment Outcome Package. Journal of Clinical Psychology, 61(3), 285–314. doi:10.1002/jclp.20084.

  16. Lambert, M. J., Morton, J. J., Hatfield, D. R., Harmon, C., Hamilton, S., & Shimokawa, K. (2004). Administration and scoring manual for the OQ-45.2 (Outcome Questionnaire) (3rd ed.). Wilmington, DE: American Professional Credentialing Services LLC.

  17. Lyon, A. R., Wasse, J. K., Ludwig, K., Zachry, M., Bruns, E. J., Unützer, J., et al. (2015). The contextualized technology adaptation process (CTAP): Optimizing health information technology to improve mental health systems. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0637-x.

  18. Miller, S. D., Duncan, B. L., Sorrell, R., & Brown, G. S. (2005). The partners for change outcome management system. Journal of Clinical Psychology, 61(2), 199–208. doi:10.1002/jclp.20111.

  19. Nadeem, E., Cappella, E., Holland, S., Coccaro, C., & Crisonino, G. (2015). Development and piloting of a classroom-focused measurement feedback system. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0651-z.

  20. Probst, T., Lambert, M. J., Loew, T. H., Dahlbender, R. W., Göllner, R., & Tritt, K. (2013). Feedback on patient progress and clinical support tools for therapists: Improved outcome for patients at risk of treatment failure in psychosomatic in-patient therapy under the conditions of routine practice. Journal of Psychosomatic Research, 75(3), 255–261. doi:10.1016/j.jpsychores.2013.07.003.

  21. Shimokawa, K., Lambert, M. J., & Smart, D. W. (2010). Enhancing treatment outcome of patients at risk of treatment failure: Meta-analytic and mega-analytic review of a psychotherapy quality assurance system. Journal of Consulting and Clinical Psychology, 78, 298–311. doi:10.1037/a0019247.

  22. Simon, W., Lambert, M. J., Harris, M. W., Busath, G., & Vazquez, A. (2012). Providing patient progress information and clinical support tools to therapists: Effects on patients at risk of treatment failure. Psychotherapy Research, 22(6), 638–647. doi:10.1080/10503307.2012.698918.

  23. Simon, W., Lambert, M. J., Busath, G., Vazquez, A., Berkeljon, A., & Hyer, K. (2013). Effects of providing patient progress feedback and clinical support tools to psychotherapists in an inpatient eating disorders treatment program: A randomized controlled study. Psychotherapy Research, 23(3), 287–300. doi:10.1080/10503307.2013.787497.

  24. Steinfeld, B., Franklin, A., Mercer, B., Fraynt, R., & Simon, G. (2015). Progress monitoring in an integrated health care system: Tracking behavioral health vital signs. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-015-0648-7.

  25. Torrey, W., Bond, G., McHugo, G., & Swain, K. (2012). Evidence-based practice implementation in community mental health settings: the relative importance of key domains of implementation activity. Administration and Policy in Mental Health and Mental Health Services Research, 39(5), 353–364. doi:10.1007/s10488-011-0357-9.

  26. Walter, G., Cleary, M., & Rey, J. M. (1998). Attitudes of mental health personnel towards rating outcome. Journal of Quality in Clinical Practice, 18(2), 109–115.

  27. Warren, J., Nelson, P., & Burlingame, G. (2009). Identifying youth at risk for treatment failure in outpatient community mental health services. Journal of Child and Family Studies, 18(6), 690–701. doi:10.1007/s10826-009-9275-9.

  28. Weisz, J. R., Weiss, B., Han, S. S., Granger, D. A., & Morton, T. (1995). Effects of psychotherapy with children and adolescents revisited: A meta-analysis of treatment outcome studies. Psychological Bulletin, 117(3), 450–468. doi:10.1037/0033-2909.117.3.450.

Author information



Corresponding author

Correspondence to Kim de Jong.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

de Jong, K. Challenges in the Implementation of Measurement Feedback Systems. Adm Policy Ment Health 43, 467–470 (2016).


Keywords

  • Outcome monitoring
  • Feedback
  • Implementation