Development and Initial Testing of a Structured Clinical Observation Tool to Assess Pharmacotherapy Competence

  • Original Article
  • Published in Academic Psychiatry

Abstract

Objective

The authors developed a new direct-observation instrument for assessing trainee performance during a medication management session and tested its feasibility and utility.

Methods

The Psychopharmacotherapy-Structured Clinical Observation (P-SCO) instrument was developed from multiple sources of expertise and then implemented in 4 university-based outpatient medication management clinics, with 7 faculty supervising 17 third-year residents. After each faculty observation of a medication management session, residents received feedback in writing (the completed P-SCO) and verbally in person. Targets were 8 P-SCO observations per resident per academic year (0.67 per month) and 16 observations per year completed by each faculty member (1.3 per month). Qualitative thematic analysis was used to compare the frequency, specificity, type (reinforcing vs. corrective), and content of comments documented on the P-SCO forms with those on midpoint and end-of-rotation global assessments completed by the same faculty for the same residents in the same rotation.
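For reference, the monthly targets quoted above are simply the annual targets spread over the 12 months of the academic year (a restatement of the figures given, rounded):

$$
\frac{8\ \text{observations per resident per year}}{12\ \text{months}} \approx 0.67\ \text{per month},
\qquad
\frac{16\ \text{observations per faculty member per year}}{12\ \text{months}} \approx 1.33\ \text{per month}.
$$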

Results

Faculty completed 2.4 (SD=1.2) P-SCOs per month during the study period. Each resident received 1.1 (SD=0.53) P-SCO observations per month. Faculty and residents completed significantly more observations than targeted (p=0.03 and p=0.003, respectively). Two percent of the P-SCOs had no written comments. Less than 3% of the P-SCO comments were nonspecific, compared with 43% of the global assessment comments. Residents received, on average, 3.3 times more total, 2.6 times more reinforcing, and 5.3 times more corrective patient care–specific comments on the P-SCO than on the global assessment (p<0.001). For the numerical ratings, residents received an average of 4.2 “exceeds expectations” and 1.7 “below expectations” ratings on P-SCOs, compared with 2.6 and 0, respectively, on global assessments (p<0.02).

Conclusion

Faculty can feasibly use the P-SCO instrument in a training clinic. Compared with traditional global assessment, the P-SCO provided much more specific feedback, a better balance of corrective to reinforcing comments, and a greater spread of ratings related to competency in pharmacotherapy.



Author information

Corresponding author

Correspondence to John Q. Young, M.D., M.P.P.

About this article

Cite this article

Young, J.Q., Lieu, S., O’Sullivan, P. et al. Development and Initial Testing of a Structured Clinical Observation Tool to Assess Pharmacotherapy Competence. Acad Psychiatry 35, 27–34 (2011). https://doi.org/10.1176/appi.ap.35.1.27

