Setting and participants
This study occurred from July 2018 through June 2019 at Montefiore Medical Center, an academic medical center in Bronx, NY. The system provides primary and specialty care in a network of outpatient practices with over three million visits annually. Primary care practices include teaching sites, which are Federally Qualified Health Centers (FQHCs) or FQHC Look-Alikes with medical trainees and attending physicians who hold academic appointments; neighborhood sites, which are FQHCs without trainees; and group practices without trainees. Most patients (approximately 75%) are publicly insured by Medicaid and/or Medicare. The system uses Epic electronic medical record (EMR) software. This study was approved by the Montefiore/Albert Einstein Institutional Review Board.
We assessed our opt-in choice eConsult program using a stepped-wedge, cluster randomized design (Fig. 1). Sixteen internal medicine and family medicine practices were classified by type (teaching site, neighborhood site, or group practice) and then assigned to one of three clusters using stratified random sampling, with practice type as the stratifying variable. We chose to scale our intervention using a stepped-wedge cluster randomized design to (1) quantify demand for eConsults over time to inform training needs for each eConsult specialty; (2) provide on-site training for each primary care practice during standing meetings; and (3) evaluate the effect of our program on outcomes (defined below).
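The stratified random assignment described above can be sketched as follows. This is a minimal illustration, not the study's actual procedure: the practice names, the counts per stratum, and the round-robin dealing rule are all assumptions made for the example.

```python
import random

# Hypothetical roster of 16 practices labeled by type (illustrative only;
# the real distribution of practice types was not specified here).
practices = (
    [(f"teaching_{i}", "teaching") for i in range(6)]
    + [(f"neighborhood_{i}", "neighborhood") for i in range(6)]
    + [(f"group_{i}", "group") for i in range(4)]
)

def stratified_clusters(practices, n_clusters=3, seed=2018):
    """Assign practices to clusters, stratifying by practice type.

    Within each stratum (practice type), practices are shuffled and
    dealt round-robin across clusters, so each cluster receives a
    similar mix of practice types.
    """
    rng = random.Random(seed)
    clusters = {c: [] for c in range(1, n_clusters + 1)}
    strata = {}
    for name, ptype in practices:
        strata.setdefault(ptype, []).append(name)
    for ptype, members in strata.items():
        rng.shuffle(members)
        for i, name in enumerate(members):
            clusters[i % n_clusters + 1].append((name, ptype))
    return clusters

clusters = stratified_clusters(practices)
for c, members in clusters.items():
    print(c, [name for name, _ in members])
```

Dealing shuffled strata round-robin guarantees each cluster contains every practice type, which is the point of stratifying before randomizing.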
Our opt-in choice eConsult program was designed by an interdisciplinary team after reviewing existing eConsult programs and the framework for establishing an e-consultation service described by Liddy et al.1,9–13 The eConsult team was directed by a primary care physician (PCP; SR) with expertise in quality improvement and included physician champions from primary care (JA, JD), specialty care (YT, EE), medical informatics (MB), administrators, and referral coordinators. A pilot program was first tested at three primary care practices; the pilot practices were not included in this study. The pilot program was supported by the Department of Medicine, and program expansion was supported by the Medicaid Delivery System Reform Incentive Payments (DSRIP) program.
Primary care implementation
A focus group was held with PCPs from five sites during a monthly PCP meeting. The focus group was facilitated by the eConsult director (SR), and an observer recorded participants’ responses on a whiteboard. The facilitator asked: (1) What are barriers to specialty care? (2) What is an ideal workflow for an eConsult program? (3) What concerns do you have with an eConsult program? The eConsult team reviewed responses using thematic analysis, identifying the following areas for improvement, which informed the intervention: (1) lack of a triage system for urgent in-person visits led to the development of an appointment-scheduling structure following an eConsult; (2) concern that patient care responsibilities would be shifted from specialists to PCPs and that appointment decision-making autonomy would be lost led to the opt-in choice eConsult model; and (3) concern about the quality of specialist eConsult recommendations led to the use of audit and feedback for eConsultant recommendations.
PCPs at each primary care practice received an in-person introduction to the program during monthly practice meetings prior to local site implementation. These 30-min training sessions were led by the eConsult program director (SR) and framed as a quality improvement initiative for access to specialty expertise. The sessions included background on problems identified by PCPs, the eConsult workflow, and a request for ongoing feedback to improve the program. Outcomes were reviewed at quarterly meetings with medical directors and administrative leaders. Additionally, PCPs received compensation corresponding to 0.25 wRVU per completed eConsult.
All eConsults were reviewed weekly by the eConsult team for audit and feedback. The eConsult director (SR) e-mailed PCPs to discuss the following: quality of questions (e.g., omission of specific questions or requesting appointment only), incomplete workflows, and feedback for ambiguous eConsult recommendations.
We identified specialties (endocrinology, hematology, gastroenterology, rheumatology) with the greatest need for improvement based on wait times and perceptions that many consult questions could be answered without an in-person visit. eConsultants were selected based on interest in participating. Initially, one eConsultant was identified per specialty, with additional eConsultants recruited as demand grew. Each specialty designed its own workflow to schedule patients requiring expedited appointments, often overbooking appointment slots.
Each eConsultant received an in-person introduction to the program prior to formal participation, including background on program goals, workflow training, and opportunity to ask questions and provide suggestions. These 60-min sessions were led by the program director (SR) for the first group of eConsultants. New eConsultants were subsequently trained by their same-specialty colleagues. eConsultants received compensation corresponding to 0.5 wRVU per completed eConsult. eConsultants participated in monthly meetings with the eConsult team to review and improve outcomes, quality of eConsult communication, and workflow. The program director (SR) also communicated with eConsultants through e-mail as part of the weekly audit and feedback process.
Our intervention was our opt-in choice eConsult program. We designed an EMR-based workflow in which, to place an eConsult order, the PCP completes two free-text prompts:
The order was directed to the specialty eConsultant, who was expected to respond within three business days using the following free-text fields:
“Assessment and recommendations:”
“Types of recommendations provided (Select all that apply): Tests, Treatments, Expedited appointment, Regular appointment, Procedure, Alternate specialty recommended, No further evaluation needed by this specialty, More information needed prior to recommendation.”
The “types of recommendations provided” section allowed the eConsult team to track and review outcomes. The recommendation was sent back to the PCP with opportunity for iterative communication or to acknowledge next steps. The eConsult communication was available in the EMR as part of a patient’s chart. Resident physicians could place eConsult orders under supervision of attending physicians.
PCPs could continue to use traditional referrals for specialty in-person appointments throughout the study. In this workflow, a PCP ordered an electronic referral, after which either the patient or a referral coordinator scheduled the appointment. This process varied across practices and specialties. If a timely appointment was not available, the PCP often called to request an expedited appointment. The traditional referral did not require a specific consult question. Following an in-person appointment, the progress note was forwarded to the referring PCP; however, there was no direct communication between the PCP and specialist.
To evaluate the implementation of our opt-in choice eConsult program, we measured the eConsult rate: weekly proportion of eConsults per primary care visit at each site. To evaluate our intervention’s impact on demand for traditional referrals, we measured traditional referral rate: weekly proportion of referrals per primary care visit at each site. Visit, referral and eConsult data were obtained from the EMR. Referral data were restricted to new patient requests (no visit in the last 18 months) for eConsult-participating specialties. To assess barriers and facilitators to implementation, we conducted Web-based quarterly surveys of PCP experiences with eConsults and traditional referrals.
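The two rates defined above can be computed as simple weekly proportions per site. A minimal sketch, assuming weekly counts of visits, eConsults, and referrals per site are already extracted from the EMR (all numbers below are invented for illustration):

```python
# Weekly counts per site (invented numbers for illustration).
weekly = [
    {"site": "A", "week": 1, "visits": 500, "econsults": 10, "referrals": 40},
    {"site": "A", "week": 2, "visits": 480, "econsults": 12, "referrals": 35},
    {"site": "B", "week": 1, "visits": 300, "econsults": 3, "referrals": 30},
]

def weekly_rates(rows):
    """Return per-site, per-week eConsult and traditional referral rates,
    each expressed as a proportion of primary care visits that week."""
    return [
        {
            "site": r["site"],
            "week": r["week"],
            "econsult_rate": r["econsults"] / r["visits"],
            "referral_rate": r["referrals"] / r["visits"],
        }
        for r in rows
    ]

for r in weekly_rates(weekly):
    print(r)
```

Expressing both outcomes as proportions of visits, rather than raw counts, is what lets sites of very different volumes be compared on the same chart and in the same models.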
Primary care practice characteristics (frequency of visits, referrals, and eConsults) were examined using descriptive statistics. Weekly proportions of referrals and eConsults were examined using a statistical process control p-chart. A linear mixed model (model 1) was constructed to evaluate whether the weekly eConsult rate differed by practice type (teaching, neighborhood, or group), adjusted for cluster and concurrent traditional referral rate. Linear mixed models (models 2A and 2B) were constructed to evaluate whether traditional referral rate at each practice site decreased after implementation of the intervention (eConsult available: no or yes), adjusted for practice type (teaching, neighborhood, or group) and cluster. In model 2B, we examined whether the practice type modified the traditional referral rate after implementation of the intervention (interaction variable for eConsult available and practice type). In all models, to account for within-site correlations over time, the covariance matrix of the error term was assumed to be a full Toeplitz matrix, which can be viewed as an autoregressive structure.
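The statistical process control p-chart used to examine the weekly proportions can be sketched as below. The center line is the pooled proportion across all weeks, and the control limits sit three standard errors above and below it, with the standard error depending on each week's number of visits, so the limits vary week to week. The weekly counts are invented for illustration; this is a generic p-chart computation, not the study's analysis code.

```python
import math

def p_chart_limits(events, trials):
    """Compute p-chart center line and 3-sigma control limits.

    events[i] = number of eConsults (or referrals) in week i
    trials[i] = number of primary care visits in week i
    Limits vary by week because the denominator varies.
    """
    p_bar = sum(events) / sum(trials)  # pooled proportion (center line)
    limits = []
    for n in trials:
        half_width = 3 * math.sqrt(p_bar * (1 - p_bar) / n)
        # Proportions are bounded, so clip the limits to [0, 1].
        limits.append((max(0.0, p_bar - half_width), min(1.0, p_bar + half_width)))
    return p_bar, limits

# Invented weekly counts for illustration.
events = [10, 14, 8, 20, 16]
trials = [500, 520, 480, 510, 505]
p_bar, limits = p_chart_limits(events, trials)
print(round(p_bar, 4), [(round(lo, 4), round(hi, 4)) for lo, hi in limits])
```

Weeks whose observed proportion falls outside these limits signal special-cause variation, which is how a stepped-wedge rollout shows up on the chart as each cluster's eConsult rate shifts after implementation.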