Deceased donor audits (DA), also known as death audits, are retrospective medical chart reviews of ventilated hospital deaths that help estimate the number of potential deceased donors and measure system performance at each step along the donation pathway (Fig. 1).1 Accurate estimates of organ donor potential and identification of missed donation opportunities (MDOs) are fundamental to quality assurance and improvement. Missed donation opportunities are safety events in which best or expected practice has not occurred at one or more points along the donation pathway (Fig. 1). Missed donation opportunities can result in preventable harm both to patients at the end of life and their families, who are denied the opportunity to donate, and to patients on transplant waitlists, who are denied access to life-saving organ transplantation.1,2

Fig. 1 Sequence of care in deceased donation

Currently in Canada, there is no national approach to DA, nor are such data universally collected, collated, or reported.1 Standardization of national donor definitions to inform DA processes and data collection is essential for accurate organ donation organization (ODO) performance measurement, reporting, and benchmarking.1,2,3,4,5 At present, there is no national accountability for MDOs, despite the persistent gap between the number of deceased donors each year and the number of patients waiting for a transplant.

In 2014, the Deceased Donation Data Working Group (DDDWG), sponsored by Canadian Blood Services (Ottawa, ON, Canada), conducted an evaluation of Canadian and international ODO practices to inform recommendations for donor definitions (DD) and a national minimum data set.3,6 Observing persistent variability in the baseline data required to inform quality improvement, we undertook an updated environmental scan (ES) to describe current ODO DA practices, DD, donation statistics available for 2016–2018, and the formulas used to calculate donation metrics. The results will inform development of an updated national DA strategy, which includes federal government support through Health Canada to implement a common software data solution across all provinces. The need to establish and confirm clear and common deceased donation data definitions and metrics is therefore timely and vital.

Methods

The Research Institute of the McGill University Health Centre (Montreal, QC, Canada), in collaboration with Canadian Blood Services, invited all eleven Canadian ODOs to complete the ES. The ES was designed to characterize the current state of DA practices in each ODO through an internal survey and interviews. Because this was a quality improvement project, the McGill University Health Centre Research Ethics Board granted an exemption letter for ethics review. Collaboration was sought from each ODO administrator through Canadian Blood Services’ Donation and Transplant Administrators Advisory Committee.

Participants

All 11 Canadian ODOs participated in the ES. Organ donation organization representatives, experts in data collection and donor audit processes at their ODO, took part in all aspects of the ES: BC Transplant, BC; Southern Alberta Organ and Tissue Donation Program (SAOTDP), AB; Human Organ Procurement and Exchange (HOPE), AB; Donation Program, Saskatchewan Health Authority, SK; Transplant Manitoba - Gift of Life, MB; Ontario Health (Trillium Gift of Life Network), ON; Transplant Québec, QC; Legacy of Life: Nova Scotia Organ and Tissue Donation Program, NS; NB Organ and Tissue Donation Program – Horizon Health Network, NB; Organ & Tissue Donation PEI, PEI; and The Organ Procurement Exchange of Newfoundland and Labrador (OPEN), NL.

Survey and interview development

The survey and interview guide were designed using established survey development methodology.7 The research team first generated an exhaustive list of themes based on the existing literature. Themes were then drafted as survey and interview questions and divided into domains of DA practices, procedures and resources, operational definitions, and calculation of donation rates. Potential survey and interview questions were then circulated to expert consultants (donor coordinators and donation physicians) for feedback. The research team and an experienced donor coordinator reduced the list of potential survey questions to a target of 25 or fewer.8 Our team then determined whether each item should be: 1) retained for the survey; 2) retained for the survey with edits; 3) excluded from the survey; or 4) retained and moved to the interview guide.

Prior to data collection, two donor coordinators provided feedback on ease of survey completion, flow, clarity, relevance, completeness, face validity, content validity, redundancy, and time for completion. We revised the survey based on the provided feedback. The survey was created and hosted on the Interceptum (Aquiro Systems Inc., Gatineau, QC, Canada) platform. The final survey had 21 questions, which were a combination of multiple choice and free-text responses (Electronic Supplementary Material [ESM] eAppendix 1).

Survey administration

A research assistant communicated with each ODO to establish a point of contact and introduce the project. Subsequently, each ODO was emailed a cover letter and link to the web-based survey. Up to three weekly reminders were sent after the initial invitation.

Interview

After survey submission, the research coordinator booked a one-hour telephone interview with the ODO representative. Interviews were recorded for transcription purposes with verbal consent. Data were entered and summarized in a spreadsheet.

The structured interview guide (ESM eAppendix 2) included questions about DA objectives, frequency, scope (hospital sites and patient inclusion/exclusion criteria), data collection methodology, human and other resources required to conduct DAs (including training, standard operating procedures, policies), and outcome reporting/feedback processes including accountability structures. Finally, we sought to clarify or complete information that was missing from the survey results.

Data analysis and review

The ES aimed to identify current DA practices among Canadian ODOs. We present the data as frequencies and proportions. Qualitative data collected during interviews and as written responses were analyzed thematically and organized into figures summarizing DA practices, with quotations used in the text where appropriate.

Results

Donor audit definitions and metrics

Organ donation organizations were asked to provide their operational definitions, if available, for: potential donor, identified potential donor, referred potential donor, eligible potential donor, approached potential donor, missed referral, consented donor, and non-utilized consented donor (ESM eAppendix 3). If operational definitions were not available, ODOs volunteered locally used terms.

Next, we asked ODOs to provide the calculation formulas used for local performance measures. The proportion of ODOs that calculated each rate is shown in Table 1, and the corresponding ODO calculation formulas are shown in ESM eAppendix 3.

Table 1 Proportion of ODOs who calculate rates for performance measurement
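For illustration only, the generic forms below sketch how such performance rates are often constructed in the deceased donation literature; the numerators and denominators shown are assumptions for the purpose of illustration and do not reproduce any particular ODO's formulas (those are reported in ESM eAppendix 3).

\begin{align*}
\text{Referral rate} &= \frac{\text{referred potential donors}}{\text{potential donors}} \times 100\% \\
\text{Approach rate} &= \frac{\text{approached potential donors}}{\text{eligible potential donors}} \times 100\% \\
\text{Consent rate} &= \frac{\text{consented donors}}{\text{approached potential donors}} \times 100\% \\
\text{Conversion rate} &= \frac{\text{actual deceased donors}}{\text{potential donors}} \times 100\%
\end{align*}

Because both the definitions underlying the numerators and denominators and the formulas themselves varied across ODOs, rates calculated in this way are not directly comparable between jurisdictions without harmonized DD.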

All DD and the classification of missed referrals varied significantly across Canadian ODOs, and the rates calculated as performance metrics were also inconsistently employed. Reasons provided for the lack of available metrics included: “not currently calculating rates in death records,” “documentation of approach is not standard practice,” “standardization has not yet occurred in critical care and emergency departments that reflect accurate referral triggers,” and “could not provide a calculation for referral rate because the critical care team (rather than the ODO) usually approaches families in our jurisdiction. Thus, they are generally only referred if they are consented donors.” Even within jurisdictions, ODOs reported inconsistent application of DD and performance metric calculations, undermining confidence in calculated and reported rates.

Donor audit practices

Ten out of eleven ODOs (91%) were conducting DAs from 2016 to 2018. The primary purposes for conducting DAs were estimating donor potential (5/11, 45%), estimating system performance at the provincial level (3/11, 27%), and estimating system performance at the hospital level (3/11, 27%). Other purposes for conducting DAs were to inform quality improvement initiatives and to identify MDOs. The frequency of DAs varied between ODOs from weekly to annually and was determined by the availability of hospital death reports, hospital locality (urban vs rural), and the availability of personnel to perform the DA. Hospitals providing ventilator support were considered a source of potential deceased donors. The number of hospitals in each ODO jurisdiction that provide ventilator support and the proportion audited are shown in Fig. 2.

Fig. 2 Number of hospitals with ventilatory capacity audited by each ODO

Organ donation organizations selected which hospital care units to audit based on factors such as potential donor volume and the capacity of ODO staff. The number of hospitals routinely audited (as defined by each ODO) and the frequency of audits varied. In Table 2, the care units audited by ODOs are identified along with the number of ODOs that audit at least one of these units in their jurisdiction. Six (55%) ODOs consistently audited the same units across all hospitals with ventilation capability, while the other five ODOs varied which units were audited over time. The proportion of hospitals with ventilator capacity that were routinely audited varied from 43% to 100%.

Table 2 Number of ODOs that audit at least one of the listed care units in their province

Audits may include both patients with neurologic determination of death (NDD) and potential donation after circulatory death (DCD) patients. All eleven ODOs (100%) audited NDD patients, while nine ODOs (82%) audited both NDD and DCD patients. Six (55%) ODOs audited all hospitals that offered NDD donation and eight (73%) audited all hospitals that offered DCD donation. Nationally, 81% of hospitals offering NDD donation and 84% of hospitals offering DCD donation had previously been audited.

Reporting and feedback process

Most ODOs reported either quarterly or annually for the equivalent period audited; however, the time gap between conducting a DA and reporting could be up to two years. Table 3 provides a breakdown of how the report is published and to whom it is disseminated or made available.

Table 3 Organ donation organization donor audit reporting practices

Organ donation organizations aimed to provide feedback on MDOs in a timely manner. Feedback was provided directly to physicians involved in the MDO at seven (64%) ODOs, most often verbally and through a donation physician or the unit medical director. One ODO communicated directly with nursing staff rather than with physicians, while another provided both formal written feedback to individual physicians and public reporting.

Procedures and resources

Ten ODOs provided us with their current DA data collection tools. Organ donation organizations reported that their data variable sets have evolved over time based on expert input, local standardization, and formalization of DD.

Data collection tools showed broad variability in the sequence of data collection and in the minimum data set collected. Several data collection tools had multiple data collection stop points depending on referral status, inclusion criteria, ventilation status, and diagnoses. Despite the variation, nine (82%) ODOs collected minimal demographic information on missed potential donors.

In Table 4, we outline DA data collection practices and procedures across ODOs. All ODOs confirmed that deceased donation community stakeholders, such as donation physicians and intensive care unit (ICU) health care professionals, could access ODO data for quality improvement initiatives or research. Two ODOs (18%) described a formal process for requesting data, while the remaining ODOs described predominantly informal processes for accessing data.

Table 4 Donor audit data collection practices and procedures

Much of the DA data collection was completed by donor coordinators, who are nurses with ICU experience ranging from one to 12 years. One ODO (9%) employed Health Information Management professionals to complete DAs to preserve donor coordinator resources. Two ODOs (18%) reported currently or previously having respiratory therapists and neurointensivists perform DAs. Employment status (full-time vs part-time) and location of employment (onsite vs offsite) also varied among ODOs. Organ donation organizations had, on average, four full-time (range 1–20) and three part-time (range 1–20) coordinators.

All ODOs provided training to coordinators performing DAs, split evenly between formal and informal approaches. Pairing staff with an experienced mentor was the most popular method for both formal and informal training. Formal training involved mock DAs and standardized training procedures and materials. Only one province used interrater agreement on previously audited “training cases” as a measure of validity when training new coordinators. The length of training varied from a day to a year before a new coordinator could independently conduct an audit.
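As a hedged illustration only, and not any ODO's actual procedure, the short Python sketch below shows one way interrater agreement on previously audited training cases could be quantified, using percent agreement and Cohen's kappa; the case labels (MDO, REF, NPD) and the data are hypothetical.

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa between two raters classifying the same audit cases."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability that both raters assign the same category.
    expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                   for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Hypothetical classifications of ten previously audited "training cases":
# MDO = missed donation opportunity, REF = referred potential donor,
# NPD = not a potential donor.
reference = ["MDO", "REF", "NPD", "REF", "MDO", "NPD", "REF", "NPD", "MDO", "REF"]
trainee   = ["MDO", "REF", "NPD", "MDO", "MDO", "NPD", "REF", "NPD", "REF", "REF"]

agreement = sum(r == t for r, t in zip(reference, trainee)) / len(reference)
print(f"Percent agreement: {agreement:.0%}")                      # 80%
print(f"Cohen's kappa: {cohens_kappa(reference, trainee):.2f}")   # 0.70

A prespecified agreement threshold on such training cases could then serve as one criterion for allowing a new coordinator to conduct audits independently.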

Consistently performed data validation occurred in three of the eleven ODOs (27%). Six ODOs (55%) had a “one person, one chart” system with no independent review of data entry and no system to retrace errors. Six (55%) ODOs employed an adjudication process for chart review discrepancies: two ODOs (18%) had formal processes and four ODOs (36%) had informal processes. Examples of formal adjudication processes included ODO medical directors making a final decision or a special review committee (composed of medical leadership, ODO directors, and the vice president of clinical donation) achieving consensus on whether the case was an MDO. Examples of informal adjudication processes included consensus discussions among donor coordinators, some of which included a donation physician.

Privacy issues

Privacy issues related to patient confidentiality did not preclude DA activities; however, three ODOs (27%) reported that such issues had affected their audit timeframes by delaying access to medical archives. Privacy issues related to reporting were not evaluated.

A summary of ODO donor audit processes is presented in Table 5. Organ donation organization numbers for each DD for the years 2016–2018 have been published previously.2

Table 5 Summary of donor audit processes

Discussion

In this ES of current DA practices among Canadian ODOs, we found that the frequency, methods, and scope of DAs varied across jurisdictions. The primary reasons for conducting DAs across all ODOs were to estimate donor potential, evaluate system performance, identify MDOs, and inform quality improvement initiatives. DA results were frequently reported to ODO quality improvement/patient safety committees (73%), and feedback in various forms was often, but not universally, provided to health care professionals (64%). Only one ODO provided public reporting aligned with current standards of increasing transparency with the public. Organ donation organizations collected differing minimum data sets using divergent DD and methodologies for calculating donation metrics. Our aim was to understand current practices to inform development of a national deceased donation data strategy, not to compare ODOs.

When comparing our data with the ES previously undertaken by the DDDWG, we noted that DAs in Canada continue to be predominantly conducted by staff with informal training. Nevertheless, there is improved standardization in DA practices, with twice as many ODOs reporting the use of standard operating procedures (SOPs) in 2019 (comparison data obtained 27 March 2019 through personal communication with Karen Hornby, BScN, MS, Research Support Services Program, Trillium Gift of Life Network, Toronto, ON, Canada).

It is not surprising that we found variability between ODOs given that oversight of health care in Canada resides at the provincial level. This contrasts with high-performing organ donation and transplantation (ODT) systems that conduct timely systemwide DAs, such as the UK, where health care is managed nationally under the National Health Service. The lack of consistent timeliness in performing DAs in Canada is likely explained by limitations in human resources (e.g., limited staff, staff turnover, competing demands, staff off site), inconsistency of electronic health records, and lack of automation, namely a nationwide electronic donation system. Under these conditions, it is difficult to build expertise in performing DAs and to have consistent, accurate, and timely data to drive improvement. In an era of increased transparency and patient/family engagement in health care, public reporting of ODT data from DAs would be the goal; given our findings, it is not surprising that only one ODO is currently doing so.

Published DA standardization initiatives have been associated with increased donor identification and referral, thereby reducing MDOs over time.9,10 A systematic approach to medical chart reviews is deemed the most accurate method of estimating potential donor pools; however, this requires standardized data collection tools, protocols for training chart abstractors, and measures of interrater reliability to optimize data accuracy and precision.4,5 An example of a regional and international standardization initiative is the Donor Action program.11 This program, led by a group of professional organizations with experience in organ procurement from Europe and the USA, provides tools, resources, and guidelines for conducting DAs, allowing assessment of hospital-specific organ donation potential and identification of bottlenecks along the donation pathway, including the use of DAs to identify MDOs. After one year of implementing the Donor Action program, an average increase of 53% in aggregated donation rates was reported by eight countries, and an increase of 33% was sustained after two years in 11 European hospitals.12,13 Importantly, the costs of implementing consistently applied quality improvement initiatives can be outweighed by the costs of treatment alternatives, such as dialysis in patients with end-stage renal disease.12

To further evolve and standardize a national approach to DAs in Canada, we need to harmonize and implement a national deceased donation minimum data set, DD, and donation metrics. Next, the DA process should be standardized and supported by a DA trigger to inform scope, training for those conducting DAs, and timely reporting and feedback of audit findings. Finally, to evolve toward a high-performing ODT system, we recommend that DAs be performed in all emergency departments and ICUs where mechanical ventilation is available. Encouragingly, most Canadian hospitals with ventilatory capacity were already being audited. Additional elements of a high-performing system include publicly reported donation rates and a single, national electronic donation data system. Only one ODO had public reporting in 2019, but importantly, five provinces had legislation in place mandating the reporting of deceased donor metrics to the provincial Ministries of Health, advancing the argument for regular standardized DAs at the national level. Given the significant variability reported in donor audit practices, national collaboration will be required to report, analyze, and reduce MDOs. Concurrent work is underway by our group and others to achieve these objectives. Important facilitators of this work are the willingness and motivation of all ODOs to collaborate on this project and the presence of ODOs that are high performers in DAs and can guide others to be successful.

Our results are limited by the self-reported nature of the study design; in addition, we did not verify or observe the practices reported by ODOs. Furthermore, our understanding is limited by the questions we asked and how they were interpreted. For accuracy, we sought to clarify responses with ODO representatives during the interview process. Importantly, where practices were not harmonized throughout a province, ODO representatives who participated in the ES responded based on their local context, and we recognize that the variability in practices may therefore be underestimated. As a result, we may have an incomplete picture of each ODO’s practices in data collection, adjudication, and training. We expect that certain practices may also have evolved since the ES was undertaken, as some ODOs were already planning quality assurance and SOP development projects. We do not know how these may have evolved during the COVID-19 pandemic.

Conclusion

Deceased DA practices among Canadian ODOs remain fragmented and lack consistency in methodology, timeliness, definitions used, metrics calculated, and reporting. Consequently, there are significant challenges in tracking MDOs and developing quality improvement initiatives to reduce them. This variability underscores the need for a national DA strategy to harmonize DD and donation metrics and to establish a deceased donation minimum data set. This would allow accurate assessment of ODT system performance and inform quality improvement activities to better serve patients at the end of life who are denied the opportunity to donate and those on the transplant waitlist.