What Is Outcome Analysis?

Outcome analysis does not directly assess the quality of performance; it allows only for interpretation of the quality of the process and structure of care. A set of key performance indicators (KPIs) should be established to ‘measure’ against.

Outcome analysis can be a subjective process, as good outcomes can come from poor care and poor outcomes from good care. Many interplaying factors need to be considered, and a ‘one-size-fits-all’ approach cannot be established. It is important to ensure that the intended outcome is consistent with the systems and processes put in place by each facility.

Ensure that the intended outcome measure is clearly defined and quantifiable. Measurement can be against a scale, by questionnaire, by direct observation, or by retrospective review of data.

Policies and procedures should describe in detail the steps to be taken to perform outcome analysis. The process for outcome analysis should follow the Plan-Do-Check-Act (PDCA) quality cycle: Plan (what is going to be analysed), Do (undertake the analysis), Check (that what has been done is correct) and Act (take action to improve based upon the findings).
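As an illustration only, one such cycle can be recorded in a simple structured form so that each PDCA stage is documented. The sketch below (in Python) uses hypothetical wording for a 100-day mortality analysis and is not a mandated format.

```python
# Illustrative only: one outcome-analysis cycle recorded as a PDCA entry.
# The wording is hypothetical and not a required format.
pdca_cycle = {
    "plan":  "Analyse 100-day mortality for all allogeneic transplants in Q1",
    "do":    "Extract deaths within 100 days of transplant from patient records",
    "check": "Verify the extraction against the registry and confirm the denominator",
    "act":   "Present at the quarterly quality meeting and agree corrective actions",
}

for stage, description in pdca_cycle.items():
    print(f"{stage.upper():5}: {description}")
```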

Standards

There are 146 instances of the word ‘outcome’ or ‘outcomes’ within the seventh edition standards, so the term appears in many of the standards themselves, their explanations, or the evidence required (Fig. 6.1).

Fig. 6.1 Sample standards from v7

As the standards mature into the eighth, ninth and tenth editions and beyond, the need to present outcome improvements can be expected to become more prevalent, so the ability to document and review these KPIs will become more essential.

Collecting Data

When collecting data, clearly identify which patient groups/subgroups are being included. Data will normally be collated from patient notes/records/care plans.

Outcome analysis, in line with the standards, can be difficult to establish, but the fundamental requirements are arguably quite simple:

  • The results of what has taken place should be reviewed regularly, at least quarterly and ideally more frequently, depending on patient numbers.

  • Analysis that MUST be completed (Table 6.1; reference the relevant JACIE Standard) – a minimal computation sketch follows this list:

    • 100-day mortality

    • Time to engraftment

    • Recipient outcome after infusion of a product with a positive microbial culture

Table 6.1 Sample of data presentation [2]
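The sketch below illustrates, in Python, how the first two mandatory indicators might be computed from a cohort of transplant records during a retrospective review. The record fields (transplant_date, date_of_death, anc_engraftment_date) are hypothetical placeholders for whatever the local records system actually provides, and the definitions in the local SOP take precedence.

```python
# Illustrative sketch only: computing 100-day mortality and median time to
# neutrophil engraftment from a list of transplant records. Field names are
# hypothetical placeholders, not a mandated data model.
from dataclasses import dataclass
from datetime import date
from statistics import median
from typing import Optional


@dataclass
class TransplantRecord:
    patient_id: str
    transplant_date: date
    date_of_death: Optional[date] = None          # None if alive at last follow-up
    anc_engraftment_date: Optional[date] = None   # None if not yet engrafted


def day_100_mortality(records: list) -> float:
    """Proportion of patients who died within 100 days of transplant."""
    deaths = sum(
        1 for r in records
        if r.date_of_death is not None
        and (r.date_of_death - r.transplant_date).days <= 100
    )
    return deaths / len(records) if records else 0.0


def median_days_to_engraftment(records: list) -> Optional[float]:
    """Median time to neutrophil engraftment, excluding patients not yet engrafted."""
    days = [
        (r.anc_engraftment_date - r.transplant_date).days
        for r in records
        if r.anc_engraftment_date is not None
    ]
    return median(days) if days else None


# Example with two illustrative records
cohort = [
    TransplantRecord("P001", date(2023, 1, 10), anc_engraftment_date=date(2023, 1, 28)),
    TransplantRecord("P002", date(2023, 2, 5), date_of_death=date(2023, 3, 20)),
]
print(day_100_mortality(cohort))           # -> 0.5
print(median_days_to_engraftment(cohort))  # -> 18
```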

Have outcome data as a standing item on the agenda of the quality management/clinical governance meeting.

Quality indicators, including those for CAR-T/IEC and other novel therapies, include:

  • 100-day mortality

  • Acute GVHD grade within one hundred (100) days after transplantation

  • Chronic GVHD grade within one (1) year after transplantation

  • Engraftment data review

  • Late engraftment due to complex issues

  • KPI for platelets not met

  • Auto recovery data (Table 6.2; Fig. 6.2)

  • CAR-T metrics (Table 6.4)

  • Apheresis data and any cancellations

  • Central venous catheter infections

  • Complaints

  • Incident reports

    • Labs

    • Wards

    • Outpatients

    • Apheresis

    • Pharmacy, etc.

Table 6.2 Sample of data presentation [2]

Fig. 6.2 Sample of data presentation [2]

Over time a comprehensive tracking system can be established (see Fig. 6.3).

Fig. 6.3 Long-term trending review of engraftment [2]

To collect data, consider the following:

  1. Title: ensure this is appropriate and repeatable.

  2. Timeline: decide on an appropriate timeline, given the issue being measured and the number of patients/potential occurrences – weekly, monthly, quarterly, annually?

  3. Patients: clearly identify which patient groups/subgroups you are including in the data collection.

  4. Information Source: the specific form extracted from medical records and the laboratory. Describe the personnel and sections of the programme responsible for this form.

  5. Measurement: equipment and analysis used; reporting and recording of the data.

  6. Formula: if appropriate, ensure a consistent formula is used, e.g. the number of patients with ANC ≥0.5 × 10⁹/L achieved and sustained for three consecutive days without a subsequent decline for ≥3 days (a sketch applying this definition follows this list).

  7. Definitions: ensure any definitions are clearly specified, so that there cannot be different interpretations.

  8. Adjustment: make any appropriate adjustments as required, e.g. for primary disease, stage at transplant, type of transplant, type of donor or period of transplant.

  9. Assessment Criteria: there may be several standards to achieve – minimum acceptable, optimal, or best attainable result.

  10. Related Processes: if measurement results indicate a poor outcome, consider what problems in the process might have caused it. Was the process done correctly, were there adequate staff in place, did everyone have training?

  11. References Used: EBMT MED-A/B forms, patient registry, national/international standards, etc.

  12. Presentation: report presented, with an oral presentation at the appropriate review meeting.

  13. Observations: what findings came out of the analysis, and what does research suggest might explain the results – e.g. co-morbidities, age or ethnicity implications?
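The following sketch applies the engraftment formula given in item 6 to a date-ordered list of daily ANC results (in 10⁹/L). It is an illustration of that stated definition only; the function and variable names are hypothetical, and the definition in the local SOP takes precedence.

```python
# Minimal sketch of the engraftment definition in item 6: the first day of three
# consecutive ANC values >= 0.5 x 10^9/L, with no subsequent run of 3 or more
# days below that threshold. Input is a date-ordered list of daily ANC results.
from typing import List, Optional


def engraftment_day(daily_anc: List[float], threshold: float = 0.5) -> Optional[int]:
    """Return the index (day) of engraftment, or None if the criteria are not met."""

    def sustained_decline_after(start: int) -> bool:
        # True if there is any later run of >= 3 consecutive days below threshold
        run = 0
        for value in daily_anc[start:]:
            run = run + 1 if value < threshold else 0
            if run >= 3:
                return True
        return False

    for day in range(len(daily_anc) - 2):
        if all(v >= threshold for v in daily_anc[day:day + 3]):
            if not sustained_decline_after(day + 3):
                return day
    return None


# Example: engraftment is assigned to day index 4 (the fifth measurement)
print(engraftment_day([0.1, 0.2, 0.4, 0.3, 0.6, 0.7, 0.9]))  # -> 4
```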

Holistic outcome analyses can also be completed; these tend to be more subjective and are carried out via questionnaires or interviews. Ideally they should have a scoring process to facilitate analysis, for example a scale of 1–10 for pain or for satisfaction with ward facilities (a minimal scoring sketch follows the list below). Areas assessed may include:

  • Quality of life post-transplant

  • Late effects (fertility, etc.)

  • Satisfaction with care
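Where a 1–10 scoring scale is used, responses can be summarised very simply so that these subjective outcomes can be trended alongside the clinical KPIs. The sketch below is illustrative only, and the question names are hypothetical.

```python
# Illustrative only: summarising 1-10 questionnaire scores so that subjective,
# holistic outcomes can be trended over time. Question names are hypothetical.
from statistics import mean, median

responses = [
    {"pain": 3, "ward_satisfaction": 8, "quality_of_life": 7},
    {"pain": 5, "ward_satisfaction": 6, "quality_of_life": 6},
    {"pain": 2, "ward_satisfaction": 9, "quality_of_life": 8},
]

for question in responses[0]:
    scores = [r[question] for r in responses]
    print(f"{question}: mean={mean(scores):.1f}, median={median(scores)}")
```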

Staff-based outcome analysis can be completed by establishing a suitable internal level of completion and then assessing the following:

  • Quality of information in case notes, paper or electronic, and the effect on day-to-day care structure

  • Annual training, education and competency assessment for all staff groups

  • Induction of new staff and their understanding of and participation in the quality programme

  • Use of drugs and therapeutics against protocols

Establishing Outcome Analysis for Novel Applications

As each new development is implemented in the clinical environment, a new set of outcome analyses needs to be developed. One of the more recent of these is CAR-T/IEC. Working with the manufacturer, a new set of KPIs was developed (Table 6.3) and is being monitored as part of ‘business as usual’. Once sufficient data have been gathered, the most indicative KPIs can be retained and the remainder used for audit purposes.

Table 6.3 New KPIs [2]
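Illustratively, a new KPI set can also be held in a simple machine-readable register so that it is monitored consistently as ‘business as usual’. The KPI names, definitions and targets below are hypothetical placeholders, not the values agreed with any manufacturer; the locally agreed set (Table 6.3) takes precedence.

```python
# Illustrative only: a register of candidate CAR-T/IEC KPIs. Names, definitions
# and targets are hypothetical placeholders, not an agreed or mandated set.
from dataclasses import dataclass


@dataclass
class KPI:
    name: str
    definition: str
    target: str
    review_frequency: str


car_t_kpis = [
    KPI("apheresis_to_shipment_days",
        "Working days from apheresis collection to shipment to the manufacturer",
        "per local agreement", "quarterly"),
    KPI("receipt_to_infusion_days",
        "Days from product receipt to infusion",
        "per local protocol", "quarterly"),
    KPI("severe_crs_rate",
        "Proportion of recipients with severe cytokine release syndrome",
        "trend only until sufficient data", "quarterly"),
]

for kpi in car_t_kpis:
    print(f"{kpi.name}: target {kpi.target}, reviewed {kpi.review_frequency}")
```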

Reviewing Data

Outcome reviews should be completed with a wide range of staff and at regular intervals. JACIE standard B/C/D 4.17 (seventh edition Standards) requires that the programme director, or designee, reviews quality management activities and reports them to staff at a minimum quarterly (Table 6.4). This presents a local snapshot of activity.

Table 6.4 Sample of quality summary report [2]

The director shall also review the effectiveness (outcomes) of the quality management programme annually. An annual report is required by JACIE, so the two can be combined.

Whilst these snapshots are useful, it is critical to look at long-term trends when reviewing outcome data; failing to do so can result in important trends not being spotted.

Long-term trending clearly shows outliers that can then be investigated.

It could be ‘assumed’ from Fig. 6.4 that treatments should not take place in February, as two out of three years show high pre-100-day mortality. However, a deeper investigation is required to determine the causes.

Fig. 6.4 Long-term review of mortality [2]
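The kind of long-term trending shown in Fig. 6.4 can be produced by tabulating pre-100-day deaths by transplant month and year, as in the sketch below. The record layout and values are illustrative only.

```python
# Illustrative only: tabulating pre-100-day deaths by transplant month and year.
# The (transplant_date, died_within_100_days) records below are made-up examples.
from collections import defaultdict
from datetime import date

records = [
    (date(2021, 2, 10), True),
    (date(2021, 2, 24), False),
    (date(2022, 2, 3), True),
    (date(2022, 7, 15), False),
    (date(2023, 2, 8), False),
]

counts = defaultdict(lambda: [0, 0])  # (year, month) -> [deaths, transplants]
for tx_date, died in records:
    key = (tx_date.year, tx_date.month)
    counts[key][1] += 1
    if died:
        counts[key][0] += 1

for (year, month), (deaths, total) in sorted(counts.items()):
    print(f"{year}-{month:02d}: {deaths}/{total} pre 100-day deaths")
```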

As with all reviews, any outliers should be thoroughly investigated.

Reviews can take the form of [1]:

  • Details of the processes you are trying to improve

  • Details of the areas where these processes take place, i.e. clinical unit, collection facility and processing

  • Details of the numbers of staff involved in the processes and who is responsible for which part of the process

  • Details of any documentation in place to support the current process, i.e. policies and standard operating procedures

  • The actions required to improve the process, e.g.:

    • Simple action – development of a patient guide and other information, as suggested by the patient survey

    • More complex – such as revalidation of a stem cell machine or discussion with the supplier regarding ongoing breakdown problems

    • Testing of a machine against several components

    • Changes to donor clearance forms at the registry that are totally within your control

  • Details of who is responsible for the actions

  • Dates when actions should be completed

  • Details of expected outcomes

  • Review of actions and outcomes

  • Record it

  • Review it

  • Record review

  • Act on it

  • Record actions
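To support the record/review/act steps above, each action can be logged with its responsible person, due date, expected outcome and review history. The sketch below is illustrative only; the field names and statuses are hypothetical and not a JACIE requirement.

```python
# Illustrative only: a minimal improvement-action log supporting the
# record/review/act steps above. Fields and statuses are hypothetical.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class ImprovementAction:
    description: str
    responsible: str
    due_date: date
    expected_outcome: str
    status: str = "open"                         # open / completed
    review_notes: List[str] = field(default_factory=list)

    def record_review(self, note: str, completed: bool = False) -> None:
        """Append a dated review note and close the action if completed."""
        self.review_notes.append(f"{date.today().isoformat()}: {note}")
        if completed:
            self.status = "completed"


action = ImprovementAction(
    description="Develop patient guide based on survey feedback",
    responsible="Quality Manager",
    due_date=date(2024, 9, 30),
    expected_outcome="Guide issued and satisfaction scores improved",
)
action.record_review("Draft circulated for comment")
print(action.status, action.review_notes)
```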