Background

Infections are among the most common indications requiring care in an intensive care unit (ICU). The Extended Study on Prevalence of Infection in Intensive Care III (EPIC III) was a recent international point prevalence study examining the occurrence of infections in ICUs [1]. Among 15,165 qualifying patients, 8135 (54%) had at least one suspected or proven infection on the study day and 1921 (24%) of these patients had more than one suspected or proven infection. Interestingly, multilevel analysis demonstrated that infections with antibiotic-resistant pathogens, including vancomycin-resistant Enterococcus (VRE), Klebsiella species resistant to β-lactams, or carbapenem-resistant Acinetobacter species, were associated with a higher risk of in-hospital death compared to infections with susceptible microorganisms [1]. Escalating antimicrobial resistance across all pathogen types (bacterial, fungal, viral) has also increasingly impacted the outcomes of critically ill patients, as suggested by EPIC III and other studies. The World Health Organization considers antimicrobial resistance a major threat to human health, and a recent Wellcome Trust report suggests that nearly 300 million individuals will die over the next several decades as a direct result of antimicrobial resistance [2, 3]. Similarly, in the United States, antibiotic-resistant pathogens cause more than 2 million infections and 23,000 deaths per year as reported by the Centers for Disease Control and Prevention [4].

Given the common occurrence of infections in the ICU, along with escalating antimicrobial resistance, our main goal was to review the available literature regarding the importance of time-related variables impacting antibiotic therapy (Fig. 1). We also wanted to provide some “common sense” recommendations supported by published evidence that may help clinicians optimize antibiotic therapy for critically ill patients and potentially improve their outcomes while minimizing further resistance emergence.

Fig. 1

Important antibiotic-related timelines potentially impacting the outcomes of infected critically ill patients. *Prolonged infusion duration of antimicrobials to increase antimicrobial drug exposure for the offending pathogen

Timing of appropriate therapy—septic shock

Although controversy persists regarding many aspects of care for septic patients, nearly all agree that timely and appropriate antibiotic treatment is a necessary first step to ensure good outcomes (Fig. 2) [5,6,7,8,9]. Interest in the issue of appropriate antibiotic treatment arose over two decades ago [5]. For example, Kumar and colleagues documented that each hour of delay in the administration of appropriate antibiotic(s) substantially increased the patient’s risk for death [10]. These authors demonstrated that every hour’s delay until appropriate antibiotic administration led to a more than 10% increase in crude mortality. Specifically, if appropriate therapy was not begun within 1 h of the onset of shock, the odds ratio (OR) for mortality increased from 1.67 in hour 2 to 92.54 with delays > 36 h [10]. Subsequent analyses examining the value of care bundles in sepsis confirmed the crucial importance of timely antimicrobials and source control [11]. A review of over 1000 patients with septic shock arising from Gram-negative pathogens revealed that inappropriate antibiotic therapy (identified based on the failure to administer an in vitro active antibiotic within six hours) independently increased the risk for mortality nearly fourfold [12].

Fig. 2

Bar graph depicting mortality for patients receiving delayed appropriate antibiotic therapy (black bars) and those receiving timely appropriate antibiotic therapy (white bars). See references 5–9 for individual study characteristics

Despite multiple analyses emphasizing the need for appropriate antibiotic treatment, documented rates of appropriate therapy in chart audits have not improved. Some suggest that relying on ORs to describe the burden of inappropriate therapy has not sufficiently motivated clinicians to change behavior. Therefore, Vazquez-Guillamet et al. shifted the emphasis from reliance on ORs to making the burden of inappropriate therapy much more tangible for the bedside provider. Specifically, they determined the number needed to treat (NNT) with appropriate therapy to save one life [13]. In over 1000 subjects with septic shock caused by a range of pathogens, these investigators calculated that appropriate therapy enhanced the likelihood of survival at least threefold. More importantly, this converted into an NNT to save one life of only 5 [13]. A recent meta-analysis of the impact of appropriate antibiotic therapy in a range of infections nicely summarizes how the value of appropriate therapy increases in parallel with a patient’s severity of illness. Bassetti and colleagues identified 114 studies of appropriate therapy, 63 of which specifically dealt with sepsis and septic shock. The strongest positive impact of appropriate antibiotic treatment was observed among those with septic shock [14]. Appropriate therapy in sepsis not only significantly reduced in-hospital mortality (OR 0.44, 95% confidence interval 0.37–0.52) but also reduced length of stay by approximately 5 days [14]. That one can reduce rates of death while simultaneously improving resource use and throughput underscores the true significance of this aspect of sepsis care.
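To make the reported figure concrete, recall the standard relationship between the NNT and the absolute risk reduction (ARR); the worked numbers below are simply back-calculated from the NNT of 5 cited above and are intended only as an illustration:

$$ \mathrm{NNT} = \frac{1}{\mathrm{ARR}}, \qquad \mathrm{NNT} = 5 \;\Rightarrow\; \mathrm{ARR} = \frac{1}{5} = 0.20 $$

In other words, an NNT of 5 implies an absolute survival difference of roughly 20 percentage points between appropriately and inappropriately treated patients, a far more tangible message for the bedside provider than an odds ratio.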

Why does inappropriate therapy persist in clinical practice? In part, there may be delayed recognition of sepsis. More likely, the issue lies with the clinician. Across analyses of the factors associated with inappropriate therapy, the variable most strongly and independently linked to failure to prescribe appropriate therapy has been infection due to an antibiotic-resistant pathogen that the prescriber did not anticipate. In other words, the central factor propelling inappropriate therapy is failure to recognize a patient’s risk factors for infection with an antibiotic-resistant pathogen.

Appropriate therapy optimization—bacterial infections

When initiating antimicrobial treatment in ICU patients, the choice of agents is most often empirical, based on the site of infection, clinical severity and patient comorbidities [15]. Another key element for guiding appropriate empirical therapy is identifying risk factors for infection with multidrug-resistant bacteria (MDRB), so as to rationalize the empirical use of broad-spectrum antibiotics and prevent their unnecessary utilization. Recent literature suggests that initial antimicrobial therapy that is too broad is associated with poor outcomes. Webb et al. examined 1995 patients with community-acquired pneumonia, of whom 39.7% received broad-spectrum antibiotics, yet MDRB were recovered in only 3% [16]. Broad-spectrum antibiotic treatment was associated with an increased mortality risk even after adjusting for prognostic covariates. Antibiotic-associated adverse events were found in 17.5% of dying patients in the broad-spectrum group and may partly explain the worse outcomes in this cohort. The absence of bacteriological documentation in the majority of patients receiving broad-spectrum therapy suggests that other disease processes mimicking pneumonia and requiring alternative treatments may also have been missed [16].

Rhee et al. conducted a multicenter cohort study of 17,430 adults with sepsis and positive clinical cultures [17]. Among the 15,183 cases for which antibiotic susceptibility testing was available, 12,398 (81.6%) received appropriate antibiotics. Less than 30% of cases were due to MDRB. Unnecessarily broad-spectrum treatment (defined as coverage of methicillin-resistant Staphylococcus aureus (MRSA), VRE and ceftriaxone-resistant Gram-negative bacteria (GNB) when none of these were isolated) occurred in 8405 (67.8%) cases. The adjusted odds ratio for in-hospital death was 1.27 (95% CI 1.06–1.4) when comparing unnecessarily broad-spectrum with not unnecessarily broad-spectrum initial antibiotic therapy. Unnecessarily broad antibiotic therapy was also associated with increases in acute kidney injury and Clostridium difficile infections.

Although it is difficult to ascertain with certainty the presence of an MDRB infection before pathogen identification and susceptibility testing, several factors can help clinicians in guiding broad-spectrum therapy [18, 19]. The conditions that influence risk for MDRB infection include recent hospitalization, prior antibiotic exposure, hospital- or healthcare-associated infection, known colonization with MDRB pathogens and local hospital and ICU epidemiology [18, 19]. However, none of these risk factors are completely accurate and the fear of bacterial resistance often drives overuse of broad-spectrum antimicrobials.

Colonization itself drives broad-spectrum prescribing. In patients colonized with extended-spectrum beta-lactamase (ESBL)-producing GNB, carbapenem use increased from 69 to 241 per 1000 patient-days even among patients who never developed an ESBL infection, and only 7.5% of infection-related ventilator-associated complications in ESBL-colonized patients could be attributed to ESBL-producing GNB [20, 21]. Among patients colonized with ESBL GNB, the site of colonization and its quantitative assessment may help to predict ESBL infections [22]. Similarly, MRSA colonization has been shown to increase empiric vancomycin use 3.3-fold even in the absence of an infection that would justify vancomycin use [23]. The use of rapid molecular tests (genotypic or phenotypic) to identify microorganisms and resistance mechanisms will probably help to increase the likelihood that empirical therapy is also definitive therapy (Fig. 1) while also avoiding unnecessary antibiotic exposures. The turnaround time of these techniques is now as short as 2–4 h in routine use and will likely fall below 1 h in the near future [24, 25].

Besides the use of broad-spectrum antibiotics, combination antibiotic regimens (most often a pivotal beta-lactam plus an aminoglycoside) can help provide appropriate initial coverage while avoiding the systematic use of empiric carbapenems, provided the patient is at low risk of infection with ESBL GNB [9]. The beneficial effect of dual antibiotic therapy is debated and is probably most useful in neutropenic patients and in infections due to difficult-to-treat GNB such as Pseudomonas aeruginosa [26, 27].

Appropriate therapy optimization—fungal infections

There is considerable clinical evidence that delayed initiation of appropriate treatment is associated with increased mortality in patients with invasive fungal infections (IFI) [7, 28,29,30]. This is especially the case for critically ill patients with candidemia and septic shock [7, 31,32,33]. Although a specific cut-off point has not been established, several retrospective studies generally support the view that early and effective antifungal therapy is important for survival of patients with IFIs [29, 30]. Specifically, in a retrospective analysis of 157 candidemic patients, Morrell and colleagues found that administration of antifungal treatment ≥ 12 h after collection of the first blood culture positive for Candida was an independent risk factor for hospital mortality (OR 2.09) [29]. Similarly, in another retrospective study of 230 candidemic patients, mortality was lowest (15%) when fluconazole therapy was started on the same day the blood culture was obtained, and rates rose progressively with increasing time to initiation of fluconazole [30]. Another study of 446 patients showed a significant mortality benefit when antifungal treatment was administered within 72 h of a positive blood culture for Candida [34].

The finding that a delay in initiating appropriate treatment is associated with increased mortality [29, 30] has contributed to recent guidelines recommending initiation of empirical antifungal therapy in critically ill septic patients at high risk for IFI [35]. Nonetheless, deciding which subgroups of patients actually require prompt empirical treatment remains challenging. Indeed, there are no randomized controlled trials demonstrating the efficacy of empirical antifungal therapy on patient survival [36], thus limiting overall recommendations on timing. Moreover, empirical Candida treatment is frequently based on risk scores with very low positive predictive values that inevitably lead to unnecessary, expensive and sometimes toxic antifungal administration [37]. Despite such controversies, clinicians should be aware that empirical antifungal therapy remains a common practice [38, 39]. Accordingly, when antifungals are prescribed empirically, it is critical to reassess the need for antifungal therapy 72–96 h after starting treatment, especially when the initial diagnosis was uncertain. Candida biomarkers (CAGTA, T2Candida and the 1,3-β-D-glucan assay) have emerged to assist clinicians in de-escalating unnecessary empirical therapy [38, 39]. A strategy using biomarkers among patients receiving empirical antifungals demonstrated a high negative predictive value (97% for the entire population and 100% among ICU patients) [38], thus permitting the safe early discontinuation of empirical therapy.

Regarding other IFIs (e.g. invasive aspergillosis, mucormycosis), no consensus exists about the exact timeframe for starting empirical therapy outside of neutropenic patients [40]. However, given the high mortality associated with these infections, we suggest that patients with specific risk factors for developing an IFI other than invasive candidiasis should receive empirical treatment as soon as clinical suspicion arises, even if definitive proof of infection has not yet been obtained. Fungal cultures and a combination of serological biomarkers (galactomannan, Aspergillus PCR and the 1,3-β-D-glucan assay), along with computed tomography, should always be performed, and treatment should be re-evaluated and discontinued if the diagnosis of fungal infection is not confirmed [40].

Resistance avoidance with antimicrobial de-escalation

Antimicrobial de-escalation (ADE) refers to early modification of empiric antimicrobial therapy in order to prevent the emergence of antimicrobial resistance by decreasing overall exposure to broad-spectrum agents. The risk of new resistance emergence increases with each additional day of exposure to antipseudomonal β-lactam antibiotics, by approximately 2% per day for meropenem and up to 8% per day for cefepime or piperacillin/tazobactam [41]. ADE is generally achieved by switching from combination antibiotics to monotherapy or by narrowing the antimicrobial spectrum when broad-spectrum antibiotics are initially prescribed [42]. Additionally, reducing the number of administered antibiotics offers the advantage of potentially reducing side effects and costs.
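To illustrate how these per-day risks compound over a treatment course (a simplified calculation that assumes a constant, independent daily risk, which is an idealization of the data in [41]):

$$ P(\text{resistance by day } n) = 1 - (1 - p)^{n}, \qquad p = 0.08,\; n = 7 \;\Rightarrow\; 1 - 0.92^{7} \approx 0.44 $$

Under these assumptions, a 7-day course of an agent carrying an 8% daily risk would be expected to select new resistance in roughly four of every ten exposed patients, underscoring why both spectrum and duration of therapy matter.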

Many clinicians are still reluctant to modify initial broad-spectrum antibiotic regimens even when the practice is supported clinically and by microbiologic testing. To date, most studies agree that ADE is safe [42, 43]. One multicenter non-blinded trial comparing ADE with continued broad-spectrum therapy found no difference in mortality but a longer ICU length of stay in the ADE arm [44]. Among critically ill patients with proven candidemia, de-escalation from an echinocandin to fluconazole based on susceptibility testing was also found to be safe in terms of mortality and other outcomes [45]. Despite these data, the overall utilization of de-escalation remains low. In a recent multinational observational study (DIANA study), empirical therapy was de-escalated in only 16% of patients receiving initial broad-spectrum therapy [46]. Previous studies have reported ADE rates between 25% and 80%, with the higher rates generally reported from single centers focused on de-escalation for specific pathogens [43, 47,48,49].

The impact of ADE on resistance prevention has not been consistently demonstrated. In fact, few studies have specifically analyzed the effect of ADE on new antimicrobial resistance. One retrospective study found that ADE did not prevent the subsequent isolation of multidrug-resistant (MDR) pathogens in surveillance cultures or in ICU-acquired infections [50]. Montravers and colleagues likewise did not find a reduction in the emergence of MDR pathogens in a cohort of critically ill patients with intra-abdominal infections [51]. Similarly, the emergence of antibiotic-resistant bacteria was not reduced with de-escalation of empirical anti-pseudomonal beta-lactams in a retrospective study focused on the occurrence of new antibiotic resistance [52].

The DIANA study also did not demonstrate significant differences in the emergence of MDR pathogens following ADE [46]. However, emergence of MDR pathogens was numerically lower with ADE than in patients in whom empirical therapy was maintained (7.5% vs 11.9%; p = 0.052). Importantly, this study was not designed to draw definitive conclusions about resistance emergence. In non-critically ill patients, a retrospective study evaluating the safety of de-escalating empiric carbapenems prescribed in an ESBL-endemic setting observed a significantly lower incidence of carbapenem-resistant A. baumannii acquisition in the group that underwent ADE [49]. The rate of adverse drug reactions was also significantly lower in the de-escalated group.

ADE is clearly feasible for both bacterial and fungal infections, appears safe, and is a strategy endorsed for critically ill patients by an international position paper [42]. Clinicians should attempt to carry out ADE routinely, guided by the patient's clinical response and the results of susceptibility testing. The use of appropriate antimicrobial doses and infusion durations will also help ensure adequate pharmacokinetic (PK) antibiotic exposure to optimize clinical outcomes.

Antibiotic infusion duration to optimize drug pharmacokinetics

In addition to delivering timely appropriate antibiotic regimens, adequate drug concentrations at the infection site are needed to optimize clinical outcomes. The DALI study, a prospective, multicenter study, was primarily conducted to describe the frequency with which PK/pharmacodynamic (PK/PD) end points for β-lactam antibiotics were achieved in critically ill patients [53]. Achievement of PK/PD targets was highly variable among the different antibiotics studied, ranging from 35.0% for an aggressive target (100% fT>4×MIC, i.e., free drug concentration above four times the minimum inhibitory concentration for the entire dosing interval) to 78.9% for a traditionally acceptable target (50% fT>MIC). These data suggest that many critically ill patients have inadequate antibiotic exposure as assessed by PK/PD endpoints.
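For a rough sense of why the aggressive target is so much harder to reach, consider the simplest possible model (a one-compartment intravenous bolus with first-order elimination; the parameter values below are hypothetical and purely illustrative, valid when the free peak exceeds the MIC):

$$ fT_{>\mathrm{MIC}} = \min\!\left(1,\; \frac{\ln\left(C_{\max,\mathrm{free}}/\mathrm{MIC}\right)}{k_{e}\,\tau}\right) $$

With an elimination rate constant k_e = 0.2 h⁻¹ (half-life ≈ 3.5 h), a free peak of 40 mg/L and a dosing interval τ = 8 h, a pathogen with an MIC of 8 mg/L is covered for essentially the whole interval (fT>MIC ≈ 100%), whereas the 4×MIC target of 32 mg/L is exceeded for only ln(40/32)/(0.2 × 8) ≈ 14% of the interval.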

Many factors influence the PK of antibiotics in critically ill patients and may contribute to subtherapeutic exposures. Hypoalbuminemia, large-volume crystalloid administration, large pleural effusions or abdominal ascites that increase the volume of distribution of hydrophilic drugs, catecholamines, and renal replacement therapies can all significantly alter infection-site concentrations of administered antibiotics [54]. Another factor worth specific mention is augmented renal clearance (ARC). ARC is defined as a creatinine clearance (CrCl) greater than 130 mL/min/1.73 m2 in males and greater than 120 mL/min/1.73 m2 in females [55]. ARC has been linked with subtherapeutic β-lactam and glycopeptide concentrations [56, 57]. However, studies attempting to associate ARC with worse clinical outcomes have yielded conflicting results [58,59,60]. ARC was implicated as a possible cause of treatment failure in a randomized controlled trial comparing 10 days of imipenem/cilastatin with 7 days of doripenem for ventilator-associated pneumonia caused by GNB [61]. The study was ultimately terminated early because clinical cure rates were lower and mortality rates were higher in the doripenem group than in the imipenem group. Of interest, the largest difference in clinical cure rates was in the subgroup of patients with a CrCl greater than 150 mL/min/1.73 m2 [61].

The most common strategy studied to adjust for altered PK parameters in critically ill patients and achieve greater time above the MIC has been prolonged or continuous infusions of time-dependent antimicrobials, including β-lactams, carbapenems, and vancomycin. While numerous observational studies have shown better clinical cure rates with prolonged or continuous infusion of β-lactams, two meta-analyses have failed to confirm these findings [62, 63]. In contrast, a meta-analysis that included vancomycin and linezolid [64] and another that focused specifically on piperacillin/tazobactam or carbapenems [65] found improved clinical outcomes, including lower mortality, when antibiotics were administered by prolonged or continuous infusion compared with bolus injections.
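The rationale for prolonged infusion can be illustrated with a simple simulation. The sketch below (Python, hypothetical parameter values, a one-compartment model with zero-order infusion and first-order elimination integrated by a small Euler step; it is not a validated PK model of any specific drug) compares the percentage of the dosing interval with free concentrations above the MIC for a 30-minute versus a 4-hour infusion of the same dose.

```python
# Minimal sketch (illustrative assumptions, not a validated PK model): one-compartment
# model with zero-order infusion and first-order elimination, comparing %fT>MIC for a
# short versus a prolonged infusion of the same dose. Parameter values are hypothetical.

import numpy as np

def percent_ft_above_mic(dose_mg, tau_h, t_inf_h, vd_l, ke_per_h, fu, mic_mg_l,
                         n_doses=6, dt=0.005):
    """Simulate repeated infusions and return %fT>MIC over the final dosing interval."""
    t_end = n_doses * tau_h
    times = np.arange(0.0, t_end, dt)
    conc = np.zeros_like(times)
    c = 0.0
    for i, t in enumerate(times):
        t_in_cycle = t % tau_h
        rate_in = dose_mg / t_inf_h if t_in_cycle < t_inf_h else 0.0  # mg/h
        # Euler step: dC/dt = infusion rate / Vd - ke * C
        c += dt * (rate_in / vd_l - ke_per_h * c)
        conc[i] = c
    last_interval = times >= (n_doses - 1) * tau_h
    above = (fu * conc[last_interval]) > mic_mg_l
    return 100.0 * above.mean()

if __name__ == "__main__":
    common = dict(dose_mg=1000, tau_h=8, vd_l=25, ke_per_h=0.3, fu=0.8, mic_mg_l=8)
    print("30-min infusion: %.0f%% fT>MIC" % percent_ft_above_mic(t_inf_h=0.5, **common))
    print("4-h infusion:    %.0f%% fT>MIC" % percent_ft_above_mic(t_inf_h=4.0, **common))
```

In this toy example the prolonged infusion achieves a higher %fT>MIC for the same total daily dose; real dosing decisions naturally depend on measured or population PK parameters, protein binding and local MIC distributions rather than these placeholder values.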

The variability in outcomes between meta-analyses of prolonged or continuous antibiotic infusions is likely multifactorial but, in large part, reflects a lack of the methodologic rigor and transparency recommended by well-established standards for conducting such studies. Therefore, the findings, both positive and negative, should be tempered by the presumed risk of bias [66]. It is also important to recognize that prolonged infusions of antibiotics will not compensate for poor initial drug selection, inferior drug characteristics, or underdosing of these agents in critically ill patients. The largest (n = 432) randomized, multicenter trial to date comparing continuous with intermittent β-lactam infusions in critically ill patients with severe sepsis found no difference in ICU-free days alive, 90-day survival, or clinical cure 14 days after antibiotic cessation [67].

Using AI/ML to improve sepsis outcomes

As the foundation of optimal sepsis care is fundamentally linked to the timing of key interventions, early recognition coupled with timely management strategies remains paramount to improving outcomes. Artificial intelligence (AI) and machine learning (ML) combine computer science with statistical methods to build highly accurate predictive models. These advanced computational tools can analyze enormous quantities of data to identify patterns within large, complex datasets. Sepsis, a common and markedly heterogeneous entity for which large quantities of clinical data are available, especially in the ICU, is a particularly attractive target for AI/ML-based analysis.

As a result, over the past 10 years there has been a relative explosion in the use of AI/ML in sepsis, particularly around predicting onset time, which, if done correctly, can help identify patients with impending sepsis and reduce the time to appropriate antimicrobial therapy. One of the earliest approaches used a simple recursive partitioning and regression tree to identify ward patients who might become septic [68]. In this analysis, 70% of alerted patients had a sepsis-related intervention performed, suggesting the feasibility of early identification. This paved the way for additional analyses, and in 2015 Henry and colleagues demonstrated that more advanced statistical tools could be combined with large, publicly available ICU databases, creating a retrospective model that could predict septic shock (Sepsis-II criteria) a median of 28.2 h before onset with a sensitivity of 85% and a specificity of 67% (area under the receiver operating characteristic curve [AUROC] 0.82) [69]. In 2016, the same data were used to train a different model that could predict sepsis (Sepsis-II criteria) 3 h ahead of clinical onset with a sensitivity of 90% at a specificity of 81% (AUROC 0.83) [70]. Since then, further progress in operationalizing advanced AI/ML techniques has spawned additional analyses using more robust AI/ML algorithms yielding similar results [71,72,73]. Furthermore, these advanced approaches have yielded incremental improvements in sepsis case recognition and prediction when compared with traditional early warning systems [72, 74]. Despite the promise of these retrospective models, only about 6% have been prospectively evaluated and, when implemented, have yielded mixed results on patient mortality and length of stay [68, 75, 76].
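For readers unfamiliar with what such models involve in practice, the sketch below is a minimal, purely illustrative example (Python with scikit-learn, synthetic data, hypothetical features and a toy label rule; it does not reproduce any of the published models cited above) of training a classifier to flag patients likely to meet sepsis criteria within a short look-ahead window and evaluating it with an AUROC.

```python
# Minimal sketch (illustrative only, synthetic data): a simple classifier predicting
# "sepsis onset within the next few hours" from routinely collected variables.
# Feature names and the label-generating rule are hypothetical placeholders.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000

# Hypothetical hourly snapshot per patient: heart rate, respiratory rate, temperature,
# white blood cell count, lactate, mean arterial pressure.
X = np.column_stack([
    rng.normal(90, 15, n),     # heart rate
    rng.normal(20, 5, n),      # respiratory rate
    rng.normal(37.2, 0.8, n),  # temperature
    rng.normal(11, 4, n),      # white blood cell count
    rng.normal(1.8, 1.0, n),   # lactate
    rng.normal(75, 12, n),     # mean arterial pressure
])

# Synthetic label: "meets sepsis criteria within the look-ahead window" (toy rule).
risk = 0.03 * (X[:, 0] - 90) + 0.4 * (X[:, 4] - 1.8) - 0.05 * (X[:, 5] - 75)
y = (rng.random(n) < 1 / (1 + np.exp(-risk + 2))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("AUROC:", round(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]), 2))
```

Published systems differ mainly in the richness and temporality of the input features, the rigor of the sepsis label definition, and, critically, prospective validation, which is precisely where most retrospective models have not yet been tested.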

While rapid molecular diagnostic tests are increasingly being developed to identify pathogens and antibiotic resistance patterns, their cost and availability preclude widespread deployment. Similarly, even though these tests are considered “rapid”, they still require time for sample collection, laboratory delivery, and specimen analysis, during which antibiotic therapy is usually not withheld. AI/ML may be able to help bridge this time gap by predicting antimicrobial resistance patterns, further facilitating antimicrobial stewardship. In a recent analysis, McGuire and colleagues demonstrated that longitudinal clinical data could be harmonized to predict the risk of carbapenem resistance [77]. In this investigation, new carbapenem-resistant infections accounted for 1.6% of the population, yet the predictive model achieved a sensitivity of 30%, a positive predictive value of 30% and a negative predictive value of 99% (AUROC 0.84). While AI/ML certainly cannot replace the role of rapid molecular testing, it may be able to facilitate upfront appropriate antimicrobial selection.
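The apparent paradox of a modest sensitivity coexisting with an excellent negative predictive value is largely a consequence of the low prevalence, as a quick back-of-the-envelope calculation shows (the specificity of roughly 0.99 used below is not reported in the text above but is implied by the published prevalence, sensitivity and positive predictive value):

$$ \mathrm{NPV} = \frac{\mathrm{Sp}\,(1-p)}{\mathrm{Sp}\,(1-p) + (1-\mathrm{Se})\,p} \approx \frac{0.99 \times 0.984}{0.99 \times 0.984 + 0.70 \times 0.016} \approx 0.99 $$

With only 1.6% of patients truly harboring carbapenem resistance, the denominator is dominated by true negatives, so a negative prediction is highly reassuring even though the model misses most resistant cases; such tools are therefore better suited to ruling out the need for ultra-broad empiric coverage than to ruling it in.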

Beyond using AI/ML to predict onset time and antimicrobial resistance patterns, advances in decision modeling are creating avenues for investigators to develop AI/ML algorithms that help determine the optimal timing of fluid resuscitation and vasopressor initiation [78]. In this study, Komorowski and colleagues used a reinforcement learning model to learn optimal intravenous fluid resuscitation and vasopressor dosing strategies. Retrospective validation of this model revealed that mortality was lowest when clinicians' actions matched the AI-based recommendations.
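To give a flavor of what "reinforcement learning for dosing" means computationally, the sketch below is a minimal tabular Q-learning toy (Python; states, transitions and rewards are synthetic placeholders, and it is not the cited authors' method, which relies on observed patient trajectories and off-policy evaluation rather than a simulated environment). Each action corresponds to a combination of a fluid dose bin and a vasopressor dose bin, and the learned Q-table maps each discretized patient state to a recommended action.

```python
# Minimal sketch (hypothetical setup): tabular Q-learning over discretized patient states,
# with actions defined as combinations of IV-fluid and vasopressor dose bins.
# Everything below is a synthetic placeholder, not clinical data or a validated policy.

import numpy as np

rng = np.random.default_rng(0)

N_STATES = 100                     # e.g. clusters of vitals/labs
FLUID_BINS, VASO_BINS = 5, 5
N_ACTIONS = FLUID_BINS * VASO_BINS
GAMMA, ALPHA, EPSILON = 0.99, 0.1, 0.2

Q = np.zeros((N_STATES, N_ACTIONS))

def synthetic_transition(state, action):
    """Placeholder environment: random next state; terminal reward of +1 ("survival")
    or -1 ("death") with a probability loosely tied to the chosen action."""
    next_state = rng.integers(N_STATES)
    done = rng.random() < 0.1
    reward = 0.0
    if done:
        reward = 1.0 if rng.random() < 0.5 + 0.01 * (action % FLUID_BINS) else -1.0
    return next_state, reward, done

for episode in range(5000):
    s = rng.integers(N_STATES)
    for _ in range(20):
        # Epsilon-greedy action selection over the current Q-table.
        a = rng.integers(N_ACTIONS) if rng.random() < EPSILON else int(Q[s].argmax())
        s2, r, done = synthetic_transition(s, a)
        target = r if done else r + GAMMA * Q[s2].max()
        Q[s, a] += ALPHA * (target - Q[s, a])   # standard Q-learning update
        if done:
            break
        s = s2

# The learned policy maps each discretized state to a recommended fluid/vasopressor bin.
policy = Q.argmax(axis=1)
```

In published work the "reward" is derived from patient outcomes in retrospective data and the resulting policy is evaluated against clinicians' actual decisions, which is a far harder statistical problem than this toy loop suggests.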

As we look towards the future of AI/ML in sepsis care, several barriers must be overcome before wide-scale deployment is achieved. These include the need for larger, more integrated datasets, a harmonized definition of sepsis suitable for automated extraction, more robust explainability, and prospective algorithm validation with emphasis on end-user needs, expectations and clinical workflows [79,80,81].

Conclusions

Time variables play an important role in the care of patients with life-threatening infections. As Fig. 3 demonstrates, delaying appropriate antibiotic therapy increases the risk of death. At the same time, the risk of antibiotic resistance increases as the duration of antibiotic therapy is prolonged, without an apparent ceiling effect [41, 82]. Given these competing clinical outcomes, infection cure versus resistance emergence, clinicians must employ strategies that optimize their use of antimicrobials in the ICU. Table 1 provides some “common sense” recommendations that will assist clinicians in achieving a more harmonious balance in antibiotic utilization and timing in the ICU. Future advances in non-antibiotic therapies for serious infections, rapid molecular diagnostics, and AI/ML should further enhance antibiotic timing practices in the ICU and improve patient outcomes while minimizing unnecessary antimicrobial therapy.

Fig. 3

Solid line depicts the increasing risk of mortality for each day that inappropriate antibiotic therapy is continued from the start of treatment. Dashed line depicts the increasing risk of new antibiotic resistance emergence for each day that antibiotic treatment is continued from the start of treatment

Table 1 Summary and key recommendations