Take-home message

This narrative review summarizes the available evidence, emerging options, and unsolved controversies for the optimization of antibiotic therapy in the ICU. The potential benefit of antibiotic stewardship programs to improve patient outcomes and reduce the ecological side effects of these drugs is also discussed.


Antibiotics are used extensively in ICUs around the world [1]. While adequate and early empirical coverage is pivotal to cure patients with community- and hospital-acquired sepsis, antimicrobial therapy is not always targeted and, in more than half of cases, may be prescribed to patients without confirmed infection. Moreover, antibiotic de-escalation is insufficiently considered. The resulting selection pressure, together with the incomplete control of cross-colonization with multidrug-resistant bacteria (MDRB), makes the ICU an important determinant of the spread of these pathogens in the hospital. As instrumental contributors to antimicrobial stewardship programs (ASP), intensivists should be at the leading edge of the conception, optimization, and promotion of therapeutic schemes for severe infections and sepsis, including the limitation of antimicrobial overuse.

In this narrative review, based on a literature search (MEDLINE database) completed in September 2018, we sought to summarize recent advances and emerging perspectives for the optimization of antibiotic therapy in the ICU. These include better identification of patients at risk of MDRB infection; more accurate diagnostic tools enabling a rule-in/rule-out approach for bacterial sepsis; individualized reasoning for the selection of single-drug or combination empirical regimens; adequate dosing and administration schemes to ensure the attainment of pharmacokinetic/pharmacodynamic targets; concomitant source control when appropriate; and systematic reappraisal of initial therapy to minimize collateral damage on commensal ecosystems through de-escalation and treatment shortening whenever conceivable. We also aimed to compile arguments for the elaboration of actionable ASP in the ICU, including improved patient outcomes and a reduction in antibiotic-related selection pressure that may help control the dissemination of MDRB in this healthcare setting.

How antimicrobial therapy influences bacterial resistance

The burden of infections due to extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-E) and MDR Pseudomonas aeruginosa is rising steadily, carbapenem-resistant Acinetobacter baumannii and carbapenemase-producing Enterobacteriaceae (CRE) are spreading globally, and methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant enterococci generate major issues in several geographical areas [2,3,4,5,6,7,8,9,10,11,12,13,14]. These trends now apply to both ICU-acquired infections and imported bacterial sepsis as a result of the successful dissemination of MDRB in hospital wards and other healthcare environments (Fig. 1).

Fig. 1

Current resistance rates in major pathogens responsible for hospital-acquired infections according to World Health Organization (WHO) regions. 3GCR third-generation cephalosporin-resistant, CR carbapenem-resistant, MDR multidrug-resistant, MR methicillin-resistant. Data were extracted from the WHO Antimicrobial Resistance Global Report 2014 [171], National Healthcare Safety Network/Centers for Disease Control and Prevention Report 2011–2014 [11], European Antimicrobial Resistance Surveillance Network Annual Report 2016 [172], International Nosocomial Infection Control Consortium Report 2010–2015 [10], CHINET Surveillance Network Report 2014 [14], and other references [5, 12, 13]. Available resistance rates in the specific context of ICU-acquired infections are in the upper ranges of reported values for all geographical areas

Up to 70% of ICU patients receive empirical or definitive antimicrobial therapy on a given day [1]. The average volume of antibiotic consumption in this population has recently been estimated at 1563 defined daily doses (DDD) per 1000 patient-days (95% confidence interval 1472–1653), almost three times higher than in ward patients, with marked disparities for broad-spectrum agents such as third-generation cephalosporins [15]. While most of the underlying resistance mechanisms ensue from a succession of sporadic genetic events that are not directly induced by antibiotics, the selection pressure exerted by these drugs stands as a potent driver of bacterial resistance (Fig. 2) [16, 17].
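
The consumption metric cited above can be reproduced from aggregate dispensing data. As a minimal sketch (the drug, its DDD value, and the dispensing figures below are hypothetical; actual DDD assignments should be taken from the current WHO ATC/DDD index):

```python
def ddd_per_1000_patient_days(total_grams: float, who_ddd_grams: float,
                              patient_days: float) -> float:
    """Antibiotic consumption expressed as WHO defined daily doses (DDD)
    per 1000 patient-days: (grams dispensed / grams per DDD), normalized
    to the observed patient-days."""
    return (total_grams / who_ddd_grams) / patient_days * 1000.0

# Hypothetical ICU dispensing 450 g of a drug with a 3 g DDD
# over 300 patient-days:
print(ddd_per_1000_patient_days(450.0, 3.0, 300.0))  # 500.0
```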

Fig. 2

Drivers of antimicrobial resistance in the ICU. MDRB multidrug-resistant bacteria, ASP antimicrobial stewardship programs, ICP infection control programs. “Direct” selection pressure indicates the selection of a pathogen with resistance to the administered drug. Green vignettes indicate the positioning of countermeasures. ASP may notably encompass every intervention aimed at limiting the ecological impact of antimicrobial agents, including rationalized empirical initiation, the choice of appropriate drugs with the narrowest spectrum of activity (especially against resident intestinal anaerobes) and minimal bowel bioavailability, and reduced treatment duration [173]. ICP may include educational interventions to ensure a high level of compliance with hand hygiene and other standard precautions, targeted contact precautions in MDRB carriers (e.g., carbapenemase-producing Enterobacteriaceae), appropriate handling of excreta, and environmental disinfection [167]

At the patient level, antimicrobial exposure allows the overgrowth of pathogens with intrinsic or acquired resistance to the administered drug within commensal ecosystems or, to a lesser extent, at the site of infection. Of note, some mechanisms may confer resistance to various classes, notably the overexpression of efflux pumps in non-fermenting Gram-negative bacteria, thereby resulting in the selection of MDR mutants after exposure to a single drug [18]. At the ICU scale, consumption volumes of a given class correlate with resistance rates in clinical isolates, including for carbapenems and polymyxins [19,20,21,22,23,24,25,26], although this correlation may fluctuate depending on bacterial species and settings [27, 28].

Yet, in addition to its clinical spectrum, the anti-anaerobic activity of each antibiotic should be considered when appraising its ecological impact [29]. Indeed, the acquisition of MDR Gram-negative bacteria through in situ selection, cross-transmission, or environmental reservoirs may be eased by antimicrobial-related alterations of the normal gut microbiota (primarily resident anaerobes) and the colonization resistance that it confers [30]. A prior course of anti-anaerobic drugs may notably predispose to colonization with ESBL-E [31], AmpC-hyperproducing Enterobacteriaceae [32], or CRE [33, 34]. The degree of biliary excretion of a drug appears to be another key factor in appraising its potential impact on intestinal commensals [35,36,37].

Risk factors for multidrug-resistant pathogens

The clinical value of identifying risk factors for MDRB infection is to guide empirical therapy before the availability of culture results, that is, pathogen identification and antimicrobial susceptibility testing (AST). However, no single algorithm can predict an MDRB infection given the complex interplay between the host, the environment, and the pathogen, thus requiring an individualized probabilistic approach for the selection of empirical drugs (Table 1).

Table 1 Determinants of increased risk of MDRB infection at ICU admission and during the ICU stay

Colonization markedly amplifies the risk of subsequent infection with a given MDRB. However, the positive predictive value of this risk factor never exceeds 50%, whatever the colonizing organism [2, 38,39,40]. For instance, ESBL-E infections occur during the ICU stay in only 10–25% of ESBL-E carriers [41]. Whether an MDRB carrier becomes infected depends on a further series of factors that may or may not overlap with those associated with the risk of acquired colonization [2, 38, 39]. Overall, the presence or absence of documented carriage should not be considered the sole criterion for the choice of empirical therapy.

Patients with advanced co-morbid illnesses, prolonged hospital stays, use of invasive procedures, and prior antibiotic exposure are at increased risk of MDRB infections [42, 43].

The patient location is another determinant of risk as there are vast differences in the epidemiology of MDRB globally, regionally, and even within hospitals in the same city [2, 44]. Reasons for these discrepancies may include socioeconomic factors as well as variations in case-mix, antimicrobial consumption, and hygiene practices.

When not to start antimicrobials in the ICU

Although mixed [45], the available evidence supports a beneficial effect of prompt antibiotic administration on survival rates in sepsis and septic shock, irrespective of the number of organ dysfunctions [46,47,48,49]. However, the clinical diagnosis of sepsis is challenging in critically ill patients with multiple concurrent disease processes, and up to 50% of febrile episodes are of non-infectious origin [50]. Furthermore, the collection of microbiological evidence of infection is typically slow, and previous antibiotic exposure may render results unreliable. Indeed, cultures remain negative in 30–80% of patients clinically considered infected [51, 52]. Uncertainty regarding antibiotic initiation in patients with suspected lower respiratory tract infection is further complicated by the fact that as many as one-third of pneumonia cases requiring ICU admission are actually viral [53, 54].

In 2016, the Sepsis-3 task force introduced the quick sepsis-related organ failure assessment score (qSOFA), a bedside clinical tool for early sepsis detection [55]. Although the predictive value of qSOFA for in-hospital mortality has been the focus of several external validation studies [56], it remains to be investigated whether this new score may help to rationalize antibiotic use in patients with suspected infection. Yet, published data suggest that qSOFA may lack sensitivity for early identification of patients meeting the Sepsis-3 criteria for sepsis [57].

Hence, antibiotics are mostly used empirically in ICU patients [58]. A provocative before–after study, however, suggested that aggressive empirical antibiotic use might be harmful in this population [59]. In fact, a conservative approach—with antimicrobials started only after confirmed infection—was associated with a more than 50% reduction in adjusted mortality as well as higher rates of appropriate initial therapy and shorter treatment durations.

Biomarkers may help to identify or, perhaps more importantly, rule out bacterial infections in this setting, thus limiting unnecessary antibiotic use and encouraging clinicians to search for alternative diagnoses. Many cytokines, cell surface markers, soluble receptors, complement factors, coagulation factors, and acute phase reactants have been evaluated for sepsis diagnosis [60], yet most offer only poor discrimination [61]. Procalcitonin (PCT) levels are high in bacterial sepsis but remain fairly low in viral infections and most cases of non-infectious systemic inflammatory response syndrome (SIRS). However, a PCT-based algorithm for the initiation (or escalation) of antibiotic therapy in ICU patients decreases neither overall antimicrobial consumption nor time to adequate therapy, and does not improve patient outcomes [62]. Thus, PCT is currently not recommended as part of the decision-making process for antibiotic initiation in ICU patients [49].

Considering the complexity of the host response and biomarker kinetics, a combined approach which integrates the clinical pretest probability of infection could facilitate the discrimination between bacterial sepsis and non-infectious SIRS in emergency departments and probably also in critically ill patients [63]. Given their high sensitivity, such multi-marker panels may be primarily used to rule out sepsis, albeit only in a subset of patients as a result of their suboptimal specificity (Table E1). In contrast, novel molecular assays for rapid pathogen detection in clinical samples show good specificity, yet poor sensitivity, thus providing a primarily rule-in method for infection (see below). For the foreseeable future, however, physicians will remain confronted with considerable diagnostic uncertainty and, in many cases, still have to rely on their clinical judgment for decisions to withhold or postpone antimicrobial therapy.

Impact of immune status

The host immune status is a key factor for the initial choice of antimicrobial therapy in the ICU [64]. Solid organ transplant recipients receiving immunosuppressive medications to prevent allograft rejection can present with sepsis or septic shock and very few or even no typical warning signs such as fever or leukocytosis. The level of required immunosuppression and the site of infection vary according to the allograft type; the time elapsed since the original transplant surgery delineates the respective likelihood of nosocomial sepsis and opportunistic infections (Table 2) [65]. In patients with hematological or solid cancers receiving cytoablative chemotherapy, the duration and depth of neutropenia are essential factors for the choice of empirical therapy [66, 67]. HIV-infected patients are susceptible not only to community-acquired infections but also to a vast panel of opportunistic infections depending on the CD4 cell count [68]. Other host immune profiles encompass immunoglobulin deficiencies and iatrogenic immunosuppression (Table 2) [69]. Because immunocompromised patients may have multiple concomitant dysfunctional immune pathways, co-infections (bacterial and/or viral and/or fungal) are possible and, when suspected, require several antimicrobials as part of empirical therapy. Of note, ageing has been associated with impairments in both innate and adaptive immunity that may predispose to severe bacterial infections; yet, the impact of immunosenescence on the management of ICU patients warrants further investigation [70, 71].

Table 2 Spectrum of empirical antimicrobial therapy in immunocompromised patients

Early microbiological diagnosis: from empirical to immediate adequate therapy

The concepts of empirical therapy and de-escalation originate from the timeframe of routine bacteriological diagnosis. With culture-based methods, the turnaround time from sampling to AST results is 48 h or more, leaving much uncertainty about the adequacy of empirical coverage during the acute phase of sepsis. Molecular diagnostic solutions have therefore been developed to accelerate the process without losing performance in terms of sensitivity and specificity.

A wide array of automated PCR-based systems targeting selected pathogens and certain resistance markers have recently been introduced (Table 3). Several panels are now widely available in clinical laboratories for specific clinical contexts (e.g., suspected bloodstream infection, pneumonia, or meningoencephalitis), offering a “syndromic approach” to microbiological diagnosis [72, 73]. Syndromic tests can be run with minimal hands-on time and identify pathogens faster than conventional methods (turnaround time 1.5–6 h), especially when implemented as point-of-care systems. However, these tests remain expensive (> 100 USD per test) and must be performed alongside conventional cultures, which they cannot entirely replace. They also provide only partial information about antibiotic susceptibility, since just a limited number of acquired resistance genes are screened (e.g., those encoding ESBL or carbapenemases). Overall, further investigations are warranted to fully appraise their potential impact on patient outcomes [72].

Table 3 New diagnostic tools for bacterial infection in critically ill patients

A next step will be the daily use of clinical metagenomics, that is, the sequencing of nucleic acids extracted directly from a clinical sample to identify all bacterial pathogens and their resistance determinants [74]. Fast sequencers such as the Nanopore (Oxford Nanopore Technologies, Oxford, UK) allow turnaround times of 6–8 h at costs similar to those of syndromic tests [75, 76]. This approach can also assess the host response at the infection site by sequencing retro-transcribed RNA, possibly adding to its diagnostic yield [77]. Nonetheless, significant improvements in nucleic acid extraction, antibiotic susceptibility inference, and the translation of results into actionable data must be made before clinical metagenomics can be part of routine diagnostic algorithms.

Besides new-generation tools, rapidly applicable information can still be obtained from culture-based methods, such as direct AST on lower respiratory tract samples (turnaround time from sample collection ca. 24 h) [78] or laboratory automation with real-time imaging of growing colonies; for instance, the Accelerate Pheno™ system (Accelerate Diagnostics, Tucson, AZ) provides AST results within 6–8 h of a positive blood culture [79].

To be effective, all these tests must be integrated into the clinical workflow, thereby raising other organizational challenges and requiring the implementation of ASP [80].

The right molecule(s), but avoid the wrong dose

Key features to appraise the optimal dosing of a given antibiotic include the minimum inhibitory concentration (MIC) of the pathogen and the site of infection. Still, in most cases, clear guidance on how to adapt the dose on the basis of such characteristics is lacking, leaving much uncertainty on this issue. Defining the right dose in patients with culture-negative sepsis is a further challenge, although targeting the potential pathogens with the highest MICs appears to be a reasonable approach.

Underdosing of antibiotics is frequent in critically ill patients. Indeed, up to one out of six patients receiving beta-lactams does not reach the minimal concentration target (i.e., free antibiotic concentrations above the MIC of the pathogen for more than 50% of the dosing interval), and many more do not reach the target associated with maximal bacterial killing (i.e., concentrations above 4 × MIC for 100% of the dosing interval) [81]. This is particularly worrisome in the first hours of therapy, when a maximal effect is highly desirable. Unfortunately, no standard remedy for this problem is available, and the solution depends on the physicochemical properties of the drug (e.g., hydrophilic versus lipophilic), patient characteristics, the administration scheme, and the use of organ support (e.g., renal replacement therapy or extracorporeal membrane oxygenation) [82].

The volume of distribution, an important determinant of adequate antibiotic concentrations, is not readily measurable in critically ill patients. Yet, those with evidence of an increased volume of distribution (e.g., a positive fluid balance) require a higher loading dose to rapidly ensure adequate tissue concentrations, particularly for hydrophilic antibiotics and for both intermittent and continuous infusion schemes [83]. This first dose must not be adapted to renal function, even for antibiotics with predominant or exclusive renal clearance.
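
The rationale for a higher loading dose follows from the basic relationship loading dose = target concentration × volume of distribution, in which renal function plays no part. A minimal sketch, with purely hypothetical values that are not dosing guidance:

```python
def loading_dose_mg(target_conc_mg_per_l: float, vd_l_per_kg: float,
                    weight_kg: float) -> float:
    """Loading dose (mg) = target plasma concentration x volume of
    distribution; renal function does not enter this relationship."""
    return target_conc_mg_per_l * vd_l_per_kg * weight_kg

# A hydrophilic drug in a 70-kg patient: if fluid resuscitation expands
# the apparent volume of distribution from 0.25 to 0.40 L/kg, the dose
# needed to reach the same 20 mg/L target rises proportionally.
print(loading_dose_mg(20.0, 0.25, 70.0))  # 350.0 mg
print(loading_dose_mg(20.0, 0.40, 70.0))  # 560.0 mg
```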

Many antibiotics used in the ICU are cleared by the kidneys, so dosing adaptation for subsequent infusions must be considered in case of acute kidney injury (AKI) or augmented renal clearance (ARC; i.e., a measured creatinine clearance of 130 mL/min/1.73 m2 or higher). ARC is associated with lower antibiotic exposure [84] and implies higher maintenance doses to keep concentrations at the targeted level; therapeutic drug monitoring (TDM) then appears necessary to avoid overdosing.
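
The need for higher maintenance doses in ARC follows from the steady-state relationship dosing rate = total clearance × target average concentration. A minimal sketch, assuming a drug whose clearance tracks renal function and using hypothetical values (not dosing guidance):

```python
def maintenance_rate_mg_per_h(clearance_l_per_h: float,
                              target_css_mg_per_l: float) -> float:
    """Steady-state dosing rate (mg/h) = total drug clearance x target
    average plasma concentration (one-compartment assumption)."""
    return clearance_l_per_h * target_css_mg_per_l

# If drug clearance rises with renal function, a patient with ARC
# (here, a hypothetical 9 L/h drug clearance) needs a higher dosing
# rate than one with normal function (5 L/h) for the same 15 mg/L target:
print(maintenance_rate_mg_per_h(5.0, 15.0))  # 75.0 mg/h
print(maintenance_rate_mg_per_h(9.0, 15.0))  # 135.0 mg/h
```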

These features can be integrated into pharmacokinetic (PK)/pharmacodynamic (PD) dose optimization, which can be considered a three-step process (Fig. 3). PK models can be used when selecting the dose for each of these steps [85], even though their predictions remain estimates with substantial intra- and inter-individual variability. Such models are now available in several stand-alone software packages, and integration into prescription drug monitoring systems (PDMS) will be the next step. TDM can be used to further refine therapy for many antibiotics [86].

Fig. 3

Sequential optimization of antimicrobial pharmacokinetics in critically ill patients. In obese patients, the dosing regimen should be adapted on the basis of lean body weight or adjusted body weight for hydrophilic drugs (e.g., beta-lactams or aminoglycosides) and on the basis of lean body weight for lipophilic drugs (e.g., fluoroquinolones or glycylcyclines); see Ref. [174] for details. Dosing regimens for the first antibiotic dose (unchanged, increased, or doubled) are proposed by comparison with those usually prescribed in non-critically ill patients. PD pharmacodynamics, MIC minimal inhibitory concentration, AUC area under the curve, ARC augmented renal clearance, TDM therapeutic drug monitoring, AKI acute kidney injury, CRRT continuous renal replacement therapy, CrCL creatinine clearance

Is there a role for routine therapeutic drug monitoring?

TDM may be employed to minimize the risk of antimicrobial toxicity and maximize drug efficacy through optimized PK, especially for aminoglycosides and glycopeptides. Indeed, high peak aminoglycoside levels relative to the pathogen MIC appear beneficial in patients with ventilator-associated pneumonia (VAP) or other life-threatening MDRB infections [87,88,89], while adequate trough vancomycin concentrations improve the clinical response in patients with bloodstream infection due to MRSA [90].

However, the role of routine TDM in optimizing beta-lactam dosing remains controversial. The main issues that currently prevent the implementation of such a strategy in clinical practice are (1) the lack of a standardized method to measure beta-lactam concentrations reliably and with high intercenter reproducibility, (2) the delay before TDM results reach clinicians (i.e., the lack of point-of-care beta-lactam TDM in most hospitals), (3) uncertainty about the optimal timing and number of samples needed to adequately describe the time course of drug concentrations, (4) the fact that the association between insufficient beta-lactam concentrations and an increased risk of therapeutic failure or impaired outcome rests only on retrospective studies, (5) the absence of clinical data showing that adequate beta-lactam levels help prevent the emergence of resistant strains, (6) the poor characterization of the optimal duration of beta-lactam levels exceeding the MIC of the infective pathogen, when available, or of the optimal PK target in case of empirical therapy, and (7) the time needed to obtain the MIC of the infective pathogen, which precludes adequately targeted therapy based on PK principles [81, 91]. Epidemiological cutoff (ECOFF) values may be an acceptable surrogate when the MIC is unavailable [92], but further studies are needed before routine TDM of beta-lactams becomes available in most ICUs. Interestingly, high beta-lactam concentrations may result in drug-related neurotoxicity, which represents another potential role for TDM in critically ill patients [93, 94].

Key questions about antimicrobials

New and long-established antimicrobials

Polymyxins are considered the cornerstone of therapy for infections due to extensively drug-resistant (XDR) Gram-negative bacteria, including carbapenem-resistant A. baumannii, P. aeruginosa, and Klebsiella pneumoniae. Of note, recent studies indicate that colistin and polymyxin B are associated with less renal and neurological toxicity than previously reported. Several questions remain incompletely addressed, including the need for and type of combination therapies, the optimal dosing regimen, ways to prevent the emergence of resistance, and the role of aerosolized therapy. Fosfomycin may also have a role in these infections.

Drugs newly approved or in late development phase mainly include ceftolozane–tazobactam, ceftazidime–avibactam, ceftaroline–avibactam, aztreonam–avibactam, carbapenems combined with new beta-lactamase inhibitors (e.g., vaborbactam, relebactam), cefiderocol, plazomicin, and eravacycline (Table 4). These drugs have mainly been tested in complicated urinary tract infection, complicated intra-abdominal infections (cIAI), or skin and soft tissue infection (SSTI). Limited data are currently available in ICU patients [95], notably for dosing optimization in severe MDRB infections. Piperacillin–tazobactam appears less effective than carbapenems in bloodstream infections caused by ESBL-E [96, 97]; however, ceftolozane–tazobactam and ceftazidime–avibactam might be considered as carbapenem-sparing options for the treatment of such infections in areas with a high prevalence of CRE. The question now is whether we should still spare carbapenems or rather spare the new antibiotics.

Table 4 Indications and doses of new and long-established antibiotics for treating MDR bacteria

In addition to glycopeptides, long-established antibiotics with activity against MRSA mainly include daptomycin (e.g., for bloodstream infections) and linezolid (e.g., for hospital-acquired pneumonia, HAP) [98, 99]. These alternatives may be preferred in patients with risk factors for AKI. Daptomycin appears safe even at high doses and in prolonged regimens, with rhabdomyolysis representing a rare, reversible side effect. Conversely, linezolid has been linked with several adverse events, most often associated with specific risk factors (e.g., renal impairment, underlying hematological disease, or extended therapy duration), suggesting a role for TDM in patients at high risk of toxicity. In addition, “new-generation” cephalosporins such as ceftaroline and ceftobiprole have been approved for the treatment of MRSA infections and seem promising in overcoming the limitations of older compounds. Other new agents with activity against MRSA include lipoglycopeptides (dalbavancin, oritavancin, and telavancin), fluoroquinolones (delafloxacin, nemonoxacin, and zabofloxacin), an oxazolidinone (tedizolid), a dihydrofolate reductase inhibitor (iclaprim), and a tetracycline (omadacycline); yet, the value of these new options remains to be investigated in critically ill patients with severe MRSA infection [100].

Single-drug or combination regimen

The question of whether antibiotic combinations provide a beneficial effect beyond the empirical treatment period remains unsettled. Meta-analyses of randomized controlled trials (RCTs) comparing beta-lactams vs. beta-lactams combined with another agent demonstrate no difference in clinical outcomes in a variety of infections caused by Gram-negative pathogens; however, patients with sepsis or septic shock were underrepresented [101, 102]. In contrast, a meta-analysis of randomized and observational studies focused on sepsis or septic shock showed that combination therapy is beneficial in high-risk patients (i.e., projected mortality rate greater than 25%) [103]. This positive impact may be especially pronounced in neutropenic patients and when a pathogen with reduced antimicrobial susceptibility is involved (e.g., P. aeruginosa) [104].

To date, no RCT has examined whether combination therapy is superior to monotherapy for CRE infections. Observational studies suggest that the benefit of combination therapy is mainly observed in patients with serious underlying diseases or a high pretreatment probability of death (e.g., septic shock) [105,106,107,108,109]. The most effective regimen is challenging to define, as only one of the aforementioned studies reported a survival benefit with a specific drug combination (colistin plus tigecycline plus meropenem) after adjustment for potential confounders [109].

Although there have been five RCTs and several meta-analyses for the treatment of carbapenem-resistant A. baumannii infections, the optimal treatment regimen has not yet been determined [110,111,112,113,114,115]. Notably, none of the RCTs demonstrated a survival benefit with combination therapy, although one study showed a better clinical response with colistin plus high-dose ampicillin/sulbactam and three studies reported faster microbiological clearance when combining colistin with rifampin or fosfomycin. A recent meta-analysis, however, demonstrated survival benefit in bacteremic patients who were receiving high doses of colistin (more than 6 MIU per day) in combination with another agent [116].

Continuous, prolonged, or intermittent administration of beta-lactams and other time-dependent antimicrobials

The proportion of the dosing interval with drug concentration above the pathogen MIC is predictive of efficacy for time-dependent antibiotics, including beta-lactams. This parameter may be increased by reducing the dosing interval and/or by using extended infusions (EI) over 3–4 h or continuous infusion (CI). Stochastic models show that prolonged beta-lactam infusions increase the probability of target attainment against isolates with borderline MICs, especially in patients with ARC or an increased volume of distribution [117].

Most RCTs comparing intermittent versus prolonged beta-lactam infusions could not find significant differences in outcomes. However, in a recent meta-analysis of RCTs comparing prolonged (EI or CI) and intermittent infusions of antipseudomonal beta-lactams in patients with sepsis, prolonged infusion was associated with improved survival, including when carbapenems or beta-lactam/beta-lactamase inhibitor combinations were analyzed separately [118]. Prolonged infusions might be needed only in some patients, e.g., those with beta-lactam underdosing under intermittent administration schemes or infections caused by isolates with elevated MICs. Because these features cannot be anticipated, it seems reasonable to consider the use of prolonged infusions of sufficiently stable antipseudomonal beta-lactams in all patients with sepsis.
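
The PK/PD rationale for prolonged infusions can be illustrated with a deliberately simplified one-compartment simulation (zero-order infusion, first-order elimination). All parameters below are hypothetical and the sketch is not dosing guidance; it merely shows how, at a fixed daily dose, lengthening the infusion raises the fraction of the dosing interval spent above a borderline MIC:

```python
def ft_above_mic(dose_mg: float, interval_h: float, infusion_h: float,
                 vd_l: float, cl_l_per_h: float, mic_mg_per_l: float,
                 n_doses: int = 12, dt: float = 0.005) -> float:
    """Fraction of the last dosing interval with concentration above the
    MIC (fT>MIC), one-compartment model, zero-order infusion, first-order
    elimination, simple Euler integration."""
    ke = cl_l_per_h / vd_l            # elimination rate constant (1/h)
    rate = dose_mg / infusion_h       # mg/h while the infusion runs
    steps = int(round(n_doses * interval_h / dt))
    last_start = steps - int(round(interval_h / dt))
    conc, t, above = 0.0, 0.0, 0.0
    for i in range(steps):
        infusing = (t % interval_h) < infusion_h
        conc += ((rate / vd_l if infusing else 0.0) - ke * conc) * dt
        t += dt
        if i >= last_start and conc > mic_mg_per_l:
            above += dt
    return above / interval_h

# Hypothetical patient (Vd 25 L, clearance 8 L/h) and isolate (MIC 8 mg/L),
# same 3 g daily dose given as 1 g every 8 h:
short = ft_above_mic(1000, 8, 0.5, 25, 8, 8)       # 30-min infusions
extended = ft_above_mic(1000, 8, 4.0, 25, 8, 8)    # 4-h extended infusions
continuous = ft_above_mic(1000, 8, 8.0, 25, 8, 8)  # continuous infusion
print(round(short, 2), round(extended, 2), round(continuous, 2))
```

With these assumed values, the 30-min scheme keeps concentrations above the MIC for roughly two-thirds of the interval, the 4-h scheme for most of it, and the continuous infusion for all of it, mirroring the probability-of-target-attainment argument above.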

For some other drugs, such as vancomycin, the ratio of the area under the concentration–time curve to the MIC (AUC/MIC) is considered the PK/PD parameter predictive of efficacy (Fig. 3). A recent meta-analysis suggested that continuous vancomycin infusion is associated with lower nephrotoxicity but not with better cure rates or lower mortality than intermittent infusion [119]; nevertheless, the included studies had many limitations, and further investigations are needed to address this issue.
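
Because the 24-h AUC at steady state equals the daily dose divided by clearance, whatever the infusion scheme, the AUC/MIC ratio can be estimated from clearance alone. A minimal sketch with hypothetical values (for vancomycin, commonly cited targets are an AUC24/MIC of roughly 400 or above):

```python
def auc24_over_mic(daily_dose_mg: float, clearance_l_per_h: float,
                   mic_mg_per_l: float) -> float:
    """Steady-state AUC(0-24 h)/MIC. At steady state the 24-h AUC
    (mg*h/L) equals daily dose / total clearance, independently of
    whether the drug is given intermittently or continuously."""
    auc24 = daily_dose_mg / clearance_l_per_h
    return auc24 / mic_mg_per_l

# Hypothetical patient: 2000 mg/day, drug clearance 4 L/h, MIC 1 mg/L
print(auc24_over_mic(2000.0, 4.0, 1.0))  # 500.0
```

That the result does not depend on the administration scheme is consistent with the meta-analysis above finding no efficacy difference between continuous and intermittent vancomycin infusion.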

De-escalation: impact in practice

Conceptually, de-escalation is a strategy whereby effective antibiotic treatment is provided while minimizing unnecessary exposure to broad-spectrum agents that would promote the development of resistance. Practically, it consists of reappraising antimicrobial therapy as soon as AST results are available. However, no clear consensus on the components of de-escalation exists, and various definitions have been used (e.g., changing the “pivotal” agent for a drug with a narrower spectrum and/or lower ecological effects on the microbiota, or discontinuing an antimicrobial combination), resulting in equivocal interpretation of the available evidence [120, 121].

De-escalation is applied in only 40–50% of inpatients with bacterial infection [121]. This reflects physician reluctance to narrow the covered spectrum when caring for severely ill patients with culture-negative sepsis and/or MDRB carriage [120]. Importantly, the available evidence does not suggest a detrimental impact of de-escalation on outcomes [120, 122], including in high-risk patients such as those with bloodstream infections, severe sepsis, VAP, and neutropenia [123, 124]. However, further well-designed RCTs are needed to definitively settle this issue.

Increasing physician confidence in, and compliance with, de-escalation has become a cornerstone of ASP. Paradoxically, there is a lack of clinical data regarding the impact of de-escalation on antimicrobial consumption and the emergence of resistance [120]. While this strategy has been associated with reduced use of certain antimicrobial classes [125, 126], no study has demonstrated that it decreases overall antimicrobial consumption, and an increase in antibiotic exposure has even been observed [123, 125, 127]. Similarly, the few studies that addressed this point reported no impact, or only a marginal effect, of de-escalation on the individual hazard of MDRB acquisition or the local prevalence of MDRB [125,126,127].

In light of these uncertainties, efforts should focus on microbiological documentation to increase rates of antimicrobial de-escalation (ADE) in patients with sepsis. New diagnostic tools should be exploited to hasten pathogen identification and AST availability. Lastly, human data on the specific impact of each antimicrobial on commensal ecosystems and the risk of MDRB acquisition are needed to optimize antibiotic streamlining and further support de-escalation strategies [37, 128].

Duration of antibiotic therapy and antibiotic resistance

Prolonged courses of antibiotic therapy have been associated with the emergence of antimicrobial resistance [129]. Yet short-course antibiotic therapy has been shown to be effective and safe in a number of infections, including community-acquired pneumonia, VAP, urinary tract infections, cIAI, and even some types of bacteremia [130,131,132,133,134,135,136]. The shortening of antibiotic durations on the basis of PCT kinetics has also been shown to be safe, including in patients with sepsis [51, 52]. However, the recent ProACT trial failed to confirm the ability of PCT guidance to reduce the duration of antibiotic exposure compared with usual care in suspected lower respiratory tract infections [137]. Given the high rates of algorithm overruling by physicians in available RCTs and the relatively long durations of therapy in control groups, the question remains unresolved. In particular, the efficacy and costs of PCT guidance when an active ASP is already in place remain to be evaluated.
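PCT-guided discontinuation algorithms in the trials cited above typically encouraged stopping antibiotics once PCT fell below an absolute threshold or had dropped by a large fraction of its peak value. The sketch below illustrates the general shape of such a rule; the thresholds (0.5 µg/L, 80% decrease from peak) are illustrative assumptions rather than a clinical prescription, and any such rule can of course be overruled by the treating physician.

```python
def suggest_stop(pct_values, abs_threshold=0.5, rel_drop=0.80):
    """Return True if a PCT-kinetics rule would encourage stopping antibiotics.

    pct_values: chronological procalcitonin measurements in ug/L.
    Thresholds are illustrative only; clinical judgment always overrides.
    """
    if not pct_values:
        return False
    peak = max(pct_values)
    current = pct_values[-1]
    below_absolute = current < abs_threshold
    large_relative_decrease = peak > 0 and (peak - current) / peak >= rel_drop
    return below_absolute or large_relative_decrease

# Peak 10 ug/L, now 0.4 ug/L: both criteria met, rule suggests stopping
print(suggest_stop([10.0, 4.2, 1.1, 0.4]))  # True
# Persistently elevated PCT: rule suggests continuing
print(suggest_stop([10.0, 9.0, 8.5]))       # False
```

The "overruling" discussed above corresponds precisely to clinicians continuing therapy despite the rule returning `True`, which dilutes the measurable effect of PCT guidance in intention-to-treat analyses.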

Many national and international guidelines encourage physicians to shorten the overall duration of antibiotic therapy for a number of infections. Shorter courses are now recommended for pneumonia, urinary tract infections, and cIAI with adequate source control [49, 138,139,140,141,142]. Despite these recommendations, recent studies suggest that excessively long courses of antibiotics are still being administered, offering further opportunities for ASP [143, 144]. Clinicians should nonetheless be aware that, under some circumstances, short-course therapy may be detrimental to patient outcomes, especially in cases of prolonged neutropenia, lack of adequate source control, infection due to XDR Gram-negative bacteria, and endovascular or foreign-body infections [130, 145].

Source control

Source control to eliminate infectious foci follows the principles of drainage, debridement, device removal, and compartment decompression, often with deferred definitive restoration of anatomy and function [146]. When required, source control is a major determinant of outcome, more so than early adequate antimicrobial therapy [147,148,149], and should never be considered “covered” by broad-spectrum agents. Surgical and radiological options for intervention must therefore be systematically discussed, especially in patients with cIAI or SSTI. The efficacy of source control is time-dependent [150,151,152,153], and adequate procedures should be performed as rapidly as possible in patients with septic shock [49], whereas longer delays may be acceptable in closely monitored stable patients. Failure of source control should be suspected in cases of persistent or new organ failure despite resuscitation and appropriate antimicrobial therapy, and requires (re)imaging and repeated or alternative intervention. Importantly, source control procedures should include microbiological sampling whenever possible to facilitate ADE initiatives.

Antibiotic stewardship programs in the ICU

Implementing ASP in the ICU improves antimicrobial utilization and reduces broad-spectrum antimicrobial use, the incidence of MDRB infection and colonization, antimicrobial-related adverse events, and healthcare-associated costs, all without an increase in mortality [26, 154, 155]. According to the ESCMID Study Group for Antimicrobial Stewardship, ASP should be approached as “a coherent set of actions which promote using antimicrobials in ways that ensure sustainable access to effective therapy for all who need them” [156]. Therefore, ASP should be viewed as a quality improvement initiative, requiring (1) an evidence-based, ideally bundled, change package, (2) a clear definition of goals, indicators, and targets, (3) a dynamic measurement and data collection system with feedback to prescribers, (4) a strategy for building capacity, and (5) a plan to identify areas for improvement and close quality gaps. This necessarily implies appointing a member of the ICU staff as a leader with expertise in antimicrobial therapy and prespecified functions for implementing the local ASP.

Three main kinds of interventions may be used in ASP [157,158,159]:

  • Restrictive, in which one tries to reduce the number of opportunities for bad behavior, such as formulary restrictions, pre-approval by a senior ASP physician (either an external infectious disease specialist or a designated expert within the ICU team), and automatic stop orders

  • Collaborative or enhancement, in which one tries to increase the number of opportunities and decrease barriers for good behavior, such as education of prescribers, implementation of treatment guidelines, promotion of ADE, use of PK/PD concepts, and prospective audit and feedback to providers

  • Structural, which may include the use of computerized antibiotic decision support systems, faster diagnostic methods for antimicrobial resistance, antibiotic consumption surveillance systems, ICU leadership commitment, staff involvement, and daily collaboration between ICU staff, pharmacists, infection control units, and microbiologists

The implementation of ASP should take into account the need for a rapid response from the system in cases of severe infection; for example, delaying the first antimicrobial dose because of overly restrictive pharmacy-driven prescription policies should be regarded as unacceptable.
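As a minimal sketch of the “automatic stop order” component of restrictive interventions, an electronic prescribing system can flag antibiotic orders that run past a prespecified review horizon without reassessment. The indications and day limits below are arbitrary placeholders for illustration, not recommended durations.

```python
from datetime import date, timedelta

# Hypothetical review horizons (days) per indication; placeholders, not guidelines.
REVIEW_AFTER = {"empirical_sepsis": 3, "vap": 7, "uti": 5}

def needs_stop_or_review(indication: str, start: date, today: date,
                         horizons: dict = REVIEW_AFTER) -> bool:
    """Flag an antibiotic order whose duration exceeds its review horizon.

    Unknown indications default to reassessment after 72 h, consistent with
    the principle of early systematic reappraisal of empirical therapy.
    """
    limit = horizons.get(indication, 3)
    return today - start > timedelta(days=limit)

# A VAP course running 9 days without reassessment gets flagged
print(needs_stop_or_review("vap", date(2018, 9, 1), date(2018, 9, 10)))  # True
```

A flag of this kind would typically trigger prospective audit and feedback rather than an abrupt discontinuation, preserving the rapid-response requirement noted above.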

There is consensus that an ASP should rest on multifaceted interventions to achieve its fundamental goals (Table 5), namely improving outcomes and decreasing antimicrobial-related collateral damage in infected patients. Yet the weight of each component must be tailored to the context and culture of each individual ICU in terms of antibiotic prescribing habits, MDRB prevalence, local organizational aspects, and available resources. For this purpose, concepts of implementation science should be applied: identifying the barriers and facilitators that affect staff compliance with guidelines in order to design and execute a structured plan for improvement [160].

Table 5 Implementation and objectives of antibiotic stewardship programs in the ICU

The appropriate dashboard in the ICU

The availability of constantly updated information is pivotal to improving decision-making processes in the ICU [161, 162]. As the epidemiology of MDRB is continuously evolving, close monitoring of local resistance patterns may help rationalize the empirical use of broad-spectrum antibiotics in this setting. With the expanding use of electronic medical records and applications specifically developed for the ICU, streaming analytics can provide dashboards containing real-time, easily accessible data for intensivists [162, 163]. Such dashboards should capture data from medical records and microbiology systems, display an intuitive and user-friendly interface, and be available on both ICU computers and mobile devices to allow easy access to actionable data at the bedside. Finally, a complete dashboard should include information not only on the dynamics of resistance patterns but also on local antimicrobial consumption, adherence to protocols of care and antibiotic guidelines, healthcare-associated infections (e.g., source, type, severity), and general patient characteristics (e.g., comorbidities, severity of illness, main diagnosis, and length of ICU stay) (Fig. 4). Although studies demonstrating the efficacy of such dashboards in reducing resistance have not been published so far, these tools could enable the structured audit-and-feedback approach that is one of the cornerstones of ASP implementation in the ICU [164,165,166].

Fig. 4

A dashboard of dynamic and near real-time assessment of multi-resistance patterns in the ICU. HAI hospital-acquired infection, LOS length of stay, ICU intensive care unit
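As an illustration of the metrics such a dashboard might aggregate, the sketch below computes two standard stewardship indicators from unit-level records: antimicrobial consumption expressed in defined daily doses (DDD) per 1000 patient-days, a widely used WHO-derived metric, and the proportion of tested isolates that are multidrug-resistant. All field names and figures are invented for illustration; a real dashboard would pull these aggregates from the medical-record and microbiology systems mentioned above.

```python
from dataclasses import dataclass

@dataclass
class MonthlyUnitData:
    # Hypothetical aggregates from medical-record and laboratory systems
    antibiotic_ddd: float      # total defined daily doses dispensed
    patient_days: int          # occupied bed-days over the period
    isolates: int              # bacterial isolates tested
    resistant_isolates: int    # isolates classified as multidrug-resistant

def consumption_per_1000_pd(d: MonthlyUnitData) -> float:
    """Standard stewardship indicator: DDD per 1000 patient-days."""
    return 1000 * d.antibiotic_ddd / d.patient_days

def resistance_rate(d: MonthlyUnitData) -> float:
    """Proportion of tested isolates that were multidrug-resistant."""
    return d.resistant_isolates / d.isolates if d.isolates else 0.0

march = MonthlyUnitData(antibiotic_ddd=450.0, patient_days=600,
                        isolates=80, resistant_isolates=12)
print(round(consumption_per_1000_pd(march), 1))  # 750.0
print(round(resistance_rate(march), 3))          # 0.15
```

Tracking these two indicators month by month, alongside adherence and infection data, is the kind of audit-and-feedback loop the dashboard is meant to support.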

Concluding remarks

Both the poor outcomes associated with bacterial sepsis and the current epidemiology of MDRB underscore the urgent need to improve the management of antibiotic therapy in ICU patients. Well-designed studies are still warranted to definitively address several aspects of this issue, notably the clinical yield of rapid diagnostic tools and TDM, the potential benefit of combination versus single-drug therapies, the optimal dosing regimens before the availability of AST results or in patients with culture-negative sepsis, and the prognostic yield of ASP. Although beyond the scope of this review, other research axes may further help control the spread of MDRB in the ICU setting, including the optimization of infection control policies [167], a comparative appraisal of the impact of broad-spectrum antibiotics on the gut microbiota through novel metagenomic approaches [168], and the evaluation of emerging options such as orally administered antimicrobial-adsorbing charcoals, probiotics, and fecal microbiota transplantation to protect or restore the commensal ecosystems of ICU patients [29, 169, 170].