Reports of the successful treatment of life-threatening infections early in the 1940s opened the “antibiotic era”. Stimulated by widespread use during the Second World War and by an impressive industrial effort to develop and produce antibiotics, this major advance in the history of medicine is nowadays compromised by the universal spread of antibiotic resistance, which has largely escaped from hospitals and threatens to project the human race into the post-antibiotic era [1].

Looking back at this incredible 75-year-long saga, we should emphasize that antibiotic resistance was described in parallel with the first antibiotic use and that a direct link between exposure and resistance was recognized in the late 1940s [2].

This review will focus on the most severely ill patients, for whom significant progress in organ support as well as in diagnostic and therapeutic strategies has markedly increased the risk of developing hospital-acquired infections (HAI); currently, most ICUs are confronted with the challenge of multidrug-resistant (MDR) organisms [3].

Epidemiology of highly resistant bacteria worldwide with a focus on Europe

Antimicrobial multidrug resistance (MDR) is now prevalent all over the world [4, 5], with extensive drug resistance (XDR) and pandrug resistance (PDR) [6] being encountered increasingly often, especially among HAI occurring in large, highly specialized hospitals. The emergence of antimicrobial resistance is largely attributed to the indiscriminate and excessive use of antimicrobials in society, and particularly in the healthcare setting, and to an increasing spread of resistance genes between bacteria and of resistant bacteria between people and environments. Even in areas hitherto known for having minor resistance problems, 5–10 % of hospitalized patients on a given day harbor extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae in their gut flora, as seen in a recent French study conducted in ICUs [7].

Microorganisms can become resistant through a growing number of mechanisms: the variety of resistance genes is increasing, as are the number of clones within a species carrying resistance and the number of species harboring resistance genes. All of this leads to an accelerating development of resistance, further enhanced by the fact that the more resistance genes there are, the higher the probability that one or more of them will end up in a so-called successful clone, with exceptional abilities to spread, infect, and cause disease [8]. When this happens, the world faces major “outbreaks (epidemics) of antimicrobial resistance”. Sometimes these are local, occurring as outbreaks of Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, or Enterococcus faecalis or E. faecium (the latter species being very difficult to treat when glycopeptide resistant) in highly specialized healthcare settings such as neonatal and other ICUs, hematological wards, and transplant units. With the aforementioned bacteria, major problems evolved in transplant units on the east coast of the USA and in Israel, Greece, and later Italy. Today smaller or larger outbreaks with multidrug-resistant E. coli and K. pneumoniae are seen all over the world. Recently, several of these clones have also shown resistance to last-resort agents such as colistin, making the situation desperate in some areas, in some hospitals, and for some patients. Since there is no influx of truly new antimicrobial agents and limited evidence of reversibility of antimicrobial resistance [9], the future looks grim. Another sign of our desperation is the increasing interest in redeveloping old antimicrobials [10].

Microbiological issues: breakpoints, epidemiological cut-offs, and other susceptibility and identification problems

Methods in clinical microbiology for species identification and antimicrobial susceptibility testing (AST) have recently improved dramatically. Consequently, it is today possible for the clinical microbiology laboratory to drastically improve its turnaround time (the time used by the laboratory to receive, process, and make available a report).

Time-of-flight mass spectrometry has brought the time required for species identification down from 1 day, and sometimes several days, to 1 h for both bacteria and fungi. Blood cultures are most often positive within 8–20 h of the start of incubation [11]. With the novel techniques, species identification is feasible directly on the positive blood culture bottle [12] and within 1 h of a positive blood culture signal. Time wasted on the transportation of blood culture bottles therefore becomes important: ideally, bottles should be under incubation within 1 h of inoculation, and delays exceeding 4 h constitute malpractice.

AST can be performed using traditional phenotypic methods. These are based on the minimum inhibitory concentration (MIC) of the antibiotic and the application of breakpoints to categorize isolates as susceptible (S), intermediate (I), or resistant (R). AST can also be performed using genotypic methods, most commonly direct gene detection using polymerase chain reaction assays for specific resistance genes, such as the mecA gene encoding methicillin resistance in staphylococci or the vanA gene encoding glycopeptide resistance in enterococci and staphylococci. The research community is currently exploring the use of whole genome sequencing for AST. The advantage of phenotypic methods is that they are quantitative and can predict both susceptibility and resistance. Phenotypic methods, both MIC determination and disk diffusion, traditionally need 18 h of incubation, constituting the classical “overnight” incubation. However, if traditional systems are recalibrated, the incubation time can be brought down to 6–12 h depending on the microorganism and the resistance mechanism. The advantage of genotypic methods is that they are rapid and specific, but so far they predict only resistance and are not quantitative.
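
As a minimal illustration of the phenotypic S/I/R categorization described above, the sketch below compares a measured MIC against a pair of breakpoints. The drug name and breakpoint values are invented for the example; real interpretations must use the current EUCAST or CLSI tables.

```python
# Illustrative sketch of phenotypic AST interpretation: an isolate's measured
# MIC is compared against clinical breakpoints to yield an S/I/R category.
# The drug and breakpoint values below are placeholders, NOT real EUCAST/CLSI
# figures; laboratories must use the current published tables.

BREAKPOINTS = {
    # drug: (susceptible_max_mic, resistant_min_mic), in mg/L (hypothetical)
    "examplecillin": (2.0, 8.0),
}

def categorize(drug: str, mic: float) -> str:
    """Return 'S', 'I', or 'R' for a measured MIC (mg/L)."""
    s_max, r_min = BREAKPOINTS[drug]
    if mic <= s_max:
        return "S"   # susceptible at standard dosing
    if mic < r_min:
        return "I"   # intermediate category between the two breakpoints
    return "R"       # resistant

print(categorize("examplecillin", 1.0))   # S
print(categorize("examplecillin", 4.0))   # I
print(categorize("examplecillin", 16.0))  # R
```

Note that genotypic methods short-circuit this logic entirely: detecting mecA, for example, reports resistance without yielding an MIC.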

There is now international agreement on a standard method for the determination of the MIC, but for breakpoints there is still more than one system. Europe had for many years seven systems in use, including the Clinical and Laboratory Standards Institute (CLSI) system from the USA. Over the period 2002–2010, the European Committee on Antimicrobial Susceptibility Testing (EUCAST) managed to unite the six European national systems and harmonize breakpoints across Europe. Since then, many non-European countries have joined EUCAST. Both EUCAST [13] and CLSI [14] breakpoints are available in all internationally used susceptibility testing methods. EUCAST recommendations are freely available on the Internet [15], whereas the CLSI recommendations must be purchased. As a general rule, EUCAST clinical breakpoints are somewhat lower than CLSI breakpoints, mainly because EUCAST breakpoints were systematically revised on the basis of recent information, whereas many CLSI breakpoints went unreviewed and unrevised for 15–25 years.

Antibiotic stewardship

Antibiotic stewardship programs are targeted at education and assistance for the optimal choice, dosage, and duration of antibiotics, in order to improve outcome and reduce the development of resistance; for critically ill patients they translated into the implementation of specific guidelines, largely promoted by the Surviving Sepsis Campaign [16, 17]. Very early and adequate antibiotic treatment significantly improved the outcome of critically ill patients suffering from severe infections [18].

However, the very early start of antibiotics decreases the proportion of infections with microbiological documentation, which is mandatory for successful subsequent de-escalation [19]. Adequate coverage of potentially resistant microorganisms results in a vicious circle characterized by a progressive enlargement of the spectrum to be covered (Fig. 1). The concept of antibiotic stewardship has therefore progressively evolved toward individualized prescription, introducing new concepts such as avoiding unnecessary administration of broad-spectrum antibiotics and systematic de-escalation [20].

Fig. 1

Vicious circle starting from the implementation of early and adequate empirical antibiotic treatment. Adequate coverage of potentially resistant microorganisms results in a vicious circle characterized by the need to enlarge the spectrum to be covered, with a further continuous increase in the proportion of resistant microorganisms resulting in a progressive increase of inadequate empirical treatments and of deaths from bloodstream infections (BSI), ventilator-associated pneumonia (VAP), and surgical site infections (SSI)

This has been further demonstrated to have a positive impact in critically ill settings. A systematic review of 24 studies published from 1996 to 2010, including only three randomized prospective studies and three interrupted time series, showed that despite their relatively low quality, antibiotic stewardship was in general beneficial [21]. This strategy achieved a reduction in the use of antimicrobials (by 11–38 % of defined daily doses), lower antimicrobial costs (US$5–10 per patient-day), a shorter average duration of treatment, less inappropriate use, and fewer adverse events. Interventions lasting beyond 6 months resulted in reductions in the rate of antimicrobial resistance. Importantly, antibiotic stewardship was not associated with increases in nosocomial infection rates, length of stay, or mortality.

Moreover, in the context of high endemicity for methicillin-resistant S. aureus (MRSA), antibiotic stewardship combined with improved infection control measures achieved a sustainable reduction in the rate of hospital-acquired MRSA bacteremia [22].

Accordingly, as strongly recommended by ICU experts and supported by several national and international initiatives, antibiotic stewardship programs should be developed and implemented in every ICU or institution in charge of critically ill patients (Table 1).

Table 1 Suggested components of ICU-specific antibiotic stewardship programs

Treatment strategies

Adequate, prompt therapy and duration

Inappropriate antimicrobial therapy, meaning the selection of an antibiotic to which the causative pathogen is resistant, is a consistent predictor of poor outcomes in septic patients [23]. On the other hand, several studies have shown that prompt appropriate antimicrobial treatment is a life-saving approach in the management of severe sepsis [24–26]. The concept of “adequate antimicrobial therapy” was defined as an extension of “appropriate antimicrobial therapy”, meaning appropriate and early therapy at optimized doses and dose intervals.

The most impressive data probably come from Kumar et al. [27], who showed that inadequate initial antimicrobial therapy for septic shock was associated with a fivefold reduction in survival (52.0 vs. 10.3 %) and remained highly associated with the risk of death even after adjustment for other potential risk factors (odds ratio (OR) 8.99). Differences in survival were seen in all major epidemiologic, clinical, and organism subgroups, ranging from 2.3-fold for pneumococcal infection to 17.6-fold for primary bacteremia. This means that, in severe infections, antimicrobial therapy must almost always be started empirically, before isolation and identification of the causative organism and determination of its susceptibility, to achieve timely initiation.

Unfortunately, the rising incidence of MDR microorganisms leads to at least one of two consequences: an increased incidence of inappropriate antimicrobial therapy or a higher consumption of broad-spectrum antibiotics [28]. The way out of this vicious spiral is to select the antibiotic offering the best possible combination of high effectiveness, for infection cure and relapse avoidance, on the one hand and low collateral damage on the other.

In a recent multicenter study on hospital-acquired bacteremia [25], the incidence of MDR and XDR microorganisms was 48 and 21 %, respectively; in the multivariable model, MDR isolation and time to adequate treatment were independent predictors of 28-day mortality. The same occurs in the community, with a rising prevalence of MDR bacteria, for instance among patients with community-acquired pneumonia admitted to the ICU (7.6 % in Spain and 3.3 % in the UK) [29]. There is thus evidence that infection by MDR bacteria often results in a delay in appropriate antibiotic therapy, leading to increased patient morbidity and mortality as well as prolonged hospital stay [30]. Conversely, there is also proof that appropriate initial antibiotic therapy, early ICU admission, and maximized microbiological documentation are modifiable process-of-care factors that contribute to an improved outcome [31].

To help reduce the duration of antibiotic treatment, the most promising parameter appears to be the plasma level of procalcitonin (PCT). Besides PCT, no other sepsis biomarker has achieved universal use throughout different healthcare settings in the last decade. High PCT concentrations are typically found in bacterial infection, in contrast to lower levels in viral infection and levels below 0.1 ng/mL in patients without infection. Furthermore, serum PCT concentrations are positively correlated with the severity of infection, and adequate antibiotic treatment leads to a decrease in serum PCT concentrations. A recent meta-analysis showed a significant reduction in the length of antibiotic therapy in favor of a PCT-guided strategy [32].
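
As an illustration only, the sketch below encodes a PCT-guided stopping rule of the kind evaluated in antibiotic-duration trials. The thresholds used (an absolute level below 0.5 ng/mL, or a fall of at least 80 % from the peak value) are representative of published protocols but are assumptions for this example, not a validated algorithm; clinical judgment always takes precedence over the biomarker.

```python
# Sketch of a PCT-guided stopping rule. Thresholds (0.5 ng/mL absolute,
# >= 80 % decline from peak) are illustrative of trial protocols, not a
# validated decision algorithm.

def may_stop_antibiotics(pct_series_ng_ml):
    """Given serial PCT measurements (ng/mL, oldest first), suggest
    whether stopping antibiotics can be discussed."""
    current = pct_series_ng_ml[-1]
    peak = max(pct_series_ng_ml)
    low_absolute = current < 0.5           # near-normal absolute level
    large_decline = current <= 0.2 * peak  # >= 80 % fall from peak
    return low_absolute or large_decline

print(may_stop_antibiotics([8.0, 4.1, 1.2]))  # True: 1.2 <= 0.2 * 8.0
print(may_stop_antibiotics([8.0, 6.0, 5.5]))  # False: neither criterion met
```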

Monotherapy versus combination therapy

Basically, there are two reasons to favor antibiotic combination therapy over monotherapy. The first is the potential synergy of combined antibiotics, maximizing clinical efficacy or preventing the development of resistance. Synergistic effects have been proven in vitro for many combinations of antibiotics, but a clear benefit has only been demonstrated in vivo in the treatment of invasive pneumococcal disease and of toxic shock syndrome; in these two clinical situations, the combination of β-lactams with macrolides or lincosamides, respectively, was able to inhibit bacterial pathogenicity factors and was associated with improved survival. The second reason for selecting a combination therapy is the desired extension of the expected spectrum of pathogens (so-called gap closing). However, none of the randomized controlled trials (RCTs) in patients with sepsis could demonstrate any advantage of combination therapies. A meta-analysis of RCTs comparing combination therapy with a β-lactam plus an aminoglycoside versus monotherapy with a β-lactam alone did not show any benefit of the combination in terms of morbidity and mortality [33]. Similarly, a subgroup analysis of septic patients in a Canadian ventilator-associated pneumonia (VAP) study comparing meropenem monotherapy with meropenem/ciprofloxacin combination therapy found no differences in morbidity, mortality, or adverse reactions between the groups [34]. Finally, the MAXSEPT study of the German Study Group Competence Network Sepsis (SepNet), comparing meropenem with the meropenem/moxifloxacin combination in severe sepsis and septic shock, could not demonstrate a difference between the groups either in changes in SOFA scores or in 30-day mortality [35].

Results in favor of combination therapy have been found in cohort studies only, in which combination therapy was associated with lower mortality, especially in patients with septic shock [36, 37]. The drawback of these studies is that various β-lactams and various combination partners were used.

An important difference between these observational studies and RCTs should be mentioned: the former were conducted in countries with significantly higher rates of MDR pathogens than the latter. In both the Canadian and the MAXSEPT studies mentioned above, resistant pathogens were found in less than 10 % of cases. This suggests that combination therapy may be rational when, as a result of the anticipated resistance pattern, treatment failure of β-lactam monotherapy is likely. Several observational studies looking at carbapenem-resistant (CR) pathogens demonstrated a survival benefit for a combination of a carbapenem together with colistin and/or tigecycline in comparison to meropenem alone [38, 39]. In addition, intravenous fosfomycin has recently been administered as part of combination regimens in patients with XDR K. pneumoniae infections to improve effectiveness and decrease the rate of emergence of resistance [40]. Aminoglycosides have also recently been used in combination regimens in patients with difficult-to-treat infections, including XDR K. pneumoniae infections [41].

In conclusion, combination therapy should be recommended only in patients with severe sepsis and when an infection with a resistant organism is likely. De-escalation to an effective monotherapy should be considered when an antibiogram is available.

De-escalation

De-escalation of antimicrobial therapy is often advocated as an integral part of antibiotic stewardship programs [16, 17] and has been defined as reducing the number of antibiotics used to treat an infection as well as narrowing the spectrum of the antimicrobial agent. Essentially this is a strategy applicable after starting broad-spectrum empirical therapy, prior to identification of the causative pathogen. Intuitively this would limit the applicability of de-escalation as a concept in MDR infections, because after identification of an MDR pathogen the opposite, escalation of antimicrobial therapy, is often required. Depending on the pathogen involved, multiple antibiotics may be required, even within the same antibiotic class, such as combination therapy with ertapenem and another carbapenem for K. pneumoniae carbapenemase (KPC) producers.

Not surprisingly, several clinical studies found that the presence of MDR pathogens was a motivation not to de-escalate [42, 43], even when the patient was colonized with MDR organisms at sites other than the infection site. Other studies have reported de-escalation to be safe in MDR-colonized patients [44]. This does not mean, however, that the concept of de-escalation is not applicable in MDR infections. Depending on antibiotic susceptibility, an antimicrobial agent with a narrower spectrum may be available, or combination therapy may be stopped after initial therapy. If the infection is resolving, de-escalation may prove a valid option to avoid further antibiotic pressure [45]. However, the value of de-escalation in the setting of MDR infections has not been extensively explored, and its application should be considered on an individual patient basis [45]. Retrospective analyses have found de-escalation to be safe when applied in selected patients [23], but in a recent non-blinded RCT [19], de-escalated patients had more superinfections and greater antibiotic use. Although intuitively logical and frequently suggested [16, 46, 47], de-escalation of antibiotic therapy has not been clearly associated with lower rates of antibiotic resistance development.

PK/PD optimization

Antimicrobial resistance, as defined in the clinical laboratory, often translates into insufficient in vivo exposures that result in poor clinical and economic outcomes [48]. In addition to resistance, it is increasingly recognized that the host’s response to infection may in itself contribute to considerable reductions in antimicrobial exposure through alterations in the cardiovascular, renal, hepatic, and pulmonary systems [49]. A recent ICU study highlighted the potential impact of adaptations in the renal system: more than 65 % of critically ill patients manifested augmented renal function, defined as a creatinine clearance ≥130 mL/min/1.73 m², during their initial week of hospitalization. This finding is particularly concerning because the backbones of most antimicrobial regimens in the ICU, such as penicillins, cephalosporins, and carbapenems, are predominantly cleared via the renal route [50]. To this end, Roberts and colleagues recently reported the results of a prospective, multinational pharmacokinetic study assessing β-lactam exposures in the critically ill population [51]. Among 248 infected patients, 16 % did not achieve adequate antimicrobial exposures, and these patients were 32 % less likely to have a satisfactory infection outcome. While a personalized dosing approach based on a given patient’s specific pharmacokinetic profile is not yet standard practice for β-lactams, as it is for the aminoglycosides and vancomycin, owing to the lack of routinely available drug assays, the pharmacokinetic and pharmacodynamic (PK/PD) profile of β-lactams as well as of other agents may be incorporated into treatment algorithms to optimize outcomes.
To this end, we developed and implemented pharmacodynamically optimized β-lactam regimens which incorporated higher doses as well as prolonged or continuous infusion to enhance in vivo exposure in patients with VAP [52]. Utilization of these regimens improved the clinical, microbiologic, and economic outcomes associated with this ICU-based infection. Clinicians need to recognize that in the early stages of infection, alterations in metabolic pathways as well as the reduced susceptibility of target pathogens may result in inadequate antimicrobial exposure if conventional dosing regimens are used. Therefore, pharmacodynamically optimized dosing regimens should be given to all patients empirically; once the pathogen, its susceptibility, and the clinical response are known, local stewardship practices may be employed to redefine an appropriate regimen for the patient [53].
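
A minimal sketch, with invented parameter values, of the exposure arithmetic behind such regimens: for a continuous infusion the steady-state concentration equals the infusion rate divided by drug clearance, and a commonly cited β-lactam target is a free concentration of roughly four times the MIC throughout the dosing interval. The example shows how augmented renal clearance can push a standard dose below target; it is illustrative reasoning, not dosing guidance.

```python
# PK sketch for a continuously infused, time-dependent antibiotic:
#   Css = infusion rate / clearance
# with an illustrative target of free concentration >= 4 x MIC.
# All parameter values below are hypothetical.

def css_mg_per_l(daily_dose_mg: float, clearance_l_per_h: float) -> float:
    """Steady-state concentration (mg/L) of a continuously infused drug."""
    infusion_rate = daily_dose_mg / 24.0      # mg/h
    return infusion_rate / clearance_l_per_h  # mg/L

def meets_target(css: float, free_fraction: float, mic: float,
                 multiple: float = 4.0) -> bool:
    """Does the free steady-state concentration reach `multiple` x MIC?"""
    return css * free_fraction >= multiple * mic

# Hypothetical patient: 6 g/day by continuous infusion, drug clearance
# 10 L/h, 80 % free drug, pathogen MIC 4 mg/L.
css = css_mg_per_l(6000, 10.0)      # 250 mg/h / 10 L/h = 25 mg/L
print(meets_target(css, 0.8, 4.0))  # True: 20 mg/L free >= 16 mg/L target

# Same dose with augmented renal clearance doubling drug clearance:
css_arc = css_mg_per_l(6000, 20.0)      # 12.5 mg/L
print(meets_target(css_arc, 0.8, 4.0))  # False: 10 mg/L free < 16 mg/L
```

The second case illustrates why conventional doses may silently underexpose patients with augmented renal function, motivating the higher-dose, prolonged-infusion regimens described above.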

Infection control measures

Depending on the ICU setting, one-third to one-half of patients may develop a nosocomial infection. Unfortunately, host susceptibility to infection can only be slightly modified, and the presence of microorganisms and the high density of care are unavoidable. Consequently, the prevention of infection dissemination relies on eliminating the means of transmission through infection control measures [54]. The global idea of isolation precautions combines the systematic use of standard precautions (hand hygiene, gloves, gowns, eye protection) and transmission-based precautions (contact, droplet, airborne) [55] (Table 2). Conceptually, the objective is not to isolate but to prevent transmission of microorganisms, by anticipating the potential route of transmission and the measures to be applied for each act of care.

Table 2 Principles of isolation precautions: standard and transmission-based precautions

Physical contact is the main route of transmission for the majority of bacteria, occurring via the hands of healthcare workers (HCWs) moving from a patient, or from contaminated surfaces/instruments nearby, to another patient during the process of care; however, this seems not to be true for certain bacteria, namely MRSA, as demonstrated in a recent study [56]. Hand hygiene (hand washing and alcohol hand-rub), patient washing, and surface cleansing efficiently reduce transmission [57, 58]. Transmission can also occur via droplets or airborne particles; the specific transmission-based precautions required to avoid infection are summarized in Table 2.

Isolation precautions have been widely disseminated and are nowadays the cornerstone of preventive measures used to control outbreaks, to decrease the rate of resistant microorganisms (MRSA, ESBL), and to limit the spread of emergent infectious diseases such as respiratory viruses (SARS, influenza, coronaviruses) or viral hemorrhagic fevers. In this context, enhanced adherence to appropriate isolation precautions can markedly decrease resistance dissemination and, potentially, the further need for broad-spectrum antibiotics.

Selective decontamination of the digestive tract (SDD) has been proposed to prevent endogenous and exogenous infections and to reduce mortality in critically ill patients. Although the efficacy of SDD has been confirmed by RCTs and systematic reviews, SDD has been the subject of intense controversy based mainly on insufficient evidence of efficacy and on concerns about resistance. A recent meta-analysis detected no relation between the use of SDD and the development of antimicrobial resistance, suggesting that the perceived risk of long-term harm related to selective decontamination cannot be justified by available data. However, the conclusions of the study indicated that the effect of decontamination on ICU-level antimicrobial resistance rates is understudied [59].

Although SDD provides a short-term benefit, neither a long-term impact nor a control of emerging resistance during outbreaks or in settings with high resistance rates can be maintained using this approach.

In the era of carbapenem resistance, antimicrobials such as colistin and aminoglycosides often represent the last option in treating multidrug-resistant Gram-negative infections. In this setting, the use of colistin should be carefully considered and possibly avoided during outbreaks due to resistant Gram-negative bacilli [60].

New therapeutic approaches

The antibiotic pipeline continues to diminish, and the majority of the public remains unaware of this critical situation. The causes of the decline in antibiotic development are multifactorial. Drug development in general faces increasing challenges, given the high costs involved, currently estimated in the range of US$400–800 million per approved agent. Furthermore, antibiotics have a lower relative rate of return on investment than other drugs because they are usually used in short-course therapies. In contrast, chronic diseases, including HIV and hepatitis, which require long-term and possibly lifelong treatments that suppress symptoms, represent more rational investment opportunities for the pharmaceutical industry. Ironically, antibiotics are victims of their own success: they are less attractive to drug companies because they are more effective than other drugs [61].

Numerous agencies and professional societies have highlighted the lack of new antibiotics, especially for MDR Gram-negative pathogens. Since 2004, repeated calls for reinvigorating pharmaceutical investment in antibiotic research and development have been made by the Infectious Diseases Society of America (IDSA) and several other notable bodies, including the Innovative Medicines Initiative (Europe’s largest public–private initiative), which funds COMBACTE [62, 63]. IDSA supported a program called “the 10 × ′20 Initiative”, with the aim of developing ten new systemic antibacterial drugs by 2020 through the discovery of new drug classes or of new molecules from already existing classes [63].

The current assessment of the pipeline (last updated August 2014) shows 45 new antibiotics in development or recently approved (Table 3). Of those, 14 are in phase 1 clinical trials, 20 in phase 2, and seven in phase 3; a new drug application has been submitted for one, and three were recently approved [64]. Five of the seven antibiotics in phase 3, as well as the drug submitted for review to the US Food and Drug Administration (FDA), have the potential to address infections caused by MDR Gram-negative pathogens, the most pressing unmet need [65]. Unfortunately, there are very limited new options for Gram-negative bacteria such as carbapenemase-producing Enterobacteriaceae, XDR A. baumannii, and P. aeruginosa. Aerosol administration of drugs seems to be a promising new approach for the treatment of MDR lung infections: nebulized antibiotics achieve good lung concentrations while reducing the risk of toxicity compared with intravenous administration. A new vibrating-mesh nebulizer used to deliver amikacin achieved high concentrations in the lower respiratory tract, with levels in epithelial lining fluid tenfold higher than the MIC90 of the bacteria normally responsible for nosocomial lung infections (8 μg/mL for P. aeruginosa) [66].

Table 3 Antibiotics currently in clinical development [54]

Pathogen-based approach

ESBL producers

Infections caused by extended-spectrum β-lactamase-producing Enterobacteriaceae (ESBL-PE) are difficult to treat owing to the resistance of these organisms to many antibiotics [67]. Since 2010, EUCAST and CLSI have recommended the use of alternatives to carbapenems to treat these organisms. Indeed, on the basis of antimicrobial susceptibility testing, β-lactam/β-lactamase inhibitor combinations (BLBLIs) and specific fourth-generation cephalosporins (i.e., cefepime) with greater stability against β-lactamases could theoretically be used to treat ESBL-PE infections [68]. As far as we know, these “alternatives” to carbapenems have not been evaluated in critically ill patients, and outside the ICU the data are still scarce and conflicting [69]. A post hoc analysis of patients with bloodstream infections due to ESBL-producing E. coli from six published prospective cohorts suggested that BLBLIs (including amoxicillin–clavulanic acid and piperacillin–tazobactam) were suitable alternatives to carbapenems [70]. A meta-analysis by Vardakas et al. [71] compared carbapenems and BLBLIs for bacteremia caused by ESBL-producing organisms; taking into account the considerable heterogeneity of the included trials, the fact that none was powered to detect outcome differences, and the fact that the most severely ill patients tended to receive carbapenems, there was no statistically significant difference in mortality between patients receiving BLBLIs or carbapenems as empirical or definitive therapy. Regarding the use of cefepime in this specific situation, the available data are more confusing, but this β-lactam seems safe for infections with isolates with an MIC of 1 mg/L or less [72–74]. For critically ill patients, dosages of 2 g every 12 h or higher are probably preferable, and, as suggested by recent studies in this specific situation, practitioners should be aware of the risk of suboptimal dosage.

According to all the recent data, carbapenems are still preferable to alternatives as empirical therapy for critically ill patients when ESBL-PE is suspected. Alternatives may be possible as definitive therapy once susceptibilities are known; in that case, high dosage and prolonged (semi-continuous) administration of β-lactams should be preferred.

Carbapenem-resistant Enterobacteriaceae (CRE)

The vast majority of CRE isolates are resistant to the most clinically reliable antibiotic classes leaving colistin, tigecycline, and gentamicin as the main therapeutic approaches, whereas several reports have revealed high risk of mortality associated with these infections [38, 75, 76]. Given the lack of data from randomized clinical trials, therapeutic approaches in CRE infections are based on the accumulating clinical experience and particularly from infections by K. pneumoniae producing either KPC or Verona integron-mediated metallo-β-lactamase (VIM). Recent evidence supports combination treatment containing two or three in vitro active drugs, revealing significant advantages over monotherapies in terms of survival [38, 39, 77, 78]. Although paradoxical, since KPC enzymes hydrolyse carbapenems, the most significant improvement seems to be obtained when the combination includes a carbapenem, providing substantial survival benefit in patients who are more severely ill and/or those with septic shock [78, 79]. Carbapenems’ in vivo activity against CRE was compatible with MICs reaching 8–16 mg/L [38, 77, 78], probably attributed to an enhanced drug exposure with high-dose/prolonged-infusion regimens of carbapenems. Aminoglycoside-containing combinations, particularly gentamicin, were associated with favorable outcomes compared to other combinations and could serve as a backbone, particularly in view of increasing rates of colistin resistance [38, 39, 78]. Colistin and tigecycline represent the remaining agents to be selected for the combination, based on the sensitivity pattern. A recently reported clinical success of 55 % in the treatment of infections by XDR and PDR pathogens with combinations of fosfomycin make it another therapeutic candidate, particularly in the treatment of Enterobacteriaceae against which susceptibility rates are promising [80]. 
High doses of fosfomycin (up to 24 g/day) and avoidance of monotherapy are strongly recommended in critically ill patients with MDR pathogens.

Failure of monotherapy with colistin or tigecycline may be explained by suboptimal drug exposure; recent PK/PD data favor dose escalation beyond the initially recommended regimens. A small single-center non-comparative study employing a loading dose (LD) of 9 MIU followed by 4.5 MIU bid, with adaptation according to renal function [81, 82], suggested that colistin monotherapy might be adequate [83]. A concise guide to the optimal use of polymyxins is given in the Electronic Supplementary Material. Higher tigecycline doses, up to 200 mg/day, may optimize its pharmacokinetics and result in improved clinical outcomes [84].

Double-carbapenem combinations, consisting of ertapenem as a substrate and doripenem or meropenem as the active compound, have recently proven successful in case series and small studies, even when the pathogen exhibited high carbapenem MICs [85]. Finally, decisions regarding empirical antibiotic treatment of critically ill patients must be based on sound knowledge of the local distribution of pathogens and on analysis of risk factors for infection caused by CRE [76, 86].

Pseudomonas aeruginosa

P. aeruginosa, along with E. coli, K. pneumoniae, and A. baumannii, is a leading pathogen in the ICU setting, causing severe infections (VAP, bacteremia) with mortality directly related to any delay in starting an appropriate antibiotic therapy [87].

In 2011, high percentages of P. aeruginosa isolates resistant to aminoglycosides, ceftazidime, fluoroquinolones, piperacillin/tazobactam, and carbapenems were reported from several countries, especially in Southern and Eastern Europe. Resistance to carbapenems was above 10 % in 19 of 29 countries reporting to the European Centre for Disease Prevention and Control (ECDC); MDR was also common, with 15 % of isolates resistant to at least three antimicrobial classes. Carbapenem-resistant P. aeruginosa now accounts for about 20 % of isolates in Italian ICUs, with a few strains (2–3 %) also resistant to colistin [88].

Primary regimens for susceptible isolates, depending on the site and severity of infection, are summarized in Table 4. A β-lactam antibiotic with antipseudomonal activity is generally preferred, administered as an extended infusion after an LD to rapidly achieve the pharmacodynamic target [89]. Although there is no clear evidence that combination therapy (i.e., a β-lactam plus an aminoglycoside or a fluoroquinolone) is superior to monotherapy [90], many clinicians adopt combinations for serious infections (bacteremia, VAP) and in patients with severe sepsis and septic shock. When combination therapy with an aminoglycoside (amikacin or gentamicin) is chosen, we recommend a maximum duration of 5 days.

Table 4 Primary regimens for treating P. aeruginosa infection

For infection caused by a strain susceptible only to colistin, a regimen of high-dose colistin (9 MU LD, then 4.5 MU bid) is recommended. Nebulized colistin should also be considered for VAP, and intrathecal administration is required for meningitis [91]. The advantage of adding a carbapenem (in the case of a non-carbapenem-susceptible strain) to colistin is unclear, and several experts prefer a combination showing synergistic activity in vitro (e.g., colistin plus rifampin). Fosfomycin shows variable in vitro activity against MDR/XDR P. aeruginosa strains and may be administered, mainly as part of a combination regimen, for systemic infections (4 g every 6 h). New drugs with activity against P. aeruginosa include ceftazidime/avibactam, in which avibactam is a non-β-lactam inhibitor of class A and class C β-lactamases, including AmpC of P. aeruginosa; the new aminoglycoside plazomicin; and the combination of the new cephalosporin ceftolozane with tazobactam, which is active against MDR and XDR P. aeruginosa strains and has completed phase 3 trials for the treatment of complicated intra-abdominal infections (cIAIs) and complicated urinary tract infections (cUTIs) [65, 90].

Acinetobacter baumannii

A. baumannii has attracted increasing attention because of its potential to cause severe infections and its ability to develop resistance to practically all available antimicrobials. Adequate empirical therapy of severe A. baumannii infections is crucial in terms of survival [92].

Empirical treatment of A. baumannii infections often represents a challenge; empirical coverage should be considered in cases of severe sepsis/septic shock and in centers with a prevalence of MDR A. baumannii greater than 25 % [93]. Traditionally, carbapenems have been the drugs of choice and remain the preferred antimicrobials for Acinetobacter infections in areas with high rates of susceptibility. Sulbactam is bactericidal against A. baumannii and represents a suitable alternative for strains susceptible to this agent; unfortunately, a steady increase in sulbactam resistance in A. baumannii has been observed [94]. Nowadays, polymyxins are the antimicrobials with the greatest in vitro activity against A. baumannii [95, 96]. However, their indiscriminate use may contribute to further selection of resistance and may also expose patients to unnecessary toxicity; careful selection of patients who should receive empirical treatment covering Acinetobacter is therefore essential. Colistin is the most widely used in clinical practice, although polymyxin B seems to be associated with less renal toxicity [97]. The recommended doses of these antimicrobials are shown in Table 5. Tigecycline, active in vitro against a wide range of Gram-negative bacilli including A. baumannii, is approved in Europe for the treatment of complicated skin structure infections and intra-abdominal infections. Although several meta-analyses have warned of an increased risk of death in patients receiving tigecycline compared with other antibiotics, particularly in HAP and VAP [98–100], a high-dose regimen (Table 5), usually in combination with another antimicrobial, may be a valid alternative for severe infections including A. baumannii pneumonia [75, 101].

Table 5 Recommended doses of antimicrobials for A. baumannii severe infections in patients with normal renal function

Although in vitro studies have demonstrated synergy between colistin and rifampin, a recent RCT showed no improvement in clinical outcomes with the colistin/rifampin combination, although better microbiological eradication was achieved [102]. Several in vitro studies have documented an unforeseen, potent synergism of colistin combined with a glycopeptide against carbapenem-resistant A. baumannii; nevertheless, the combination of colistin plus a glycopeptide for A. baumannii infections is currently discouraged [103, 104].

Methicillin-resistant Staphylococcus aureus (MRSA)

The main treatment options for MRSA infections in critically ill patients include the glycopeptides (vancomycin and teicoplanin), linezolid, and daptomycin; daptomycin is contraindicated for the treatment of pneumonia because it is inactivated by pulmonary surfactant. Alternative anti-MRSA agents are tigecycline, which carries a regulatory warning concerning a possible small, unexplained increase in mortality risk [105]; telavancin, whose associated warnings contraindicate its use notably in patients with renal failure [99]; and ceftaroline, dalbavancin, and oritavancin, for which evidence of efficacy in very severe infection is limited. Numerous meta-analyses have compared the efficacy of these agents in MRSA infection. Four are unusual in that they assessed all possible treatment options [106–109], although only one [107] examined all MRSA infection types (as opposed to MRSA cSSTIs only). The latter, a network meta-analysis, identified 24 RCTs (17 in cSSTI and 10 in HAP/VAP) comparing one of six antibiotics with vancomycin. In cSSTI, linezolid and ceftaroline were non-significantly more effective than vancomycin: linezolid ORs were 1.15 (0.74–1.71) and 1.01 (0.42–2.14), and ceftaroline ORs were 1.12 (0.78–1.64) and 1.59 (0.68–3.74), in the modified intention-to-treat (MITT) and MRSA m-MITT populations, respectively. For HAP/VAP, linezolid was non-significantly better than vancomycin, with ORs of 1.05 (0.72–1.57) and 1.32 (0.71–2.48) in the MITT and MRSA m-MITT populations, respectively. The ZEPHyR trial suggested clinical superiority of linezolid over vancomycin for proven MRSA nosocomial pneumonia, with higher rates of successful clinical response and an acceptable safety and tolerability profile. Microbiologic responses paralleled clinical outcomes, and MRSA clearance was 30 % greater with linezolid than with vancomycin; a difference of at least 20 % persisted until late follow-up, suggesting that linezolid may achieve more complete bacterial eradication [110].

Uncertainties surrounding the relative efficacy of vancomycin have been fuelled by reports of worse outcomes in patients with MRSA infection caused by strains with elevated MICs. A recent meta-analysis of S. aureus bacteremia studies failed to find an overall increased risk of death when comparing cases caused by S. aureus exhibiting high vancomycin MICs (at least 1.5 mg/L) with those due to low vancomycin MIC (less than 1.5 mg/L) strains [111]. However, the authors cautioned that an increased mortality risk cannot definitely be excluded, and it remains possible that specific MRSA strains/clones are associated with worse outcomes. Attempts to address elevated MICs, and so improve target attainment, by increasing vancomycin dosages are associated with more nephrotoxicity [113]. Finally, an outbreak of linezolid-resistant MRSA mediated by the cfr gene has been reported, associated with nosocomial transmission and extensive usage of linezolid [112].

Clostridium difficile

Severe C. difficile infection (CDI) is characterized by at least one of the following: white blood cell count greater than 15 × 10⁹/L, an acutely rising serum creatinine (i.e., greater than 50 % increase above baseline), a temperature greater than 38.5 °C, or abdominal or radiological evidence of severe colitis. There are currently two main treatment options for severe CDI: oral vancomycin 125 mg qds for 10–14 days, or fidaxomicin, which should be considered for patients with severe CDI at high risk of recurrence [114]; the latter include elderly patients with multiple comorbidities who are receiving concomitant antibiotics. Metronidazole monotherapy should be avoided in severe CDI because of increasing evidence that it is inferior to the alternatives discussed here [115]. In patients with severe CDI who are not responding to oral vancomycin 125 mg qds, options are oral fidaxomicin 200 mg bid, or high-dose oral vancomycin (up to 500 mg qds, if necessary administered via a nasogastric tube), with or without iv metronidazole 500 mg tds. The addition of oral rifampicin (300 mg bid) or iv immunoglobulin (400 mg/kg) may also be considered, but evidence of the efficacy of these approaches is lacking. There are case reports of tigecycline being used to treat severe CDI refractory to conventional treatment, but this is an unlicensed indication [116, 117].
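The "at least one criterion" definition of severe CDI given above can be expressed as a simple rule. The following minimal Python sketch encodes only the thresholds stated in the text; the function name, parameter names, and units are ours, and this is an illustration of the definition, not a clinical decision tool:

```python
def is_severe_cdi(wbc_per_L: float,
                  creatinine_now: float,
                  creatinine_baseline: float,
                  temperature_c: float,
                  severe_colitis_evidence: bool) -> bool:
    """Return True if at least one of the stated severity criteria is met.

    Thresholds taken from the text: WBC > 15 x 10^9/L, creatinine rise
    > 50 % above baseline, temperature > 38.5 degC, or abdominal/
    radiological evidence of severe colitis.
    """
    return any([
        wbc_per_L > 15e9,
        creatinine_now > 1.5 * creatinine_baseline,
        temperature_c > 38.5,
        severe_colitis_evidence,
    ])
```

Each criterion is independently sufficient, which is why a single `any` over the four conditions captures the definition.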

In life-threatening CDI (i.e., hypotension, partial or complete ileus, or toxic megacolon), oral vancomycin up to 500 mg qds for 10–14 days via a nasogastric tube (which is then clamped for 1 h) and/or rectal instillation of vancomycin enemas, plus iv metronidazole 500 mg tds, are used [118], but the evidence base in such cases is poor. These patients require close monitoring with specialist surgical input, and should have their blood lactate measured. Colectomy should be considered if caecal dilatation exceeds 10 cm, or in the case of perforation or septic shock, and is best performed before blood lactate rises above 5 mmol/L, beyond which survival is extremely poor [119]. A recent systematic review concluded that total colectomy with end ileostomy is the preferred surgical procedure; other procedures are associated with high rates of re-operation and mortality, although less extensive surgery may have a role in selected patients with earlier-stage disease [120]. An alternative approach, diverting loop ileostomy with colonic lavage, has been reported to be associated with reduced morbidity and mortality [121].
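The surgical decision points quoted above (caecal dilatation over 10 cm, perforation or septic shock, and the 5 mmol/L lactate threshold) can likewise be written as simple predicates. This sketch is purely illustrative; the names and structure are ours and do not come from the cited guidance:

```python
def consider_colectomy(caecal_dilatation_cm: float,
                       perforation: bool,
                       septic_shock: bool) -> bool:
    """Colectomy should be considered if any stated criterion is met:
    caecal dilatation > 10 cm, perforation, or septic shock."""
    return caecal_dilatation_cm > 10.0 or perforation or septic_shock


def colectomy_window_open(lactate_mmol_per_L: float) -> bool:
    """Surgery is best performed before blood lactate rises above
    5 mmol/L, beyond which survival is reported to be extremely poor."""
    return lactate_mmol_per_L <= 5.0
```

Separating the indication check from the lactate timing check mirrors the text: the former says whether to consider surgery at all, the latter whether the window for a good outcome is still open.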

Conclusions

Current clinical practice for critically ill patients is severely challenged by the emergence of multidrug resistance among commonly encountered pathogens. The outlook is more optimistic for Gram-positive pathogens (including C. difficile), for which the pipeline is more promising; however, the recently launched anti-MRSA agents have not been extensively investigated in critically ill populations. In the field of Gram-negative MDR infections there is great concern about the therapeutic future, as only a handful of the upcoming agents will address the unmet medical needs. Combinations of β-lactams with β-lactamase inhibitors seem promising against Gram-negative MDR pathogens, but their real clinical utility will be known only when the results of large clinical trials are available. Currently, the most effective approach is PK/PD optimization of the available antibiotics, particularly given the increasing awareness of the pharmacokinetic alterations that occur in the critically ill. Combination treatment seems important, at least in the empirical phase, to ensure adequate coverage and improve clinical outcome; however, randomized clinical trials are urgently needed to define the possible benefit of combinations in various settings. Most importantly, infection control measures and prompt diagnostics are the cornerstones of preventing further transmission of MDR and XDR pathogens in healthcare settings and of optimizing early antimicrobial treatment.