Acute Respiratory Failure

Acute lung injury

Acute lung injury (ALI) and acute respiratory distress syndrome (ARDS) have been the subjects of several important studies which address the issues of biological markers, respiratory monitoring tools to guide ventilatory settings, and risk factors for developing ARDS.

Fibroproliferation in ARDS seems to have important prognostic implications. Because fibroproliferation markers, such as procollagen I, predict mortality in patients with ALI and ARDS, Budinger et al. examined whether bronchoalveolar lavage fluid (BALF) from patients with lung injury contains mediators that activate the procollagen-I promoter, and whether this activation predicts important clinical outcomes [1]. BALF was also collected from patients with pulmonary fibrosis and cardiogenic pulmonary edema. Active TGF-beta 1 levels in BALF were measured in 29 ARDS patients; nine negative and six positive controls were also enrolled. The BALF from ARDS patients induced 41% greater procollagen-I promoter activation than that from negative controls (p < 0.05), and a TGF-beta 1-blocking antibody significantly reduced this activation in the BALF from ARDS patients. The authors conclude that BALF from ALI/ARDS patients activates the procollagen-I promoter, partly through TGF-beta 1. Activated TGF-beta 1 may affect ARDS outcome independently of its effect on procollagen-I activation.

The role of the pressure-volume (PV) curve in guiding ventilatory settings is still debated. In particular, the expiratory part of the loop could be more useful than the inspiratory part, because PEEP is indeed an expiratory setting. Albaiceta et al. studied the effects of two levels of positive end-expiratory pressure (PEEP), one set 2 cmH2O above the lower inflection point of the inspiratory limb (15 ± 3 cmH2O) and one set at the point of maximum curvature of the expiratory limb of the PV curve (23.5 ± 4 cmH2O), on gas exchange, respiratory mechanics, and lung aeration in eight patients with early acute lung injury [2]. PEEP set at the expiratory point of maximum curvature induced an improvement in oxygenation, an increase in normally aerated and a decrease in nonaerated lung volumes, and greater alveolar stability. There was also an increase in PaCO2, airway pressures, and hyperaerated lung volume; therefore, high PEEP levels set according to the point of maximum curvature of the deflation limb of the PV curve have both benefits and drawbacks.
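To make the geometric definition concrete, the sketch below locates a point of maximum curvature numerically on a normalized deflation limb. The pressure-volume pairs, the normalization, and the restriction of the search to pressures above the steepest segment are illustrative assumptions, not the method used by Albaiceta et al.

```python
import numpy as np

# Hypothetical deflation-limb data (a real curve would come from a low-flow,
# quasi-static maneuver). Pressure in cmH2O, volume in ml above relaxation volume.
P = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
V = np.array([250, 350, 600, 1000, 1400, 1650, 1780, 1850, 1900], dtype=float)

# Normalize both axes first: curvature is unit dependent.
p, v = P / P.max(), V / V.max()

dv = np.gradient(v, p)                      # normalized slope (compliance)
d2v = np.gradient(dv, p)
curvature = np.abs(d2v) / (1 + dv**2) ** 1.5

# Simple heuristic: look for the bend above the steepest (maximal-compliance) segment,
# which is where derecruitment begins as pressure is lowered on the deflation limb.
steepest = int(np.argmax(dv))
knee = steepest + int(np.argmax(curvature[steepest:]))
print(f"Point of maximum curvature ~ {P[knee]:.0f} cmH2O")
```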

Another potential bedside monitoring technique is electrical impedance tomography (EIT). Riedel et al. applied functional EIT to measure relative impedance changes of lung tissue during tidal breathing and to create images of the local distribution of ventilation [3]. They examined changes of body position (supine, prone, left and right lateral) during both spontaneous breathing and positive pressure support ventilation, an approach that allows interindividual comparison. The article by Riedel et al. [3] is accompanied by an editorial by Calzia et al., who emphasize that this technique has now reached a level of robustness and user-friendliness that permits bedside application, even in an ICU setting [4].
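As a rough illustration of what "functional" EIT means, the sketch below builds a per-pixel tidal-variation image from a stack of impedance frames; the synthetic data, grid size, and regional index are assumptions for illustration, not the reconstruction or analysis used by Riedel et al.

```python
import numpy as np

# Synthetic stack of reconstructed relative-impedance frames: 100 frames of a
# 32 x 32 grid, with a sinusoidal "breathing" signal whose amplitude decreases
# from the ventral (top) rows to the dorsal (bottom) rows.
t = np.arange(100) / 20.0                                  # 20 frames/s for 5 s
breath = np.sin(2 * np.pi * 0.25 * t)                      # ~15 breaths/min
weights = np.tile(np.linspace(1.0, 0.3, 32)[:, None], (1, 32))
frames = breath[:, None, None] * weights[None, :, :]       # shape (100, 32, 32)

# Functional EIT image: per-pixel tidal impedance variation (end-inspiratory
# minus end-expiratory value), proportional to the local change in air content.
tidal_image = frames.max(axis=0) - frames.min(axis=0)

# A crude regional index: share of the total tidal variation in the ventral half.
ventral_share = tidal_image[:16].sum() / tidal_image.sum()
print(f"Ventral share of tidal variation: {ventral_share:.2f}")
```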

Two articles have explored the physiological effects of the prone position and how they could explain its clinical benefits. Vieillard-Baron et al. studied the improvement provided by the prone position in terms of mechanics and alveolar ventilation in ARDS [5]. They found that the prone position improves the homogenization of tidal ventilation by reducing time-constant inequalities, thus improving alveolar ventilation. They had previously reported in patients with ARDS that these inequalities are responsible for the presence of a "slow compartment," which is excluded from tidal ventilation at the respiratory rate used during ventilatory support. In 11 ARDS patients treated by ventilation in the prone position because of a major oxygenation impairment (PaO2/FIO2 < 100 mmHg), the prone position significantly reduced the expiratory time constant from 1.98 ± 0.53 s at baseline with ZEEP to 1.53 ± 0.34 s, and significantly decreased PaCO2 from 55 ± 11 mmHg at baseline with ZEEP to 50 ± 7 mmHg.
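To see why a shorter expiratory time constant improves lung emptying, the sketch below applies the usual single-compartment model of passive expiration; the time constants are those reported in the study, while the available expiratory time is a hypothetical value.

```python
import numpy as np

# Single-compartment model: during passive expiration the lung empties as
# V(t) = V0 * exp(-t / tau), with tau = resistance x compliance.
tau_supine, tau_prone = 1.98, 1.53   # expiratory time constants (s) reported in the study
t_exp = 2.0                          # hypothetical expiratory time available per breath (s)

for label, tau in [("supine", tau_supine), ("prone", tau_prone)]:
    exhaled = 1 - np.exp(-t_exp / tau)   # fraction of the inspired volume exhaled
    print(f"{label}: {exhaled:.0%} exhaled within {t_exp} s")
```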

In an editorial, Koutsoukou noted that dynamic hyperinflation and PEEPi were presumably due to tidal expiratory flow limitation, with sequential dynamic compression of the peripheral airways during expiration and consequent inhomogeneous regional lung emptying [6]. The reduction of dynamic hyperinflation and PEEPi with proning was probably due largely to abolition or reduction of the extent of tidal flow limitation. It should be stressed that tidal flow limitation is a risk factor for low-volume lung injury during mechanical ventilation.

Finally, Mentzelopoulos et al. studied static pressure-volume curves and body posture in 13 patients with early ARDS [7]. The prone position, compared with the preprone semirecumbent position, resulted in a significantly reduced pressure at the lower inflection point of the lung PV curve (2.2 ± 0.2 vs 3.7 ± 0.5 cmH2O) and an increased volume at the upper inflection point (0.87 ± 0.03 vs 0.69 ± 0.05). The postural reduction in lower inflection point pressure of the lung PV curve was the sole independent predictor of the pronation-induced increase in PaO2/FIO2, which was also significantly related to the increase in functional residual capacity.

Moran et al. provided a detailed and complex meta-analysis of controlled trials of ventilator therapy in acute lung injury and ARDS, a subject of considerable debate in the literature [8]. The overall treatment-effect estimate favored protective ventilation but did not achieve statistical significance. The benefit of protective ventilation depended upon threshold levels of tidal volume, plateau pressure, and plateau-pressure difference. In particular, the treatment effect favored protective ventilation for a tidal volume < 7.7 ml/kg predicted body weight (treatment group) and a mean plateau pressure of 30 cmH2O or higher (control group), but was not influenced by a plateau pressure of 21–30 cmH2O (treatment group), and it depended on a plateau-pressure difference greater than 5–7 cmH2O between protective and standard ventilation. This interesting result supports the view that a threshold in plateau pressure may exist as a risk factor for ventilator-induced lung injury (VILI).
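Tidal-volume thresholds of this kind are expressed per kilogram of predicted body weight; as a reminder of the usual conversion, the sketch below applies the ARDS Network predicted-body-weight formula to a hypothetical patient and relates it to the absolute 700-ml threshold discussed in the next study.

```python
def predicted_body_weight(height_cm: float, male: bool) -> float:
    """ARDS Network predicted body weight (kg), the usual reference for ml/kg targets."""
    return (50.0 if male else 45.5) + 0.91 * (height_cm - 152.4)

# Hypothetical patient: a 175-cm man ventilated with an absolute tidal volume of 700 ml
# (the threshold discussed in the Gajic et al. study below).
pbw = predicted_body_weight(175, male=True)
print(f"PBW = {pbw:.1f} kg, 700 ml = {700 / pbw:.1f} ml/kg PBW")
```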

Gajic et al. published an important observation suggesting that initial ventilatory settings are a risk factor for the secondary development of ARDS in mechanically ventilated patients. They tested this hypothesis in a large sample of patients prospectively enrolled in a multicenter study on mechanical ventilation [9, 10]. Of 3261 mechanically ventilated patients who did not have ARDS at the outset, 205 (6.2%) developed ARDS 48 h or more after the onset of mechanical ventilation. Multivariate logistic regression analysis adjusted for baseline patient characteristics found the development of ARDS to be associated with a high tidal volume (odds ratio 2.6 for tidal volume > 700 ml), high peak airway pressure (odds ratio 1.6 for peak airway pressure > 30 cmH2O), and high positive end-expiratory pressure (odds ratio 1.7 for end-expiratory pressure > 5 cmH2O) as initial ventilator settings. These provocative findings were commented upon in an editorial by Bonetto et al. [11]. They remind us that the theoretical basis of VILI relies on the seminal study by Mead and coworkers [12], who examined the intrapulmonary distribution of pressure in a lung model that included normal and collapsed alveoli. These data led to the concept that all the pathophysiological characteristics of ARDS (ventilation-perfusion mismatch and reduced compliance, lung edema, atelectasis, pulmonary inflammation) may be worsened by inappropriate ventilator settings because of the nonhomogeneous distribution of normal lung regions mixed with consolidated, atelectatic regions and regions that can be recruited/de-recruited depending on the particular ventilatory strategy used. Normal, homogeneous lungs, in contrast, should not be affected by ventilator settings that are otherwise injurious for nonhomogeneous lungs. Two conditions may therefore be required for the stress of mechanical ventilation to induce a relevant inflammatory stimulus: (a) mechanical conditions determining the activation of mechanical forces such as shear stress and stress failure; and (b) the presence of a primary inflammatory stimulus represented by the underlying causes of ARDS, cardiopulmonary bypass, or other ischemia-reperfusion conditions such as lung transplantation. Did the lungs of patients included in the Gajic et al. study have both the mechanical conditions and the primary inflammatory stimulus theoretically required to develop VILI? The PaO2/FIO2 ratio was lower and the incidence of pneumonia greater in the group that developed ARDS 48 h after ICU admission than in the group that did not. By contrast, the data from Wrigge et al. suggest that ventilation with lower VT had no or only a minor effect on systemic and pulmonary inflammatory responses in 44 patients with healthy lungs after uncomplicated cardiopulmonary bypass surgery [13]. Their data, based on serum tumor necrosis factor (TNF)-α, interleukin (IL) 6, and IL-8, do not suggest a clinical benefit of using low-VT ventilation in these selected patients.

Modes of ventilation

A helium-oxygen mixture has been shown to offer interesting properties in asthmatic patients or during noninvasive ventilation (NIV). Tassaux et al. studied the effects of helium-oxygen (He/O2) in ten intubated chronic obstructive pulmonary disease (COPD) patients during pressure support ventilation (PSV). The use of He/O2 decreased intrinsic PEEP and the number of ineffective breaths, and also reduced the magnitude of inspiratory effort and work of breathing [14]. He/O2 could prove useful in such patients with weaning difficulties.

Pressure support ventilation is a widely used ventilatory technique, but it has the potential disadvantage of providing an inadequate tidal volume in the case of a sudden increase in resistance or elastance. Volume support ventilation (VSV) is an alternative mode in which the pressure support level is continuously adjusted to deliver a preset tidal volume; it was developed by manufacturers in an attempt to minimize the drop in tidal volume observed during PSV. Jaber et al., in a randomized cross-over study performed in ten intubated patients, compared the two modes of ventilation in terms of patient behavior and ventilatory response when ventilatory demand was abruptly increased by the addition of dead space to the circuit [15]. As a result of the augmented demand, minute ventilation and PaCO2 increased similarly in both modes, but all the indices of respiratory effort were significantly higher during VSV than during PSV, and two patients showed signs of overt respiratory distress. This "paradoxical" response during VSV was caused by a decrease in the inspiratory assistance delivered, from 15 ± 6 to 9 ± 5 cmH2O, whereas no change occurred with PSV. The authors concluded that with a dual-control mode responsive to tidal volume, such as VSV, an increase in ventilatory demand leads to a decrease in the level of pressure support, thereby augmenting the inspiratory effort compared with PSV.
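The mechanism behind this paradoxical response can be made concrete with a toy breath-by-breath adjustment rule, sketched below; the proportional correction, gain, and pressure limits are assumptions for illustration and do not reproduce any manufacturer's algorithm.

```python
# Toy breath-by-breath rule for a dual-control mode targeting a preset tidal volume.
# The proportional gain and pressure limits are illustrative assumptions.
def next_pressure_support(ps_cmh2o, vt_measured_ml, vt_target_ml=500,
                          gain=0.5, ps_min=2.0, ps_max=30.0):
    # If the delivered tidal volume exceeds the target (e.g., because the patient's
    # own effort has increased), assistance is reduced on the next breath, and vice versa.
    error = (vt_target_ml - vt_measured_ml) / vt_target_ml
    return min(ps_max, max(ps_min, ps_cmh2o * (1 + gain * error)))

ps = 15.0
for vt in [500, 600, 700, 750]:        # rising tidal volumes as ventilatory demand increases
    ps = next_pressure_support(ps, vt)
    print(f"measured VT {vt} ml -> next pressure support {ps:.1f} cmH2O")
```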

It is well known that meals represent a stressful activity for patients affected by severe chronic diseases, such as those recovering from acute stroke or COPD patients with chronic respiratory failure. Vitacca et al. investigated, in a physiological study, the effects of meals in a group of 16 difficult-to-wean tracheotomized COPD patients, during totally unsupported breathing or PSV [16]. At the time of the study the patients were able to sustain periods of spontaneous breathing averaging about 4 h per day. The authors observed a significant, even though clinically irrelevant, increase from baseline in respiratory and heart rates and end-tidal CO2 during the unsupported trial, both during meals and 30 min after meals. Compared with spontaneous breathing, the benefit of PSV was attributed to a greater tidal volume, lower respiratory and pulse rates, and a decreased rapid-shallow breathing index (respiratory rate/tidal volume), although this latter index was > 105 in only four patients. Indeed, PSV prevented the significant increase in dyspnea score that was recorded during the meal with unsupported breathing. This study confirms that meals are associated with respiratory distress in critically ill patients, although to a lesser extent than previously suspected. These physiological changes are minimized by the application of PSV.
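For reference, the rapid-shallow breathing index mentioned above is simply the respiratory rate divided by the tidal volume expressed in liters; a minimal sketch with hypothetical values:

```python
# Rapid-shallow breathing index: respiratory rate divided by tidal volume in liters;
# values above ~105 breaths/min per liter are the classic marker of a rapid, shallow pattern.
def rsbi(rate_bpm: float, tidal_volume_ml: float) -> float:
    return rate_bpm / (tidal_volume_ml / 1000.0)

# Hypothetical values during an unsupported meal vs during pressure support
print(f"unsupported: RSBI = {rsbi(32, 290):.0f}")   # ~110, above the threshold
print(f"with PSV:    RSBI = {rsbi(24, 380):.0f}")   # ~63, well below it
```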

Noninvasive ventilation

Noninvasive ventilation (NIV) has been recommended in patients with hypercapnic acute respiratory failure (ARF) to prevent endotracheal intubation and to reduce the rate of complications and mortality. Interestingly, the large majority of patients enrolled in previous studies during an episode of hypercapnic ARF had COPD. Phua et al. expanded our knowledge of this problem, comparing in a prospective study the effectiveness of NIV and the risk factors for its failure in 43 patients with COPD vs 68 non-COPD patients, all during an episode of hypercapnic ARF [17]. The non-COPD group included patients with pneumonia, neuromuscular disorders, pulmonary edema, bronchiectasis, sepsis, and asthma. After controlling for baseline severity, NIV was significantly more effective in the COPD group than in the non-COPD group, the need for endotracheal intubation being 19% and 47%, respectively. A high APACHE II score, the presence of pneumonia, a rapid heart rate, and a high PaCO2 after 1 h were predictors of NIV failure in the non-COPD group, whereas the sole predictor in the COPD group was a high APACHE II score. The ICU and hospital mortality rates were significantly higher in the non-COPD group, with failure of NIV being the only independent predictor of mortality. NIV should therefore be attempted with great caution if an episode of hypercapnic ARF occurs in non-COPD patients.

Both noninvasive PSV (nPSV) and continuous positive airway pressure (CPAP) have been considered very effective treatments of ARF due to cardiogenic pulmonary edema (CPE). Some studies have suggested that nPSV is superior to CPAP in unloading the inspiratory muscles, so that the former seems more indicated than CPAP in patients with CPE showing signs of ventilatory pump failure (i.e., hypercapnia). Bellone et al., in a randomized controlled trial, compared the efficacy of nPSV (n = 18) and CPAP (n = 18) for the treatment of hypercapnic CPE (i.e., PaCO2 > 45 mmHg) [18]. There was no difference in resolution time, defined as clinical improvement with a respiratory rate < 30 breaths/min and SaO2 > 96%. No significant differences in arterial blood gases, endotracheal intubation, in-hospital mortality, or myocardial infarction rate existed between the two ventilatory modalities. The authors concluded that CPAP and nPSV are equally effective in the treatment of hypercapnic CPE. That article was followed by an accompanying editorial by Mehta and Nava, in which the authors highlight the questions that remain unanswered about the use of nPSV or CPAP during an episode of CPE [19]. Since CPAP is not a "true" ventilatory mode, because it does not actively assist inspiration, its comparison with a ventilatory mode such as nPSV is difficult unless the expiratory pressure is titrated similarly during nPSV and CPAP. Indeed, we still need to identify whether there is a subset of patients who benefit more from nPSV than from CPAP, and to address the question of whether any form of noninvasive ventilation (i.e., nPSV or CPAP) should be considered first-line treatment for CPE. It is also important to know how many patients with CPE actually require ventilatory assistance and how many can be successfully managed with standard medical therapy alone.

Hemodynamics

Hemodynamic and oxygen-derived parameters in septic shock

In recent guidelines on septic shock the target value of mean arterial pressure (MAP) varies from 65 to 75 mmHg. Two assumptions explain this choice: firstly, 65 mmHg is the threshold value below which the autoregulation of blood flow to vital organs ceases; secondly, while the optimal level of MAP might depend on the patient's previous blood pressure and vessel status, it has not been demonstrated that increasing MAP above 65 mmHg improves organ function and survival. A target value of central venous oxygen saturation (ScvO2), which is considered an indicator of the adequacy of whole-body tissue perfusion, has also been used in goal-oriented therapy during septic shock. Rivers et al. demonstrated that targeting this oxygen-derived parameter above 70% during the first 6 h results in a significant reduction in mortality in septic shock in the emergency department [20].

To try to confirm this recommendation, Varpula et al. sought to demonstrate, in a retrospective study of 111 patients suffering from septic shock, that MAP > 65 mmHg and mixed venous oxygen saturation (Sv̄O2) > 70% are the most predictive threshold levels with regard to mortality [21]. They performed univariate analysis and forward stepwise logistic regression analysis with 30-day mortality as the primary end point. They demonstrated that the most important hemodynamic variables predicting 30-day outcome in septic shock were MAP and lactate for the first 6 h and MAP, Sv̄O2, and central venous pressure for the first 48 h. Their results also showed that the best predictive threshold level for 30-day mortality was 65 mmHg for MAP and 70% for Sv̄O2. For this latter parameter Varpula et al. could not demonstrate that the target goal for Sv̄O2 should be lower than for ScvO2, even though, from a pathophysiological point of view, Sv̄O2 is often lower than ScvO2 in critically ill patients. These results, which obviously need further investigation, support the guidelines that aim to maintain MAP > 65 mmHg and ScvO2 > 70% in patients with septic shock.

To assess the hemodynamic status of critically ill patients, and particularly of patients suffering from septic shock, monitoring of cardiac output is of paramount importance. For a long time, the thermodilution technique using a flow-directed pulmonary artery catheter (PAC) has been the gold standard. More recently, noninvasive cardiovascular techniques (echocardiography, Doppler, esophageal Doppler) have been developed to avoid the need for invasive techniques. Unfortunately, these technologies are not always available, for example in the emergency department. The mixed venous-arterial PCO2 difference has been suggested as a surrogate of cardiac output, since mixed venous PCO2 depends on circulatory flow; thus, an increase in the mixed venous-arterial PCO2 difference reflects decreased flow. The need for a PAC to determine mixed venous PCO2, a catheter which in any case already gives access to cardiac output, limits the clinical utility of this alternative method.
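The reasoning rests on a Fick-type relationship for CO2: at a given CO2 production, the venous-arterial PCO2 gap widens as cardiac output falls. A minimal sketch, assuming an approximately linear CO2 content-partial pressure relationship with an order-of-magnitude conversion constant:

```python
# Fick-type relationship for CO2: VCO2 ~ CO x k x (PvCO2 - PaCO2), where k converts
# the partial-pressure gap to a content difference (assumed roughly linear here,
# with k of about 7 ml CO2 per liter of blood per mmHg, an order-of-magnitude assumption).
def pco2_gap(vco2_ml_min: float, cardiac_output_l_min: float, k: float = 7.0) -> float:
    return vco2_ml_min / (cardiac_output_l_min * k)

for co in [6.0, 4.0, 2.5]:   # hypothetical cardiac outputs at a fixed VCO2 of 200 ml/min
    print(f"CO {co} L/min -> venous-arterial PCO2 gap ~ {pco2_gap(200, co):.1f} mmHg")
```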

Since central venous access is widely used in critically ill patients, Cuschieri et al. investigated the potential correlation between the central venous-to-arterial PCO2 difference and the cardiac index [22]. In 83 critically ill patients equipped with a PAC and a central venous catheter, the authors demonstrated (a) good agreement between the central and mixed venous-arterial PCO2 differences, and (b) a significant inverse correlation between cardiac index and the mixed (R2 = 0.90) or central (R2 = 0.89) venous-arterial PCO2 difference. The article by Cuschieri et al. confirms recent publications suggesting that central venous oxygen-derived parameters can be used instead of mixed venous oxygen-derived parameters determined in the pulmonary circulation. Cuschieri et al. [22] indicated that this substitution provides an alternative means of estimating cardiac output and of approaching global tissue hypoxia. The substitution of central venous PCO2 for mixed venous PCO2 in the ratio of the venous-arterial PCO2 difference to the arteriovenous oxygen content difference has previously been reported to correlate with lactic acid elevation.

It is widely assumed, however, that a global hemodynamic and oxygenation approach in critically ill patients fails to provide adequate early information on tissue perfusion. Noninvasive monitoring of peripheral perfusion could be a complementary approach, serving as a warning of imminent global tissue hypoxia. Near-infrared spectroscopy (NIRS) offers a technique for continuous, noninvasive, bedside monitoring of tissue oxygenation and for investigating microvascular function. The causes of microcirculatory dysfunction during sepsis and septic shock are multiple. Sepsis increases the proportion of nonperfused capillaries and systemic vascular permeability. It also decreases tissue oxygen diffusion and cellular metabolism. Lastly, sepsis may disrupt local autoregulation. De Blasi et al. designed their study in 26 patients (13 with septic shock and 13 nonseptic postsurgical patients) to investigate microvascular function, regulation, oxygenation, and cellular metabolism in the human brachioradial muscle during subacute septic shock using NIRS [23]. These authors demonstrated that, in septic shock, tissue blood volume increases, microvascular compliance diminishes, and post-ischemic reperfusion time lengthens. They also demonstrated a decrease in muscle oxygen consumption. All these results confirm the well-known alteration in the microcirculation and the maldistribution of local blood flow during septic shock; thus, despite its limitations, the potential to monitor regional perfusion and oxygenation noninvasively at the bedside makes the clinical application of NIRS technology of particular value in critically ill patients.

Heart and lung interaction

Lung recruitment maneuvers (LRM) are used in critically ill patients undergoing mechanical ventilation to prevent the formation of atelectasis, to improve oxygenation, and to augment end-expiratory lung volume. Increasing positive pressure ventilation raises intrathoracic pressure, which can compromise hemodynamics by influencing preload, afterload, and contractility of the ventricles. As previously demonstrated by Jardin et al. [24], the most important hemodynamic consequence of a high level of positive pressure ventilation is an increase in right-ventricular afterload due to the marked rise in transpulmonary pressure, which becomes higher than the pulmonary venous pressure. These authors nicely demonstrated 25 years ago, with transthoracic echocardiography, that the use of positive end-expiratory pressure > 12 cmH2O produced right-ventricular pressure overload and leftward septal shift [24]. The study reported by Nielsen et al. confirmed these findings and suggested that LRM could be associated with derecruitment of the pulmonary circulation [25]. They designed a prospective randomized cross-over study in ten adults undergoing coronary artery bypass surgery, who received two LRMs of different durations (40 cmH2O held for 10 s and for 20 s) in randomized order, separated by 5 min. Based on exhaustive cardiovascular monitoring (PAC, PiCCO monitor, and transesophageal echocardiography), the LRM reduced cardiac output by > 50% and left ventricular end-diastolic area by 45%. Moreover, analysis of the evolution of the eccentricity index of the left ventricle indicated that the reduction in left ventricular end-diastolic area was due to an acute right ventricular overload produced by the LRM. Despite the excellent design of this study, its clinical application remains to be determined. Firstly, it is not known whether iterative short periods of low cardiac output have any long-term effects. Secondly, the hemodynamic consequences of LRM were observed in patients with healthy lungs and normal compliance ventilated with relatively large tidal volumes (8–12 ml/kg). In ARDS the consequences of LRM could be different; however, in hemodynamically unstable patients with inadequate volume loading, this potential side effect of LRM could supervene. In any case, the indications for these maneuvers should be strict and their application carefully monitored.

Fluid responsiveness

Only half of patients with acute circulatory failure increase their left ventricular stroke volume in response to volume expansion; therefore, predicting preload responsiveness is a very important issue in intensive care patients. Recent studies demonstrated that, in patients fully adapted to mechanical ventilation, the respiratory variation in surrogates of left ventricular stroke volume, such as pulse pressure variation, is a reliable predictor of cardiac preload responsiveness. Esophageal Doppler, by measuring peak aortic velocity in the descending thoracic aorta, allows a correct estimation of aortic blood flow and therefore of cardiac output. Using this technique in 38 mechanically ventilated patients in sinus rhythm and without spontaneous breathing activity, Monnet et al. tested whether a threshold value of the respiratory variation in peak aortic velocity and in aortic blood flow provides a good prediction of fluid responsiveness [26]. The esophageal Doppler device used by Monnet et al. (Hemosonic 100, Arrow) enables continuous measurement of peak aortic velocity and thoracic aorta diameter and the calculation of aortic blood flow. The authors demonstrated that a respiratory variation in aortic blood flow of at least 18% before volume expansion predicted fluid responsiveness with a sensitivity of 90% and a specificity of 94%. They also reported that the area under the receiver operating characteristic curve generated for the variation in aortic blood flow was greater than that generated for corrected flow time. Obviously, this latter parameter, which is the duration of the aortic velocity signal corrected for heart rate, is a static indicator of cardiac preload and thus suffers from a lack of sensitivity. These important results confirm those obtained under experimental conditions with the same technique, as well as the findings in critically ill patients obtained with conventional Doppler echocardiography measuring blood velocity at the level of the aortic annulus.
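These dynamic indices share the same construction: the maximum-minimum swing of the variable over one ventilator cycle, normalized by the mean of the two extremes. A minimal sketch with hypothetical beat-by-beat values:

```python
# Common construction of the dynamic indices: the max-min swing over one
# ventilator cycle, normalized by the mean of the two extremes.
def respiratory_variation(values) -> float:
    vmax, vmin = max(values), min(values)
    return 100.0 * (vmax - vmin) / ((vmax + vmin) / 2.0)

aortic_blood_flow = [4.2, 4.6, 5.1, 4.8, 4.1]   # hypothetical beat-by-beat values, L/min
pulse_pressure = [46, 52, 58, 54, 44]           # hypothetical beat-by-beat values, mmHg

print(f"ABF variation = {respiratory_variation(aortic_blood_flow):.0f}%")  # >= 18% -> responder
print(f"PP variation  = {respiratory_variation(pulse_pressure):.0f}%")
```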

The vast majority of the patients included in the studies aimed at testing the accuracy of the dynamic indices of preload to adequately predict fluid responsiveness have been ventilated with large tidal volume (> 8 ml/kg), e.g., 8 ± 2 ml/kg in the study of Monnet et al. [26].

De Backer et al. [27] designed a study in 60 mechanically ventilated patients who required a fluid challenge to determine the predictive value for fluid responsiveness of pulse pressure variation in patients ventilated with different tidal volumes. They also investigated whether a lower pulse pressure variation threshold should be used when patients are ventilated with low tidal volumes. They confirmed that pulse pressure variation is a good predictor of fluid responsiveness when patients are ventilated with a tidal volume ≥ 8 ml/kg. By contrast, their results suggested that in patients ventilated with low tidal volumes (< 8 ml/kg) pulse pressure variation was no longer better than static indices of preload; however, in their editorial, Teboul and Vieillard-Baron considered that, despite the great value of the data reported by De Backer et al., they provide insufficient support for a definite conclusion [28]. Teboul and Vieillard-Baron noted that low tidal volumes are recommended in ARDS patients and not in patients with normal lungs; thus, in ARDS patients, the respiratory changes in transpulmonary pressure and the cyclic changes in intrathoracic pressure should still be large enough for pulse pressure variation to retain its sensitivity to predict fluid responsiveness. They also pointed out some methodological limitations (choice of the cut-off value for the increase in cardiac output, potential errors in thermodilution cardiac output measurement, lack of software to minimize measurement errors), which preclude a definitive answer, particularly in patients suffering from ARDS.

As is now widely accepted, the dynamic indices based on heart-lung interaction cannot be used in patients with spontaneous breathing activity or arrhythmias. Passive leg raising has been suggested to predict fluid responsiveness in spontaneously breathing patients. Vallee et al. reported another original approach to testing fluid responsiveness, which can be used in all patients who respond to a first fluid challenge, whatever their respiratory or cardiovascular status [29]. These authors suggested taking into account the result of the first fluid challenge to predict the response to a second one in patients who respond to the first. They considered that the potential deleterious effect of the first fluid challenge is marginal, and that in practice the question is the potential need for iterative small fluid challenges. Using esophageal Doppler, the authors demonstrated that an increase of 11% in stroke output index during volume expansion discriminated between responders and nonresponders to subsequent volume expansion with very high sensitivity and specificity. They chose the stroke output index (the ratio between stroke volume and corrected flow time) rather than the stroke volume index because they considered it more physiologically relevant to follow the variations of an output than the variation of a volume or a static parameter to guide fluid administration. It remains to be tested whether the use of these indices could obviate iterative measurements of pulse pressure variation or the repetition of passive leg-raising maneuvers in hemodynamically unstable patients.

Vasoactive agents and septic shock

When adequate fluid resuscitation fails to maintain arterial pressure, the administration of vasopressors is required. Norepinephrine has demonstrated some advantages over the other vasopressors: it is more potent than dopamine, and, in contrast with ephedrine and phenylephrine, it does not alter the splanchnic circulation. Lastly, norepinephrine increases cardiac output more than vasopressin.

Natalini et al. compared the hemodynamic effects of norepinephrine (alpha-beta receptor agonist) to those of metaraminol (alpha agonist, able to indirectly stimulate norepinephrine release) in an open-label controlled clinical trial in ten septic shock patients equipped with a PAC [30]. Both drugs were titrated to maintain MAP ≥ 65 mmHg. The replacement of norepinephrine with metaraminol had no significant effect on hemodynamic and oxygen-derived parameters. Cardiac output, heart rate, pulmonary artery occlusion pressure, and oxygen consumption were not significantly different. That study suffered from limitations since the order of administration of the two vasoconstrictors was not randomized, the duration of administration of metaraminol was short, and, lastly, the consequences of these two vasoconstrictors on the regional hemodynamics were not assessed. Thus, this study could not demonstrate that metaraminol could serve as an alternative in refractory septic shock resistant to norepinephrine.

Myocardial dysfunction may supervene in about one-third of septic shock patients. Despite some controversy, the use of dobutamine is now widely recommended, but increasing doses of dobutamine induce arrhythmias and a marked increase in myocardial oxygen consumption. The use of an inodilator, levosimendan, which increases the force of cardiac contraction without enhancing the influx of calcium into the cytosol, has been recommended. Levosimendan sensitizes troponin C to calcium and thus improves cardiac contraction at a low energy cost.

Morelli et al. compared, in a prospective randomized controlled study, the global and regional hemodynamic effects of dobutamine (5 µg/kg per min) and levosimendan (0.2 µg/kg per min) in 28 septic shock patients previously treated with dobutamine [31]. Twenty-four hours after the onset of the infusion, levosimendan maintained the same MAP as dobutamine but significantly increased cardiac output and decreased pulmonary artery occlusion pressure. Compared with dobutamine, levosimendan increased gastric mucosal flow (assessed by gastric laser Doppler), urinary output, and creatinine clearance, while plasma lactate concentration decreased significantly. This study, conducted in "real life," suggests that levosimendan could be an alternative to dobutamine when the physician is reluctant to increase dobutamine doses, e.g., in a patient known to suffer from ischemic cardiomyopathy. The study did not demonstrate that levosimendan is superior to dobutamine, since its design did not assess equipotent doses of each inotropic agent in terms of cardiac output.

Weaning failure of cardiac origin

Recognition of the cardiac origin of weaning failure is crucial, since the use of vasodilators and/or diuretics may increase the rate of successful weaning. Early identification of the cause of weaning failure is of paramount importance, since weaning failure prolongs the duration of mechanical ventilation and worsens the outcome of these critically ill patients. The onset of cardiogenic pulmonary edema during spontaneous breathing trials was documented 20 years ago by Lemaire et al., using a PAC, in COPD patients suffering from chronic heart failure and/or acute myocardial ischemia [32]. This weaning-induced acute cardiogenic pulmonary edema was the consequence of the increase in venous return, the reduction in left ventricular compliance, and the increase in afterload secondary to the decrease in intrathoracic pressure. Zakynthinos et al. presented a detailed analysis of the cardiovascular and metabolic consequences of a spontaneous breathing trial in 18 patients who failed to wean [33]. They reported two patterns of cardiovascular behavior, classified according to the evolution of oxygen consumption. In nine patients a significant increase in oxygen consumption was associated with a decrease in Sv̄O2. In these patients weaning-induced left ventricular dysfunction probably developed, with an increase in pulmonary artery occlusion pressure. The slight augmentation in cardiac index was clearly insufficient to meet the increase in oxygen demand. In the other nine patients, oxygen consumption and Sv̄O2 remained stable despite the weaning failure. As stated by Richard and Teboul in their editorial, the cardiovascular behavior of this second group of difficult-to-wean patients is not easy to explain [34]. The lack of increase in oxygen consumption is unusual during spontaneous breathing, which is generally assumed to increase oxygen demand, even in patients who succeed in the weaning trial. These results could be the consequence of an (improbable) depression of the respiratory centers and/or the presence of sepsis. Despite its limitations, this article emphasizes the value of invasive cardiovascular monitoring with a PAC in determining the pathophysiology of weaning failure.
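The link between a rising oxygen consumption and a falling Sv̄O2 at a near-constant cardiac output follows directly from the Fick equation; the sketch below uses hypothetical numbers, not data from the study.

```python
# Fick equation: VO2 ~ CO x Hb x 1.34 x (SaO2 - SvO2) x 10
# (CO in L/min, Hb in g/dl, saturations as fractions, VO2 in ml/min; dissolved O2 neglected).
def svo2(vo2_ml_min: float, co_l_min: float, hb_g_dl: float, sao2: float = 0.97) -> float:
    return sao2 - vo2_ml_min / (co_l_min * hb_g_dl * 1.34 * 10)

# Hypothetical numbers: O2 consumption rises by ~30% while cardiac output rises only ~10%.
print(f"before the trial: SvO2 ~ {svo2(250, 5.0, 10):.0%}")
print(f"during the trial: SvO2 ~ {svo2(330, 5.5, 10):.0%}")
```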

An indirect approach to identifying weaning failure of cardiac origin could be the analysis of the cumulative fluid balance in difficult-to-wean patients. This is suggested by the high incidence of cardiovascular weaning failure in patients with a positive cumulative fluid balance and by the potential role of diuretics in preventing these weaning failures. To test this hypothesis, Upadya et al. examined the relationship between fluid balance and weaning outcomes in 87 patients who underwent 205 spontaneous breathing trials [35]. The positive fluid balance prior to weaning was higher in patients who failed to wean than in patients who succeeded. Using multivariate analysis, Upadya et al. demonstrated that a negative fluid balance on the day before the spontaneous breathing trial was an independent predictor of first-day weaning success; however, weaning outcome was not independently associated with the concomitant administration of diuretics. These results are preliminary and the study suffered from several limitations, but they suggest the need for a prospective randomized study to determine whether diuretic-induced negative fluid balance is associated with an increased rate of weaning success in difficult-to-wean patients.

Education and teaching

Two interesting articles have been published on the educational process in the ICU. The first focused on the impact of education on physicians' compliance with lung-protective mechanical ventilation. Wolthuis et al. conducted an observational study of whether doctors in academic and nonacademic hospitals used a lung-protective strategy with low tidal volumes to ventilate patients with ARDS [36]. The main intervention consisted of feedback and education on lung-protective mechanical ventilation, with special attention given to the importance of closely adjusting tidal volumes to predicted body weight (PBW). Six months after the intervention, tidal volume had declined significantly (from 9.8 ± 2.0 at baseline to 8.1 ± 1.7 ml/kg PBW), as had the percentage of undesirable ventilation data points, defined as tidal volumes greater than 8 ml/kg PBW (from 84 to 48%). This work suggests the efficacy of a thorough and targeted educational intervention in modifying clinical behavior. The second study was a prospective, randomized, controlled trial that compared the effectiveness of traditional and on-line teaching methods for educating anesthesiology residents in the principles and practice of difficult airway management.

Bello et al. allocated 56 residents in anesthesiology and intensive care medicine to two groups [37]. Twenty-eight residents took a traditional 5-h course on the principles and practice of airway management, which included lectures, slide projection, and demonstrations on a manikin. The other 28 had the same material presented in an exclusively on-line format, which could be accessed individually for a period of 36 h. In the on-line course, student-instructor interaction was provided through threaded discussion forums and three 30-min real-time question-and-answer sessions. Scores on written tests and practical skills tests did not differ between the two groups. Semiquantitative ratings of learner satisfaction were significantly higher in the on-line group (p = 0.014). The instructors spent an average of 144 ± 10 min preparing answers and interacting with on-line students. The conclusion was that on-line teaching formats may be a valid alternative for teaching residents, but interaction with instructors was an important element and may require a substantial time commitment from them. An editorial by Smith and Menon accompanied this second paper [38]. The editorial commented on the advantages and disadvantages of the on-line educational approach. The authors argued that on-line teaching offers an innovative and exciting way of improving the learning experience of residents, with wide dissemination, good consistency of the learning experience, and easy updating. It is important, however, to acknowledge the limitations represented by potentially limited access to personal computers and to the high-speed Internet connections often required for video images. The editorial concluded that it is crucial to recognize that on-line teaching represents only one step, not yet conclusive, in the process of training clinicians.

Renal failure

Although acute renal failure (ARenF) develops rapidly in a critically ill patient after a sudden fall in perfusion, with a reduction in glomerular filtration and diuresis, clinical recognition may be slow, partly because the increases in serum creatinine and urea, and the development of complications such as hyperkalemia and severe overhydration, take time. On the other hand, early recognition could allow early management and thereby retard or prevent further renal injury and deterioration of function.

The implication is that ARenF is a gradable, evolving syndrome and, along these lines, authors have redefined ARenF according to severity criteria based on increases in creatinine and urea and on decreases in urinary output [39]. Ostermann et al. applied grading criteria for ARenF in Saudi Arabia to evaluate their prognostic significance [40]. They confirmed that acute renal injury, renal failure syndrome, and severe acute renal failure syndrome are associated with increasing morbidity and mortality, even in multivariate analyses, whereas renal replacement therapy did not contribute to this association. Conversely, polyuria in the course of acute renal injury or failure syndrome was associated with a better prognosis than oligoanuria.

Looking more closely at the clinical pathophysiology of ARenF, one of the alleged actions of diuretics favoring a beneficial role in the treatment of ARenF is a decrease in the reabsorption of salt and water, a process which is energy-consuming. Diuretics may thus decrease renal oxygen demand and thereby exert renoprotective properties during impending ARenF, even though this remains highly controversial in practice. Swärd et al. documented, in patients with normal renal function after cardiac surgery, that furosemide decreased sodium reabsorption and renal oxygen consumption, whereas atrial natriuretic peptide infusion did not have these effects [41]; the latter can be attributed to distal reabsorption of the increased filtered sodium load, whereas furosemide decreased reabsorption because of a diminished glomerular filtration and filtered load. These findings may have future implications, although the mechanisms of action and the energy-sparing effect may be attenuated during (impending) ARenF with a relatively high sodium excretion.

Although a number of biomarkers of acute renal injury or dysfunction have been suggested to serve the aforementioned purpose of early recognition, allowing early prevention and management, none has become a routine measure to guide treatment. Eijkenboom et al. report on the urinary excretion of A1 and P1 glutathione S-transferase, markers of proximal and distal tubular injury, respectively [42]. These markers, although relatively hard to measure, have been studied in the past for their predictive value for renal injury, renal replacement therapy, and reversibility of ARenF. In the present article the authors argue that the urinary excretion of these enzymes is elevated after cardiac surgery, even in patients who do not develop a rise in serum creatinine; hence, the method is too sensitive (and probably also too nonspecific) to serve the clinical purposes outlined previously. This does not exclude a predictive value in a population with a higher prevalence of ARenF, when higher cut-off values are used.

ARenF can be complicated by anemia because of blood loss and reduced red cell production. Du Cheyron et al. examined the significance of anemia in ARenF [43]. They found that an admission hemoglobin level below 9 g/dl (5.6 mmol/l) was associated with increased mortality, independently of age and organ failure in multivariate analysis, in 205 ARenF patients on renal replacement therapy. The accompanying editorial stresses the importance of these observations for future studies on transfusion triggers [44]. Indeed, the causes of the association and its consequences for treatment remain obscure.

The literature is relatively scarce on the ultimate prognosis of surviving ARenF patients. In an article by Ahlström and coworkers from Finland, the quality of life and dialysis-dependency of ARenF survivors were studied [45]. They found that, at least in their country, the long-term outcome of ARenF is poor, with a meager 30% 5-year survival, particularly in the context of severe underlying disease and persistent chronic dialysis dependency. The patients also experienced a diminished but tolerable quality of life.

Taken together, knowledge of the pathophysiology and natural course of ARenF in the ICU and beyond is growing and, hopefully, will lead to improved prevention and treatment of ARenF.