Intracranial pressure and brain compliance in traumatic brain injury

Elevated intracranial pressure (ICP) has been associated with increased mortality in patients with severe traumatic brain injury (TBI). More than 20 years ago, Marmarou [1], analysing data from the Traumatic Coma Data Bank (TCDB), recognized that the proportion of hourly ICP readings greater than 20 mmHg was fundamental in explaining long-term outcome in TBI victims. Since then, the 20-mmHg cutoff has been adopted as the reference threshold in neurotraumatology. Not only the time spent above this threshold but also the response to therapy is crucial. Following this concept, Treggiari [2] systematically reviewed the literature to estimate the association between ICP patterns and outcome. Confirming Marmarou’s observations, raised ICP, i.e. an ICP >20 mmHg, was associated with an increased probability of death. Moreover, if ICP was raised but treatable, this condition was associated with a threefold to fourfold increase in the probability of death or poor neurological outcome. If ICP was raised and refractory, this was associated with a dramatic increase in the relative risk of death (OR 114.3). Therefore, refractory ICP and the response to ICP treatment could be better predictors of neurological outcome than absolute ICP values.

Badri et al. [3] examined whether raised ICP is independently associated with mortality, functional status and neuropsychological functioning in adult TBI patients. Data from 499 participants in a randomized trial comparing intravenous magnesium sulfate with placebo in moderate to severe TBI were analysed secondarily. The primary endpoints were mortality and a composite measure of functional status and neuropsychological function, assessed as memory, speed of information processing and executive function over a 6-month period. Moreover, the authors examined different summarizations of ICP during the first 48 h to test the hypothesis that ICP values or patterns are independently associated with mortality and neurobehavioral function in patients with TBI. ICP recordings were evaluated using several a priori specified measures: average ICP, defined as the area under the curve (AUC) of the first 48 h of ICP measurements divided by the time monitored, i.e. the time-weighted ICP average; time-weighted average ICP >20 mmHg; baseline ICP; maximum ICP recorded over 48 h; and the last ICP at the end of the 48-h monitoring period. The AUC of the ICP profile (average ICP) during the first 48 h of monitoring was the main predictor of interest. The overall 6-month mortality was 18 %. The adjusted odds ratio for mortality per 10-mmHg increase in average ICP was 3.12 (95 % confidence interval [CI] 1.79, 5.44; p < 0.01). Overall, higher average ICP was associated with decreased functional status and neuropsychological functioning (p < 0.01). Importantly, among survivors, increasing average ICP was not independently associated with inferior performance on neuropsychological testing. The conclusions of this study suggest that raised average ICP during the first 48 h of admission does not necessarily have adverse effects on the neuropsychological and functional abilities of TBI survivors, as the association between ICP and outcome was driven mostly by excess mortality. The lack of association between neuropsychological function and early ICP values among survivors has potential implications for treatment. It is encouraging to observe that patients with different ICP profiles in the ICU can achieve comparable neurological outcomes at 6 months post-injury, provided they survive to reach this milestone. These findings may have implications for clinical decision-making and prognostic considerations among TBI survivors.
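
To make this summary measure concrete, the sketch below (an illustration with invented hourly readings and our own function names, not the trial’s analysis code) computes a time-weighted average ICP as the area under the ICP–time curve divided by the monitoring time.

```python
# Illustrative sketch: time-weighted average ICP as AUC / monitoring time.
# The hourly readings below are invented for demonstration only.

def time_weighted_icp(times_h, icp_mmhg):
    """Trapezoidal AUC of ICP over time, divided by total monitored time."""
    auc = 0.0
    for (t0, p0), (t1, p1) in zip(zip(times_h, icp_mmhg), zip(times_h[1:], icp_mmhg[1:])):
        auc += 0.5 * (p0 + p1) * (t1 - t0)   # mmHg * h for each interval
    return auc / (times_h[-1] - times_h[0])  # mmHg, time-weighted average

times = [0, 1, 2, 3, 4]        # hours from start of monitoring
icp   = [12, 18, 25, 22, 15]   # mmHg, hourly readings
print(f"time-weighted ICP: {time_weighted_icp(times, icp):.1f} mmHg")
```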

Since the final part of the last century, neuroscientists and neurointensivists, aware of the negative effects of high ICP, have tried to anticipate its rise by studying so-called ‘brain compliance’, the sum of the compliances of the brain parenchyma, the venous and arterial compartments and the CSF. The resulting overall brain compliance is therefore a non-linear function of many compartmental pressures. Owing to the complexity of this picture and the difficulties in its interpretation, and with the advent of continuous ICP monitoring, compliance studies were utilized less and less in neurocritical care. Howells et al. [4] compared different measures of cerebrospinal compensatory reserve, i.e. ICP amplitude, ICP slope, and the correlation of ICP amplitude and ICP mean (RAP index), in a cohort of 84 patients with traumatic brain injury both before and after second-tier medical and surgical therapies. Mean values of the three measures were calculated in the 2-h periods before and after surgery, and in the 12-h periods preceding and during thiopental coma. ICP amplitude was significantly correlated with Glasgow Motor Score, and also with age for patients 35 years old and older. The correlations of ICP slope and the RAP index with Glasgow Motor Score and with age were not significant. All three metrics indicated significant improvements in compliance following surgery and during thiopental coma. The strength of this study is that it characterizes the response to treatment: the described ICP waveform measures respond to surgery and to thiopental administration. None of the metrics was significantly correlated with outcome, possibly because of the confounding effects of treatment. The results regarding thiopental coma demonstrate that all of these compliance metrics convey important information not evident from mean ICP alone. ICP amplitude was the only metric significantly correlated with severity of injury and with age, indicating declining compliance in older patients. It also has the advantages of simplicity and of being a very salient characteristic of the waveform. The quest for an anticipatory sign of high ICP continues.
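
For orientation, the RAP index is commonly described as a moving correlation between the ICP pulse amplitude and the mean ICP over consecutive short epochs; the sketch below illustrates that idea with synthetic data (the window length and values are placeholders, not the settings used by Howells et al.).

```python
# Rough sketch of a RAP-style index: Pearson correlation between ICP pulse
# amplitude (AMP) and mean ICP over a moving window of short epochs.
# Window length and the synthetic data are placeholders, not study settings.
from statistics import correlation  # Python 3.10+

def rap_index(mean_icp, amp, window=40):
    """Correlation of AMP vs mean ICP over the last `window` epochs."""
    return correlation(mean_icp[-window:], amp[-window:])

# Synthetic example: rising mean ICP with rising amplitude suggests
# exhausted compensatory reserve (RAP approaching +1).
mean_icp = [10 + 0.3 * i for i in range(40)]
amp      = [1.0 + 0.05 * i for i in range(40)]
print(f"RAP ~ {rap_index(mean_icp, amp):.2f}")
```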

TBI in ovulating females results in more favourable neurological outcomes than in males with similar insults [5]. Zlotnik et al. [6] observed that the administration of estrogen to male rats with TBI decreased blood glutamate levels and improved neurological outcome, thus providing neuroprotection.

Anaemia in TBI

Anaemia is one of the most common medical complications in neurocritically ill patients. In this population, lower haemoglobin concentrations are consistently associated with worse physiologic parameters and clinical outcomes. Nowadays, based on the results of clinical trials, transfusion practices in the ICU have generally become more restrictive. However, because reduced oxygen delivery contributes to ‘secondary’ cerebral injury, anaemia may not be as well tolerated among neurocritical care patients. No randomized controlled trials have adequately evaluated optimal transfusion thresholds among brain-injured patients [7].

Anaemia increases cardiac output (CO) through increased heart rate and contractility, lowers blood viscosity, and also induces cerebral vasodilatation. As anaemia worsens, the resultant increases in cerebral blood flow and in the oxygen extraction fraction eventually become insufficient to overcome the reduced arterial oxygen content (CaO2) produced by low haemoglobin (Hgb) concentrations. The point at which this threshold is reached is unclear and probably varies between patients. Physiological information, such as the local availability of oxygen in the brain tissue, could help clinicians identify a tailored transfusion threshold for a specific patient.
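
The arithmetic underlying this trade-off can be made explicit with the standard arterial oxygen content equation, a generic textbook relationship rather than data from the studies discussed here.

```python
# Standard textbook relationship (not study data): arterial oxygen content,
# showing how a falling haemoglobin reduces it even at full saturation.

def cao2(hgb_g_dl, sao2, pao2_mmhg):
    """Arterial O2 content (ml O2/dl): Hgb-bound plus dissolved O2."""
    return 1.34 * hgb_g_dl * sao2 + 0.003 * pao2_mmhg

# Example: dropping Hgb from 12 to 8 g/dl at SaO2 0.98 and PaO2 90 mmHg
for hgb in (12, 8):
    print(f"Hgb {hgb} g/dl -> CaO2 {cao2(hgb, 0.98, 90):.1f} ml O2/dl")
```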

To address this conundrum, Oddo et al. [8] investigated the relationship between Hgb and brain tissue oxygen tension (PbtO2) after severe TBI and examined its impact on outcome, performing a retrospective analysis of a cohort of severe TBI patients. Four hundred seventy-four simultaneous Hgb and PbtO2 recordings from 80 patients were analysed. Hgb ≤ 9 g/dl was associated with lower PbtO2. Anaemia with simultaneous brain tissue hypoxia, i.e. a PbtO2 < 20 mmHg, but not anaemia alone, increased the risk of unfavourable outcome (odds ratio 6.24, 95 % CI 1.61, 24.22; p = 0.008) after controlling for other confounders.

These findings suggest that anaemia-associated reductions in PbtO2 may aggravate TBI prognosis, and support the concept that PbtO2, or other monitors of oxygen availability in the brain, may be a helpful physiologic target for tailoring Hgb levels after severe head injury.

Sedation in neurocritical care patients

Recent evidence suggests that sedative drug use in general intensive care unit patients should be minimized [9]. However, this does not apply to acute brain injury patients, because in this population sedation, along with its general targets, has additional aims: neuroprotection and improved intracranial pressure control [10]. Whether sedation can provide neuroprotection remains an open question. Recently, a large number of animal studies have investigated the potential neuroprotective effect of volatile anesthetic agents. In vivo, it has been demonstrated that preconditioning with isoflurane improves long-term neurologic outcome after hypoxic-ischemic brain injury [11]. These seminal animal studies, even if attractive for their potential benefit in ischemic damage, have not been translated into clinical studies, mainly because the use of a volatile agent carries the potential risk of vasodilatation producing an increase in cerebral blood flow (CBF) and, consequently, a rise in ICP. For this reason, the use of these drugs has traditionally been considered unsafe in acute brain-damaged patients and confined outside neuro-ICUs. Until recently, surpassing these boundaries seemed imprudent; continuous ICP and CBF monitoring and the delineation of safety thresholds have now allowed this adventure to be undertaken more securely. Bösel et al. [12] investigated the effects of 3 days of volatile isoflurane sedation on cerebral oxygenation, circulation and pressure in ischemic and haemorrhagic strokes. They observed a clinically irrelevant increase in ICP. Mean arterial pressure (MAP) and cerebral perfusion pressure (CPP) were decreased, particularly in patients who were previously on midazolam. They concluded that satisfactory sedation can be achieved with long-term volatile isoflurane without causing a relevant rise in ICP, provided that baseline ICP values are low or only moderately elevated and that arterial blood pressure and PaCO2 are kept stable.

Nevertheless, an increase in ICP should be anticipated and monitored when applying this sedative agent, and we agree with the authors’ recommendation that multimodal neuromonitoring is mandatory when applying volatile sedation. The results are, however, consistent with other recently published experience [13]. These data need to be confirmed in larger prospective studies, and the possibility of transferring the experimental evidence to the bedside is fascinating, mainly in patients with a therapeutic window for preconditioning, such as subarachnoid haemorrhage (SAH) victims. A large fraction of SAH patients develop a delayed neurological deficit, days after the initial bleeding, even when the aneurysm has been successfully secured. This seems to be linked to vasospasm and to the molecular cascades of early brain injury. An early therapeutic intervention, such as sedation with volatile anesthetics, has the potential advantage of interrupting these molecular cascades and may reduce long-term deficits. This experience has opened a new, hitherto unexplored, scenario.

Delirium

Systematic monitoring of sedation, pain and delirium in the ICU is important for delivering adequate patient care. While the use of systematic monitoring tools is widely agreed upon, they are infrequently applied in daily ICU care. Radtke et al. [14] compared the effectiveness of two different training strategies, i.e. training according to the local standard versus a modified, extended method, on the implementation rate of scoring instruments for sedation, pain and delirium in the ICU. The modified, extended training included establishing a local support team to help resolve immediate problems.

In this experimental cohort study, the frequency of scoring in the ICUs before and after training, and at a follow-up after 1 year, was analysed, and the impact on patients’ outcome was evaluated.

ICUs trained with the modified, extended method showed increased documentation rates for all scores per patient and day. At the 1-year follow-up, the increased scoring rates for all scores were maintained. Scoring rates in ICUs trained according to the local standard protocol did not increase significantly. Implementation of delirium and pain monitoring was associated with a decrease in mortality. Monitoring had no significant influence on ventilation time or ICU length of stay. A modified, extended training strategy for ICU monitoring tools thus leads to higher intermediate- and long-term implementation rates and is associated with improved patient outcome. However, unmeasured confounders may have biased these findings.

Seeking the causes of delirium, Girard et al. [15] explored the associations of a priori-selected markers of inflammation and coagulation with delirium during critical illness. In a prospective cohort study, blood from 138 mechanically ventilated medical ICU patients was collected, and nine plasma markers of inflammation and coagulation were measured. Delirium was assessed daily using the Confusion Assessment Method for the ICU. Among the patients studied, 78 % were delirious at some point during the study. Two markers of inflammation and one of coagulation were significantly associated with delirium. After adjusting for covariates, lower plasma concentrations of matrix metalloproteinase-9 (MMP-9) and protein C, and higher concentrations of soluble tumor necrosis factor receptor-1 (sTNFR1), were associated with an increased probability of delirium. These results suggest that specific aspects of inflammation and coagulation may play a role in the evolution of delirium during critical illness, and that these markers should be examined in larger studies of ICU patients.

Nephrology

Several epidemiologic studies have clearly demonstrated that the mortality of critically ill patients increases progressively with higher stages of acute kidney injury (AKI) [16], even if renal replacement therapy is not required [17]. A likely explanation is that AKI is not a simple organ failure but rather a syndrome with extensive systemic consequences. Significant impairment of other organs, such as the lung, the heart or the brain, has been demonstrated in association with AKI. By demonstrating that increasing stage of AKI is associated with decreased hepatic metabolism of midazolam, most likely by influencing CYP 3A activity even in the absence of overt signs of hepatic dysfunction, Kirwan and co-workers [18] provide further support for the concept of systemic effects of AKI. The study indicates that AKI may impair several features of hepatic function, such as its detoxifying capability, resulting in reduced hepatic drug metabolism and consequently decreased drug clearance in patients with AKI.

A quite common adverse effect of AKI is hyperkalemia, which is also among the five most frequent indications for starting renal replacement therapy [19]. However, as shown in an epidemiologic study of 39,705 patients from two tertiary care hospitals in Boston, USA, even lower degrees of hyperkalemia on ICU admission are associated with increased mortality [20]. In a multivariate analysis, the authors demonstrated a significantly increased OR for mortality (1.25, 95 % CI 1.16–1.35, p < 0.0001) starting at slightly elevated potassium levels of 4.5–5.0 mmol/l and increasing with every 0.5 mmol/l increment up to an OR of 1.72 for potassium levels >6.5 mmol/l (p < 0.0001). The association between high admission potassium and increased mortality, however, appears to subside when hyperkalemia resolves within 48 h, usually reflecting effective treatment.

AKI is a frequent complication of liver failure. Whereas the mechanism of AKI in the setting of chronic liver failure is well known, the underlying pathophysiology of AKI in the setting of acute liver failure (ALF), developing in roughly 55 % of patients, still needs to be elucidated [21]. One possible pathogenetic factor may be the renal toxicity of cadmium (Cd) released from the liver in ALF. This was indicated by a study in 20 patients with ALF [22] demonstrating a close association between the release of marker proteins of renal tubular injury (retinol binding protein, α1-microglobulin, β2-microglobulin, Clara cell protein CC16) and increased urinary Cd excretion, which could not be found in neurosurgical ICU controls. Of all the other major trace elements determined, only urinary copper excretion showed an association with tubular injury markers.

Patients undergoing cardiovascular surgery are a further cohort exhibiting a high risk of AKI, with an average incidence reported around 22 % and dialysis requirements around 2 % in large trials [23]. In addition to the well-known risk factors of age, heart failure, diabetes, chronic kidney disease and hypertension, an observational study including 1,182 patients with normal baseline renal function undergoing off-pump coronary artery bypass surgery identified albumin levels <4 g/dl as an independent risk factor for postoperative AKI (OR = 1.83), remaining highly significant even after correction by a propensity score [24]. The findings of this study substantiate a recent meta-analysis demonstrating an increased risk of AKI as well as of mortality in hypoalbuminemic patients in both ICU and surgical settings [25]. Whether reduced serum albumin is just a diagnostic marker of increased severity of disease, or a derangement whose correction by substituting human serum albumin can improve outcome in critically ill patients, should be clarified by the soon-to-be-published large EARSS and ALBIOS trials.

In the intensive care unit, the role of renal biopsy in the differential diagnosis of AKI is still somewhat controversial. With only sporadic data available, most clinicians refrain from performing this procedure in this setting because of an unclear benefit-to-harm ratio. A large retrospective analysis of ten French intensive care units over a period of 10 years showed that renal biopsy is a quite rare diagnostic procedure, performed at an average rate of 8.4 per 10,000 critically ill patients. Of note, complications were reported in nearly 25 % of the patients undergoing the procedure; this is substantially higher than the roughly 2 % reported for renal biopsies of native kidneys in non-critically ill patients [26]. However, when performed successfully, a diagnosis other than acute tubular necrosis was found in up to 50 % of cases, indicating that, if used selectively, renal biopsy may significantly improve the diagnostic workup and result in a relevant change of treatment.

Two publications addressed technical aspects of renal replacement therapy. Filter survival is a major issue in patients with an enhanced bleeding risk who are preferably treated without anticoagulation. The AN69ST membrane exhibits enhanced heparin-binding capacity and was developed to prolong filter survival in anticoagulation-free renal replacement therapy. A single-centre, randomized, double-blind, crossover trial in 39 patients comparing the efficacy of this specific membrane against a standard AN69 filter in circuits without anticoagulation could not find any significant benefit, with average filter survivals of 14.2 and 13.3 h, respectively [27]. An area of uncertainty was whether high-dose continuous veno-venous hemofiltration (CVVH) renders thermodilution-based haemodynamic measurements (e.g. CI, GEDVI, EVLWI) unreliable. This concern is understandable, because it is well known that high-dose CVVH provides significant cooling of venous blood and may therefore cause measurement errors. The study by Dufour et al. [28] demonstrated the reliability of haemodynamic measurements using PiCCO2, even when high-volume hemofiltration with ultrafiltrate volumes up to 6 l/h was applied.

Epidemiology

The currently largest epidemiologic investigation of patients with liver cirrhosis in the UK showed that their gloomy prognosis, ranging from 33 to 39 %, has improved only slightly over a period of 13 years, as long as they do not suffer from additional organ failure [29]. In liver cirrhosis patients with sepsis requiring multiple organ support, mortality reached 90 %. As described previously, the APACHE II score under-predicted mortality in cirrhotics. Conversely, patients with chronic renal disease showing similar characteristics and higher APACHE II scores had mortality rates nearly 50 % lower.

Demographic changes in industrialized countries result in increased admission rates of old and very old patients to ICUs. This trend is clearly underlined by a study analysing 17,126 patients from 41 Austrian intensive care units [30], showing a continuous increase in the percentage of very old patients, defined as age >80 years, from 11.5 to 15.3 % over a 10-year period (1998–2008). Reflecting the higher life expectancy of women, this age group is predominantly female (on average 63 %). Despite the associated increase in severity of illness, the authors found a decrease in the standardized mortality ratio (SMR). This finding supports the concept that biological age alone is not a reliable predictor of outcome and should not be used as a major selection criterion for admission. On the other hand, the observed reduction of the SMR in this study may result from bias arising from changes in the performance of the Simplified Acute Physiology Score (SAPS) II over time. This problem was addressed in a study from the Netherlands [31] that included 12,143 patients aged 80 and above. The authors nicely demonstrated that, in the setting of increasing average age, the SAPS II score requires frequent recalibration to avoid a downward trend of the SMR over time. The periodic introduction of new severity scores, such as the recent SAPS III score, is another way to compensate for changes in case mix and treatment modalities that would otherwise require frequent recalibration. However, as shown by a prospective study in 103 Italian ICUs, new scores do not necessarily perform better than older ones in specific populations [32]. Consequently, the potential need for regional customisation of new severity-of-disease scores should be kept in mind.
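
For readers less familiar with the metric, the sketch below illustrates how an SMR is derived from score-based predictions (the predicted probabilities are placeholders, not SAPS II outputs): if a score overestimates mortality in a contemporary case mix, the SMR drifts downwards even when performance is unchanged, which is why recalibration matters.

```python
# Illustrative SMR computation: observed deaths divided by deaths expected
# from a severity score. Predicted probabilities here are placeholders,
# not real SAPS II outputs.

def smr(observed_deaths, predicted_probs):
    expected = sum(predicted_probs)
    return observed_deaths / expected

predicted = [0.10, 0.35, 0.60, 0.20, 0.45]  # score-derived mortality risks
observed  = 1                               # deaths actually observed
print(f"SMR = {smr(observed, predicted):.2f}")  # <1: fewer deaths than predicted
```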

Staffing and nursing workload are well-known key factors for patient outcome [33]. Consequently, validated tools for quantifying nursing workload under various ICU conditions are urgently needed. An interesting step in that direction was undertaken by Debergh et al. [34], who developed a score based on nursing activities. With this approach, the authors could demonstrate and quantify considerable variability in nursing workload, depending on case mix as well as on the time of the shift. This first step definitely requires further refinement and validation, but represents an important development for planning staffing requirements.

Sepsis, inflammation and biomarkers

This year, important contributions in the field of clinical classification, diagnosis, biomarkers and treatment of acute inflammatory conditions, including sepsis, were published in Intensive Care Medicine. New pathogenetic insights emerged, as well as new data on the consequences of widely applied treatments, such as low-dose steroids, transfusions and insulin.

The clinical classification of sepsis, severe sepsis and septic shock can differ according to the way the data are recorded and interpreted. In their very clever and pragmatic study, Klein Klouwenberg et al. [35] prospectively audited a large tertiary intensive care unit over 22 months to quantify the influence on the diagnosis of minor variations in the frequency, timing and method used to capture the data. For instance, the use of hourly recorded data was compared with continuously recorded data, and the number of simultaneous criteria present and the diagnostic implications of different published sets of criteria and other similar sets were examined. The results are impressive: the measured incidence of the systemic inflammatory response syndrome (SIRS) varied from 49 % (most restrictive setting) to 99 % (most liberal setting). The incidences of sepsis, severe sepsis and septic shock ranged from 22 to 31 %, from 6 to 27 % and from 4 to 9 % for the most restrictive versus the most liberal measurement settings, respectively. Among non-infected patients, 39 to 98 % fulfilled SIRS criteria. These observations very clearly stress the influence of minor variations in the definition of SIRS and in the mode of data recording on the incidence of sepsis, and hence on eligibility for clinical trials. Similarly, the values of some physiological variables in septic patients can differ over the 24-h day (nychthemeron) as a result of the disruption of the circadian rhythm and of alterations in the hormonal response to ambient light. Verceles et al. [36] assessed the association between ambient light and circadian melatonin release, measured by urinary 6-sulfatoxymelatonin (6-SMT), in patients with severe sepsis. In essence, they found no significant differences in urinary 6-SMT levels across 4-h time periods or between the 2 days, in spite of differences in light levels. These data support the notion of a disrupted circadian rhythm during severe sepsis.

In the field of biomarkers, too, several important contributions were published last year in Intensive Care Medicine. The first of these papers [37] reports that levels of soluble triggering receptor expressed on myeloid cells (sTREM-1), measured in peritoneal fluid samples from patients with acute pancreatitis, were highly predictive of infection of the pancreatitis-associated necrosis. If confirmed, these important findings will support the routine determination of peritoneal fluid sTREM-1 levels for the diagnosis of secondary infection of necrotic tissue in patients with severe acute pancreatitis. Another major daily challenge is the discrimination between bacterial and non-bacterial infection. The degree of expression of CD64 on the surface of neutrophils is believed to be a promising approach for ascertaining the presence of bacterial sepsis. Gros et al. [38] measured CD64 expression by flow cytometry in blood samples drawn from a cohort of consecutive patients with at least two criteria for the systemic inflammatory response syndrome admitted over a 7-month period. Approximately half of the 293 patients had a confirmed bacterial infection. However, the sensitivity of CD64 for the diagnosis of bacterial infection was insufficient; this finding implies that the CD64 index cannot be recommended for practical use on its own, although it may be useful in combination with other biomarkers.

In the field of sepsis, two different reports assessed the prognostic accuracy of chromogranin A (CgA), a pro-hormone released by the adrenal medulla upon stress. The first of these two studies [39] examined the prognostic significance of the admission concentration of vasostatin-1 (VS-I), the N-terminal peptide of CgA, in a population of 494 critically ill patients and healthy controls. As expected, critically ill patients had higher admission VS-I concentrations than controls; the plasma VS-I concentration was lower in survivors than in nonsurvivors, and lower in the absence than in the presence of shock. Interestingly, admission VS-I and lactate values were independent predictors of mortality, and the combination of both, plus age, predicted mortality better than each variable alone (p < 0.01). Similarly, CgA levels were measured at the time of study inclusion and 72 h later in 232 patients with severe sepsis recruited from 24 ICUs in Finland (FINNSEPSIS study) [40]. Consistently, CgA levels at inclusion and after 72 h correlated with several established indices of risk in sepsis. Patients who died during the hospitalization had higher baseline CgA levels than hospital survivors. By linear regression, prior cardiovascular disease or a current cardiovascular disorder was associated with higher CgA levels after 72 h. CgA levels on study inclusion and after 72 h were independently associated with hospital mortality by logistic regression. The prognostic accuracy of CgA measurements was comparable to that of the SAPS II score, and the addition of CgA measurements to the SAPS II score improved risk stratification of the patients, as assessed by the category-free net reclassification index. Taken together, the findings of these two independent studies confirm the accuracy of CgA levels as a predictor of poor outcome. Other biomarkers were studied as well. For instance, Cribbs et al. [41] hypothesized that the circulating levels of endothelial progenitor cells (EPC), contributors to vascular repair following sepsis-related endothelial dysfunction, would be inversely correlated with outcome during sepsis. They indeed found that patients with sepsis had significantly lower mean endothelial progenitor cell colony counts compared with controls, and they reported an inverse correlation between EPC numbers and SOFA score after adjusting for mortality.
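
For orientation, the category-free net reclassification index credits the expanded model for raising predicted risk in patients who died and lowering it in survivors; the sketch below illustrates the computation with invented risk estimates.

```python
# Minimal sketch of the category-free (continuous) net reclassification
# index: the new model is credited for raising predicted risk in events
# and lowering it in non-events. Risk values below are invented.

def category_free_nri(old, new, event):
    up_e    = sum(1 for o, n, e in zip(old, new, event) if e and n > o)
    down_e  = sum(1 for o, n, e in zip(old, new, event) if e and n < o)
    up_ne   = sum(1 for o, n, e in zip(old, new, event) if not e and n > o)
    down_ne = sum(1 for o, n, e in zip(old, new, event) if not e and n < o)
    n_event = sum(event)
    n_nonevent = len(event) - n_event
    return (up_e - down_e) / n_event + (down_ne - up_ne) / n_nonevent

old_risk = [0.20, 0.40, 0.30, 0.10]   # e.g. score alone
new_risk = [0.35, 0.55, 0.20, 0.05]   # e.g. score plus a biomarker
died     = [True, True, False, False]
print(f"NRI = {category_free_nri(old_risk, new_risk, died):.2f}")
```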

Steroids in sepsis

From a therapeutic standpoint, the effects of steroid treatment attracted much attention this year: a first article [42] reports the hormonal effects of low doses of steroids given in a large landmark trial, in which 299 septic shock patients received a 7-day treatment with a combination of hydrocortisone (50 mg intravenously four times daily) and fludrocortisone (50 μg orally once daily) or matching placebos. Overall, low-dose steroids induced both their expected glucocorticoid and mineralocorticoid biological effects and increased urine output. In contrast to these findings supporting a beneficial role for low-dose steroids during sepsis, the analysis of the database of the Surviving Sepsis Campaign (SSC), i.e. 27,836 adults with septic shock at 218 sites, yielded different results [43]. A total of 17,847 patients in the database required vasopressor therapy despite fluid resuscitation and therefore met the eligibility criteria for receiving low-dose steroids. A total of 8,992 patients (50.4 %) received low-dose steroids for their septic shock. Interestingly, patients in Europe and South America were more likely to be prescribed low-dose steroids than their counterparts in North America. The adjusted hospital mortality was significantly higher in patients who received low-dose steroids compared to those who did not. The association with increased adjusted hospital mortality persisted even when low-dose steroids were prescribed within 8 h. These findings very clearly stress the need for frequent reassessment of recommendations in relation to the actual effect of any treatment, particularly for widely available drugs.

Blood transfusion in sepsis

Among other frequent treatments, the metabolic effects of red blood cell (RBC) transfusion in septic patients are mostly unknown. Kopterides et al. [44] carefully monitored the interstitial fluid concentrations of lactate, pyruvate, glycerol and glucose before (T0) and after transfusion of one to two RBC units. They found a decrease in the lactate-to-pyruvate (LP) ratio during and after transfusion, which was unrelated to any of the possible confounders. These findings indicate that tissue oxygenation is affected by RBC transfusion in critically ill septic patients. Monitoring of the tissue LP ratio by microdialysis may represent a useful method for individual clinical management.

Glycaemic control

Insulin therapy is also a common practice whose risks are only partially known. Van Iersel et al. [45] sought to identify risk factors for hypoglycaemia in neurocritical care patients receiving intensive insulin therapy (IIT). In this nested case–control study, the first hypoglycaemic event of every patient (index moment) was used to match each case to a control patient, and possible risk factors preceding the index moment were scored using hospital records. Of 786 neurocritical care patients, 57 % developed hypoglycaemia. Independent risk factors for hypoglycaemia were lowering nutrition without insulin dose reduction; mechanical; lowering the dosage of norepinephrine; a hyperglycaemic event; gastric residual in the 6 h preceding the index moment without insulin dose reduction; and the dosage of insulin at the index moment. The risk-benefit balance of intensive insulin therapy therefore requires careful assessment, especially in the neurocritical care setting.

In studies of tight glycaemic control, there is most often a difference in blood glucose concentration and in the amount of insulin given, but it has not been possible to directly link these differences to a better outcome. On the contrary, the total dose of insulin is most often a predictor of an unfavourable outcome. Cappi et al. [46] reported a study in which patients with severe sepsis and septic shock (n = 63) were randomized to tight glucose control or not. They observed that, during the initial 72 h of ICU stay, patients in the tight glucose control group had lower free fatty acid levels than controls. The study also demonstrated, as has been reported before, that a high free fatty acid level was associated with an unfavourable outcome. The authors suggest that free fatty acids might be part of the mechanism behind the results of studies employing tight glycaemic control.

Nutrition and intestinal function

The connection often seen between parenteral nutrition and an unfavourable outcome has frequently been attributed to the fat emulsions used. Although the evidence for this proposal has never been conclusive, many centres limit their use of fat emulsions, in particular when hypertriglyceridemia is present. As part of the metabolic monitoring of an unselected series of consecutive admissions, Devaud et al. [47] observed that the most common background to hypertriglyceridemia was the use of propofol sedation. Although the significance of hypertriglyceridemia in terms of outcome is not well characterised, the authors suggest that this observation needs to be elucidated more specifically.

It has long been recognised that gastrointestinal function is an important predictor of outcome. However, gastrointestinal function is not part of current organ failure scores, nor of the scores most commonly used to predict outcome at admission, such as APACHE II or SAPS III. An important reason for this neglect of the gastrointestinal tract is the absence of definitions of gastrointestinal dysfunction. The Working Group on Abdominal Problems (WGAP) of the European Society of Intensive Care Medicine (ESICM) has now put forward a proposal for such a score [48]. The group reviewed the literature in the field and suggests a score based on the success of enteral feeding, with four grades ranging from a self-limiting condition to a condition demanding intervention. The WGAP suggests that the grading be used in clinical studies, and it may eventually be included in organ failure scoring systems.

Finally, Peigne et al. [49] reported the causes and risk factors of death in patients with thrombotic microangiopathies (TMA), drawn from the national French registry and from a case–control analysis. The leading causes of death were nosocomial infections, myocardial infarction, stroke, and pulmonary embolism. Cases and controls did not differ significantly regarding haemolysis parameters, ADAMTS13 activity, or neurological or gastrointestinal involvement. TMA was more frequently related to HIV or cancer in patients who died. Compared to survivors, non-survivors more often had cardiac involvement at diagnosis and less often received plasma exchange therapy.

Experimental studies

Anti-inflammatory effects of anesthetic agents

Volatile anesthetics are known to attenuate inflammatory response and tissue damage markers in acute organ injury [50]. However, the mechanisms by which they exert the anti-inflammatory effects remain unknown. Fortis et al. [51] observed that anesthesia with sevoflurane suppresses pulmonary inflammation and thus protects the lung from the damage induced by intratracheal instillation of endotoxin followed by mechanical ventilation. The protective effects of sevoflurane appear to be associated with its agonistic effects at type-A gamma-aminobutyric acid (GABAA) receptors in lung epithelial cells, exerting anti-inflammatory properties.

Anesthesia in pediatrics

Intrauterine growth restriction (IUGR) is one of the most significant causes of perinatal morbidity and mortality, and is associated with several neurodevelopmental disorders [52]. These cerebral alterations enhance the susceptibility to anesthesia-related neurodegenerative consequences. Schubert and colleagues demonstrated that this enhanced susceptibility to anesthesia-induced neurotoxicity is not associated with adverse systemic or organ-specific impairments, but may be related to intrinsic apoptotic pathways [53]. These data should be considered when planning clinical interventions in infants suffering from fetal growth restriction.

Fluid resuscitation and acid–base equilibrium impairment

Fluid resuscitation with normal saline has frequently been used in critically ill patients, but may lead to metabolic acidosis due to a reduction in the plasma strong ion difference (SID). Langer et al. [54] reported that in healthy animals, at constant PCO2, the plasma pH can be predictably increased, decreased or maintained unchanged according to the difference between the SID of the crystalloid solution infused and the baseline plasma bicarbonate concentration. A better understanding of these mechanisms may lead to a more rational approach to the use of crystalloid solutions in critically ill patients.
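
To illustrate the strong-ion reasoning in general terms (a generic Stewart-approach sketch, not the authors' data): the apparent SID is the sum of the strong cations minus the strong anions, and 0.9 % saline, containing equal amounts of sodium and chloride, has a SID of 0 mEq/l, so large infusions pull the plasma SID towards zero and acidify the plasma.

```python
# Generic Stewart-approach sketch (not study data): apparent strong ion
# difference (SID) of plasma and of 0.9 % saline, all values in mEq/l.

def apparent_sid(na, k, ca, mg, cl, lactate):
    return (na + k + ca + mg) - (cl + lactate)

plasma = apparent_sid(na=140, k=4, ca=2.5, mg=1.5, cl=104, lactate=1)  # roughly 40 mEq/l in health
saline = apparent_sid(na=154, k=0, ca=0, mg=0, cl=154, lactate=0)      # 0 mEq/l
print(f"plasma SID ~ {plasma} mEq/l, 0.9% saline SID = {saline} mEq/l")
```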

Mechanical ventilation and acute respiratory distress syndrome

Although mechanical ventilation is a life-saving intervention in patients suffering from respiratory failure, prolonged mechanical ventilation is often associated with numerous complications, including problematic weaning. Current guidelines recommend that weaning from the ventilator be achieved using a two-step strategy [55]; however, there are controversies regarding the accuracy of the various weaning predictors. Diaphragm electromyographic activity (EAdi)-derived indices, obtained from a neurally adjusted ventilatory assist (NAVA) catheter during a spontaneous breathing trial, provide reliable predictors of weaning [56]. Nevertheless, the performance of these indices is not better than that of the ratio of respiratory rate to tidal volume.
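
For context, the index referred to is the rapid shallow breathing index (f/VT); the sketch below shows the calculation, using the commonly quoted threshold of roughly 105 breaths/min/l purely for illustration.

```python
# Rapid shallow breathing index (f/VT): respiratory rate divided by tidal
# volume in litres. The ~105 breaths/min/l threshold is the commonly quoted
# cut-off and is used here only for illustration.

def rsbi(resp_rate_per_min, tidal_volume_ml):
    return resp_rate_per_min / (tidal_volume_ml / 1000.0)

for rr, vt in [(18, 450), (32, 250)]:
    value = rsbi(rr, vt)
    label = "weaning success more likely" if value < 105 else "weaning failure more likely"
    print(f"RR {rr}/min, VT {vt} ml -> f/VT {value:.0f} ({label})")
```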

Weakness of the inspiratory muscles, which plays an important role in difficult weaning from mechanical ventilation [57], is associated with changes within the diaphragm muscle fibers [58]. Measuring the contractile function of single muscle fibers is the best technique to study the involvement of contractile protein dysfunction, since the force-generating capacity of muscle fibers strongly depends on the content of contractile protein. Moreover, passive elastic structures are indispensable for structural stability and optimal active force generation [59]; in this line, titin is the major determinant of the passive elastic properties of striated muscles. Rats mechanically ventilated for 18 h showed: (1) impaired diaphragm fiber active force-generating capacity and passive force generation upon stretch; (2) loss of myosin, which contributes to reduced active force generation; (3) reduced passive force generation due to a decreased phosphorylation status of titin; and (4) no discernible changes in the soleus muscle [60].

Protective mechanical ventilation during acute respiratory distress syndrome is associated with hypercapnic acidosis. Hypercapnic acidosis may lead to: (1) pulmonary vasoconstriction [61], which contributes to the improvement of arterial oxygenation; and (2) vasodilation in the systemic circulation through increased production of nitric oxide (NO) [62]. If, on the one hand, hypercapnic acidosis increases NO production in hypoxic lung regions, attenuating hypoxic pulmonary vasoconstriction (HPV), on the other hand, metabolic acidosis augments HPV. To clarify this issue, Nilsson and colleagues demonstrated that hypercapnic acidosis does not potentiate HPV, but transiently weakens it, without affecting endogenous pulmonary NO production [63].

Pulmonary ischemia–reperfusion (IR) is associated with a wide range of clinical events resulting in high morbidity and mortality [64]. Microarray analysis indicated that Src protein tyrosine kinase (Src PTK)-mediated tyrosine phosphorylation is an important molecular mechanism in the acute inflammatory response. Blocking Src activation before ischemia may therefore represent a novel therapy to reduce pulmonary IR-induced lung injury [65].

There are controversies regarding the beneficial effects of negative pressure ventilation compared to positive pressure ventilation [66]. Engelberts et al. [67] tested the hypothesis that negative pressure produces the same pattern of lung distension as positive pressure when the pressure–time and volume history profiles are identical and the negative pressure is applied over the whole lung. They reported no differences in the effects of positive versus negative pressure ventilation on lung volumes and oxygenation.

The introduction of NAVA and measurements of diaphragm electrical activity (EAdi) enable the quantification of neural inspiratory drive [68]. So far, however, there has been no information regarding the relative contribution of the patient versus the ventilator to the inspiratory tidal volume (VTinsp). Grasselli and colleagues [69] showed that VTinsp and EAdi can be used to predict the contribution of the inspiratory muscles versus that of the ventilator during NAVA, and thus quantify and standardize the adjustment of the level of assist, reducing the risk of excessive ventilator assist.
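
One intuitive way to express this idea, sketched below with invented values and without claiming to reproduce the exact index used in the study, is to compare the volume generated per unit of EAdi during assisted breaths with that generated during brief unassisted efforts: the closer the ratio is to 1, the larger the patient's own contribution.

```python
# Illustrative sketch (invented values; not the exact index from the study):
# compare volume generated per unit of EAdi with and without ventilator
# assist. A ratio near 1 suggests the patient generates most of the volume;
# a low ratio suggests the ventilator contributes most of it.

def patient_contribution(vt_assist_ml, eadi_assist_uv, vt_noassist_ml, eadi_noassist_uv):
    return (vt_noassist_ml / eadi_noassist_uv) / (vt_assist_ml / eadi_assist_uv)

ratio = patient_contribution(vt_assist_ml=480, eadi_assist_uv=12,
                             vt_noassist_ml=220, eadi_noassist_uv=14)
print(f"estimated patient contribution ~ {ratio:.2f}")
```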

Positron emission tomography (PET) has been used to monitor the inflammatory response in acute lung injury (ALI), but its role in monitoring the fibroproliferative phase of ALI remains to be evaluated. In the late phase of experimental ALI, a correlation was found between fibrosis and a high PET signal. Therefore, PET imaging is a valid means of tracking both the inflammatory response and the fibrotic evolution of ALI [70].

There are several interesting studies examining the impact of ventilatory settings in experimental acute respiratory distress syndrome (ARDS) models complicated by other underlying problems. Intra-abdominal hypertension can induce significant respiratory dysfunction and is associated with an increased risk of mortality in critically ill patients [71]. Santos et al. [72] tested the hypothesis that intra-abdominal hypertension contributes to pulmonary inflammatory and fibrogenic responses in ARDS, and that mechanical ventilation at high tidal volume is lung protective in extrapulmonary ARDS, but not in primary ARDS. Using a rat model of ARDS induced by intratracheal or intraperitoneal injection of endotoxin, the researchers showed that, in the presence of intra-abdominal hypertension, mechanical ventilation at a tidal volume of 10 mL/kg could limit atelectasis and thus be beneficial by decreasing lung elastance, lung apoptosis and the production of cytokines in extrapulmonary ARDS. However, mechanical ventilation at the same tidal volume resulted in an increased cytokine response when intra-abdominal hypertension was present in primary ARDS. It should be noted that this conclusion was based on a complex animal model in which the intra-abdominal hypertension was set at a fixed level; the clinical relevance of the study is therefore yet to be evaluated.

Retention of CO2 during mechanical ventilation, in particular at low tidal volume, may be problematic in certain clinical situations. Extracorporeal CO2 removal combined with low tidal volume or low-frequency positive pressure ventilation has been shown to improve gas exchange in animals and in humans [73]. However, safety concerns regarding the long-term application of CO2 removal devices remain. Wearden et al. [74] conducted an experiment in awake, standing sheep to examine the efficiency and safety of an extracorporeal veno-venous CO2 removal system over a chronic phase of 8 days. They reported that the Hemolung device removed significant amounts of CO2 at low blood flows with minimal adverse effects. Although the results are promising, the role of CO2 removal devices in ARDS remains to be defined.

Patients with ARDS are at risk of dying not only from their critical illness, but also from secondary processes such as nosocomial infection. Ventilator-associated pneumonia (VAP) is the most common nosocomial infection in patients receiving mechanical ventilation, despite many advances in technology, antibiotic therapy and patient care [75]. Zanella et al. [76] developed a very interesting animal model to test the hypothesis that an appropriate orientation of the trachea can prevent aspiration of pathogens into the lung. The investigators tested the specific hypothesis that gravitational forces pull bacteria-laden secretions from the oropharynx into the lungs in both the supine and semirecumbent positions (i.e. when the tracheal tube is above horizontal), whereas keeping the tracheal tube below horizontal facilitates outward mucus clearance. The authors demonstrated in a pig model that all animals kept with the trachea oriented 45° above horizontal developed VAP and respiratory failure, while none of the pigs kept with the trachea oriented below horizontal developed respiratory failure or infection, likely as a result of secretions draining outward. This study introduces an interesting concept for the future management of VAP.

A couple of excellent studies investigated noninvasive techniques to monitor chest wall tidal displacement and tissue oxygenation. Mechanical ventilation can have an important impact on lung compliance or airway resistance, which in turn changes the dynamics of chest wall motion [77]. The study by Waisman et al. [78], using a pneumothorax model in rabbits, monitored the dynamics of chest wall motion in an attempt to identify the reasons for delayed recognition of an evolving pneumothorax. Conventional monitoring of oxygenation and expired CO2 served as the gold standard. Interestingly, the investigators found that the delay in diagnosis was due to a decrease in CO2 without changes in SpO2, which was contrary to expectations. This finding could well explain the phenomenon of delayed diagnosis in the neonatal unit, where babies sometimes develop small, progressing pneumothoraces that are missed by CO2 readings. The study also concluded that the use of motion sensors to measure chest wall tidal displacement could indicate uneven ventilation on the affected side. An obvious limitation is that the study was performed in healthy animals; future studies are required in animal models of lung injury and at the bedside in pediatric units.

Early detection of oxygenation failure is crucial to save life. In ARDS and sepsis, both macro- and micro-circulatory transport of oxygen to the tissues is impaired, causing regional hypoxia [79]. It is very challenging for physicians to identify tissue hypoxia, in particular local hypoxia, in a timely fashion, and sensitive tools to detect the status of tissue oxygenation would be much appreciated at the bedside. Dyson et al. [80] performed an interesting study in a pig model in which tissue hypoxia was induced by a variety of manipulations that are relevant to critical care medicine, including changes in inspired oxygen fraction and positive end-expiratory pressure, induction of haemorrhage, and challenges with vasopressors and inotropic agents. The investigators demonstrated that monitoring of bladder tissue oxygenation provided a more sensitive indicator of organ hypoxia than traditional global parameters during these various cardiorespiratory challenges. This study establishes some promising baseline observations and offers a potentially useful tool for clinical monitoring of tissue oxygen tension.

Pulmonary hypertension (PH) is the most serious chronic disease of the pulmonary circulation but has, so far, been poorly investigated in critically ill patients. Lourenço and colleagues evaluated the role of the dual endothelin receptor antagonist tezosentan in an experimental model of PH [81]. Intravenous administration of tezosentan attenuated pulmonary hypertension, reduced lung inflammation and increased vasodilation. Therefore, when the enteral route of administration is not tolerated or precise real-time control of haemodynamics is required, tezosentan appears to be a good therapeutic option for PH.

Cardiac arrest and ischemia–reperfusion

Asphyxia is the primary cause of death in newborns and, so far, there are no standard clinical treatments to protect the immature newborn heart and intestine from hypoxia-reoxygenation injury. However, an experimental study showed that a single early intravenous cyclosporine bolus following reoxygenation improved cardiac function and mesenteric perfusion, with attenuated myocardial and intestinal damage [82]. Nevertheless, immediate treatment may not be feasible for a large proportion of newborns. In this line, Gill et al. [83] compared early and late cyclosporine therapy in asphyxiated newborn piglets and demonstrated that, even though both improved cardiac recovery and myocardial oxygen transport, early treatment offered superior cardioprotection and attenuation of intestinal injury.

Hydrogen sulfide (H2S) has neuroprotective effects, since it reduces both metabolism and temperature, and may be a good therapeutic option for cardiac arrest. Wei and colleagues [84] observed that inhalation of H2S reduced neurohistopathological damage and improved early neurological function after cardiac arrest and resuscitation.

Vasopressin has been used in hypotensive shock in adults [85], but its use remains limited in the neonatal population. The dose–response effects of vasopressin on systemic haemodynamics, as well as on mesenteric and cerebral perfusion, were evaluated in asphyxiated newborn piglets. Vasopressin improved systemic haemodynamics without compromising cardiac function or cerebral and mesenteric haemodynamics [86].

Inhalation of nitric oxide has been used for many years in the treatment of ARDS and pulmonary hypertension [87, 88]. It is believed that inhaled NO may have other pulmonary and extrapulmonary effects; in a case report, NO inhalation was shown to be beneficial in patients with acute right ventricular failure due to myocardial infarction [89]. Neye et al. [90] hypothesized that NO inhalation during the ischemic phase of acute myocardial infarction improves left ventricular function and reduces infarct size. The investigators examined the efficacy of inhaled NO during left anterior descending coronary artery occlusion and subsequent reperfusion, in comparison with nitrite or a soluble guanylate cyclase inhibitor administered intravenously, in rats. The study showed that inhalation of NO during ischemia or reperfusion improved left ventricular systolic function by reducing myocardial infarct size and area at risk. The protective effects of NO inhalation and nitrite infusion were blunted by the use of a soluble guanylate cyclase inhibitor. The current study provides the baseline observation that inhaled NO may protect against left ventricular dysfunction during ischemia and reperfusion. Further investigations are required to confirm the findings and to understand the mechanisms involved.

Sepsis

Sepsis is frequently complicated by acute kidney injury. The primary resuscitation strategy for these patients is fluid replacement aiming to improve organ perfusion and oxygenation [91]. In animal models of sepsis, LPS-induced endotoxemia has been reported to be associated with renal hypoperfusion, microvascular hypoxia and a systemic inflammatory response [92]. Vasoconstrictors are thus commonly used in treatment, but their effects on renal bioenergetics remain largely unknown. A study by May et al. [93] tested the hypothesis that the infusion of a vasoconstrictor would have a deleterious effect on renal blood flow and bioenergetics during early Gram-negative sepsis. The authors employed a model of hyperdynamic sepsis in sheep, a large mammal, to closely mimic human disease. It is interesting to note that the intravenous administration of angiotensin II restored blood pressure but further reduced renal blood flow in the septic animals; however, no change in renal ATP was observed. The authors concluded that, during early hypotensive endotoxemia, there was no evidence of renal bioenergetic failure despite decreased renal blood flow, and that treatment with the powerful renal vasoconstrictor angiotensin II did not lead to deterioration in renal bioenergetics. These results are encouraging with respect to concerns about the kidney when vasoconstrictors are administered during sepsis, but further studies are required to examine the effects of different vasoconstrictors on renal bioenergetics in relation to organ function and tissue repair, since tissue repair mechanisms may be impaired in sepsis.

Sepsis is the leading cause of death in critically ill patients and is associated with uncontrolled and excessive cytokine/chemokine release [94]. However, therapeutic interventions targeting cytokines have failed to reduce mortality, suggesting considerable molecular complexity within the inflammatory response. To better understand the pathogenesis of sepsis, Finney et al. [95] investigated cytokine/chemokine release in response to lipopolysaccharide (LPS) or lipoteichoic acid (LTA), a cell wall component of Gram-positive bacteria and the functional equivalent of the LPS of Gram-negative bacteria. At clinically relevant concentrations, LPS and LTA induced different patterns of cytokine/chemokine release, in accordance with clinical studies comparing Gram-positive and Gram-negative infections [96]. These data underline the importance of understanding the differences in cytokine regulation between Gram-positive and Gram-negative bacteria for the future treatment of sepsis.

The mechanisms behind the effects of temperature on the central nervous system (CNS) require elucidation. Hypothermia reduces, while hyperthermia increases, the early activation of nuclear factor (NF)-κB and the subsequent production of tumor necrosis factor (TNF)-α, interleukin (IL)-10 and nitric oxide. In this line, temperature-dependent changes in microglial TNF-α production during the early phase, and in IL-10 and nitric oxide production during the late phase, indicate that these factors could be useful as clinical markers to monitor hypothermia-related neuronal protection and the effects of hyperthermia.

The activation of innate immune cells is critical for the appropriate control of invading microorganisms. In sepsis, hyperresponsiveness may lead to the production of an excess of proinflammatory mediators and, consequently, to multiple organ failure [97]. Myeloid-derived suppressor cells (MDSC), a heterogeneous population of cells, exert immunomodulatory effects and have been studied in cancer, inflammation and infection. Derive and colleagues showed a protective effect of MDSC in sepsis and suggested the development of pharmacological agents known to promote the expansion of MDSCs [98].

The accumulation of apoptotic cells can release toxic and proinflammatory mediators that lead to tissue damage. Milk fat globule-EGF factor VIII (MFG-E8), mainly produced by macrophages and dendritic cells, is an opsonin for apoptotic cells and acts as a bridging protein between apoptotic cells and phagocytes [99]. A study by Shah et al. [100] showed that treatment with recombinant human (rh) MFG-E8 protein reduced liver and kidney damage, in association with an attenuated systemic inflammatory response, in a rodent model of sepsis induced by cecal ligation and puncture. The protective effects of MFG-E8 appeared to be dose dependent. However, many questions remain unanswered: How does the rhMFG-E8 protein act on cells and organ systems? When should MFG-E8 be administered? How frequently should the drug be delivered? The answers to these questions will be needed to justify the use of MFG-E8 in sepsis.