Take-home message

Hemodynamic instability related to renal replacement therapy (HIRRT) is not simply a result of excessive ultrafiltration: it can arise from multiple, independent, and potentially overlapping mechanisms through which RRT decreases cardiac output or peripheral resistance. An improved understanding of these mechanisms may lead to better interventions to limit HIRRT across all RRT modalities commonly used in the ICU.

Introduction

Acute kidney injury (AKI) requiring renal replacement therapy (RRT) is a frequent complication of critical illness, occurring in up to ~ 15% of patients admitted to the intensive care unit (ICU) [1]. Hemodynamic instability related to renal replacement therapy (HIRRT) is a complication of all RRT modalities commonly used in the ICU, including intermittent hemodialysis (HD), sustained low-efficiency dialysis (SLED) and continuous renal replacement therapy (CRRT). More specifically, HIRRT has been shown to affect 10–70% of HD treatments [2,3,4,5] and 19–43% of CRRT treatments [6, 7]. This variability in the reported frequencies within RRT modalities is partly attributable to the lack of a consensus definition for HIRRT [8, 9]. The Kidney Disease Outcomes Quality Initiative (K-DOQI) definition of intradialytic hypotension (IDH) for patients with end-stage kidney disease (ESKD) on maintenance HD involves a ≥ 20 mmHg drop in systolic blood pressure (SBP) or a > 10 mmHg drop in mean arterial pressure (MAP), together with the presence of symptoms related to IDH. This definition is not readily applicable to critically ill patients, who may not be able to report typical symptoms of hypotension or hypoperfusion and whose blood pressure is strongly influenced by concurrent illness (e.g., sepsis, cardiogenic shock) and treatments (e.g., mechanical ventilation, vasopressors).
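For illustration, the blood pressure component of this definition can be expressed as a simple check (a minimal sketch in Python; the thresholds come from the K-DOQI definition above, while the function and variable names are ours):

```python
def meets_kdoqi_idh_criteria(sbp_drop_mmhg: float,
                             map_drop_mmhg: float,
                             has_symptoms: bool) -> bool:
    """K-DOQI IDH: a >= 20 mmHg fall in SBP or a > 10 mmHg fall in MAP,
    together with symptoms attributable to hypotension/hypoperfusion."""
    bp_criterion = (sbp_drop_mmhg >= 20) or (map_drop_mmhg > 10)
    return bp_criterion and has_symptoms

# Example: SBP falls by 25 mmHg with cramping and dizziness -> meets criteria
print(meets_kdoqi_idh_criteria(25, 8, True))  # True
```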

Does HIRRT matter clinically? It does: HIRRT is not only associated with higher in-hospital mortality [10] but may also be associated with decreased kidney recovery [11]. Human and animal data suggest that patients with AKI may be particularly vulnerable to ischemic kidney injury when there is a drop in blood pressure, since autoregulation of kidney perfusion is impaired in this setting [12, 13].

Our recent systematic review of intervention studies for HIRRT in critically ill patients found little high-quality evidence to guide prevention and treatment of this important and common complication [9]. Nevertheless, many studies provide insight into the underlying mechanisms responsible for HIRRT in critically ill patients with AKI. Although there is significant overlap, and HIRRT may be a consequence of multiple mechanisms in any given patient, the basic mechanisms involved (see Fig. 1) are decreased cardiac output [either as a consequence of hypovolemia (see Fig. 2) and/or pump failure (see Fig. 3)] and decreased peripheral resistance (i.e., distributive shock) (see Fig. 4), all in the context of inadequate physiologic compensation. The normal compensatory physiologic responses to hypotension (e.g., recruitment of unstressed blood volume, increased heart rate and contractility due to sympathetic activation) are more likely to be impaired in critically ill patients with AKI.

Fig. 1

Summary of underlying mechanisms that contribute to HIRRT. Mechanisms include: (1) hypovolemia, (2) systolic/diastolic dysfunction, and (3) decreased vascular tone from distributive shock. They can be due to patient- or RRT-related factors or both. There is often an overlap of these mechanisms. HIRRT is associated with increased mortality. HIRRT (i.e., recurrent hypotension) may impair renal recovery, a phenomenon that may be exacerbated by impaired kidney blood flow autoregulation in AKI

Fig. 2

Contributors to hypovolemia and HIRRT in critically ill patients with AKI requiring RRT. Both RRT-related and patient-related factors can contribute to hypovolemia and the development of HIRRT in the context of inadequate physiologic compensation

Fig. 3

Contributors to cardiac dysfunction and HIRRT in critically ill patients with AKI requiring RRT. Both RRT- and patient-related factors are implicated, in the presence of inadequate physiologic compensation. RRT induces transient episodes of reduced myocardial perfusion, leading to myocardial stunning, which manifests as regional wall motion abnormalities (RWMAs). UF and osmolar shifts can induce hypovolemia, which can precipitate a Bezold–Jarisch reflex. Patient factors include underlying cardiac disease, critical illness and its associated treatments (mechanical ventilation, fluids and vasopressors), and complications of critical illness such as bowel ischemia, which itself can be exacerbated by HIRRT (not shown)

Fig. 4

Contributors to decreased vascular tone that lead to HIRRT in critically ill patients with AKI requiring RRT (+) and some treatment strategies (−). Question mark represents an unproven, theoretical effect on vascular tone. RRT-related and patient-related factors are represented on the top and bottom of the figure, respectively

While excessive ultrafiltration alone can result in HIRRT, recent evidence suggests that it is often not the primary driver in critically ill patients [4, 5]. In what follows, we review RRT-related mechanisms for HIRRT (see Table 1), including an assessment of potential interventions to mitigate it.

Table 1 Potential RRT-related interventions to limit HIRRT

RRT-related mechanisms for HIRRT

Excessive ultrafiltration

Even in the absence of hypovolemic shock, the pathophysiology of hypotension often involves reduced cardiac preload, both from impaired mobilization of unstressed blood volume due to defective venoconstriction [14] and from redistribution of fluid from the intravascular space into the interstitial compartment due to sepsis and the inflammatory response [15]. Nonetheless, repeated fluid administration to correct intravascular volume depletion can lead to significant fluid accumulation, which is potentially harmful. Cumulative fluid balance is an independent risk factor for increased mortality in septic shock and AKI [10, 16, 17], and RRT is frequently initiated for fluid removal. The potential benefit of using RRT to more aggressively correct fluid overload was suggested by a study by Murugan et al., which evaluated critically ill patients with fluid overload (> 5% increase in body weight) and found that more intensive ultrafiltration (UF) (i.e., fluid removal with RRT) was associated with lower 1-year risk-adjusted mortality [18].

RRT itself can result in an acute reduction of cardiac preload due to ultrafiltration and/or fluid shifts related to osmolality changes (discussed below and outlined in Fig. 2). Plasma refilling from fluid in the interstitial and intracellular compartments compensates for fluid removal with UF. When the UF rate exceeds the plasma refilling rate, the resultant intravascular volume depletion, in the context of inadequate compensatory mechanisms, will result in HIRRT [14, 19]. RRT modalities that use a lower UF rate but achieve equivalent fluid removal over a longer treatment time can be expected to be progressively less likely to precipitate HIRRT by this mechanism (i.e., CRRT < SLED < intermittent HD). Although perhaps obvious, it warrants emphasis that, whatever the UF rate, if the patient's net fluid balance during treatment remains positive, excessive UF cannot be the cause of HIRRT.
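As a worked example of this point (illustrative numbers only; the assumed plasma refill rate is hypothetical and varies widely between patients), the same net fluid removal implies very different UF rates across modalities:

```python
NET_REMOVAL_ML = 2000          # illustrative goal: 2 L net fluid removal
ASSUMED_REFILL_ML_PER_H = 300  # hypothetical plasma refill rate

# Typical treatment durations (hours) for each modality
durations_h = {"intermittent HD": 4, "SLED": 8, "CRRT": 24}

for modality, hours in durations_h.items():
    uf_rate = NET_REMOVAL_ML / hours  # mL/h needed to reach the goal
    status = "exceeds" if uf_rate > ASSUMED_REFILL_ML_PER_H else "is below"
    print(f"{modality}: {uf_rate:.0f} mL/h {status} the assumed refill rate")
```

Under these assumptions, intermittent HD would require 500 mL/h (above the assumed refill rate), whereas SLED (250 mL/h) and CRRT (~ 83 mL/h) would not, consistent with the risk ordering above.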

Preload dependence, defined as a significant increase in cardiac index (CI) with a preload increase, can be assessed with a fluid challenge or with passive leg raising (PLR), a maneuver that increases venous return to the right heart by approximately 300 mL [20]. Not surprisingly, a positive PLR before RRT has been reported to be predictive of HIRRT. In a study by Monnet et al. [21], HIRRT was defined as a MAP lower than 65 mmHg (or lower than 80 mmHg in patients with chronic hypertension) "requiring a clinical intervention". This study of 39 patients measured CI using transpulmonary thermodilution prior to PLR, and peak CI during PLR using pulse contour analysis. The authors found that a PLR-induced increase in CI of > 9% predicted the occurrence of HIRRT with a sensitivity of 77% and a specificity of 96% [21]. Another study, by Bitker et al. [4], of 47 ICU patients requiring HD measured CI using continuous pulse contour analysis with a PiCCO® device and performed PLR testing at the onset of HIRRT, defined as the first occurrence of MAP below 65 mmHg. They found that preload dependence was present in only 19% of such episodes. This suggests that preload dependence, although clearly implicated in HIRRT, may not be its primary cause in many critically ill patients with AKI undergoing RRT in the ICU [4]. In keeping with this finding, Schortgen et al. [5] previously showed that HIRRT frequently occurs early during HD in ICU patients, before significant fluid removal via UF has occurred. Moreover, the clinical variables associated with preload-dependent HIRRT identified by Bitker et al. included the use of mechanical ventilation and a higher pulmonary vascular permeability index (PVPI) at HD onset. Positive pressure mechanical ventilation itself reduces right ventricular preload and may thus have predisposed patients to HIRRT when preload was further reduced by RRT. A higher PVPI reflects increased capillary permeability in the pulmonary vasculature and likely correlates with impaired plasma refilling (beyond just the lungs) [4]. This underscores a crucial pathway underpinning the increased risk of HIRRT in critically ill patients: inflammation due to critical illness leads to increased capillary leak and impaired plasma refilling, which increases the likelihood of HIRRT with UF. Plasma refilling may also be impaired by disruption or obstruction of the lymphatic system as a result of interstitial edema in critically ill patients [22]. When refilling is impaired for any reason, even lower rates of UF might be expected to precipitate HIRRT.
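Returning to the Monnet et al. cutoff, a minimal sketch of how it could be applied follows (the function name and example values are ours; the > 9% threshold and test characteristics are from the study [21]):

```python
def plr_predicts_hirrt(ci_baseline: float, ci_peak_plr: float) -> bool:
    """Return True if the PLR-induced rise in cardiac index exceeds 9%,
    the cutoff that predicted HIRRT with 77% sensitivity and 96%
    specificity in Monnet et al. [21]."""
    delta_pct = 100.0 * (ci_peak_plr - ci_baseline) / ci_baseline
    return delta_pct > 9.0

# Example: CI rises from 2.5 to 2.8 L/min/m2 with PLR (a 12% increase)
print(plr_predicts_hirrt(2.5, 2.8))  # True
```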

Lastly, dialysis-induced hypovolemia may be exacerbated by the Bezold–Jarisch reflex which, triggered by excessive emptying of the left ventricle (LV), results in loss of peripheral sympathetic vasoconstrictor tone, worsened hypotension and bradycardia [23]. Patients with underlying left ventricular hypertrophy (LVH) and diastolic dysfunction are thought to be predisposed to this phenomenon [14].

Rapid osmotic/oncotic shifts

When RRT results in rapid plasma solute clearance, the consequent reduction in plasma osmolality promotes a shift of free water from the intravascular space into the interstitial and intracellular spaces, which are left with relatively higher osmolality. This results in decreased effective arterial blood volume and reduced plasma refilling, leading to HIRRT when compensatory mechanisms are inadequate or when coupled with excessive UF. The rate of change of plasma osmolality depends on the patient's pre-RRT osmolality and the small solute clearance achieved by the RRT. Thus, one can expect more gradual osmotic shifts with prolonged intermittent or continuous RRT modalities relative to intermittent HD.

Maintenance HD patients with ESKD have elevated pre-dialysis plasma osmolality, typically in the range of 291–339 mOsm/kg (normal 275–295 mOsm/kg), and plasma osmolality may decline by up to 33 mOsm/kg with HD [24,25,26,27]. Higher pre-dialysis calculated plasma osmolality correlates with an increased risk of IDH in this population [28]. Moreover, a higher dialysis dose with HD and a greater urea removal rate have also been shown to correlate with IDH [29]. Given the association between IDH and the rapidity of the dialysis-associated decline in plasma osmolality, studies have evaluated the efficacy of isolated UF (which does not induce an osmolality change) and hypertonic solutions (which prevent fluid shifts) for mitigating HIRRT. A study of maintenance HD patients found that isolated UF [i.e., using the HD machine to perform UF without any diffusive clearance (i.e., dialysis)] maintained blood pressure stability compared to usual HD, and that hypertonic mannitol stabilized plasma osmolality and prevented post-dialysis orthostatic hypotension [30]. A larger study including both AKI and maintenance HD patients (excluding those on vasopressors or transitioning from CRRT) found that administration of hypertonic mannitol during HD initiation prevented hemodynamic instability [31].
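For context, calculated plasma osmolality (as used in the cited study [28]) is commonly approximated from the sodium, glucose, and urea concentrations (all in mmol/L):

$$\text{Osm}_{\text{calc}} \approx 2[\mathrm{Na^{+}}] + [\text{glucose}] + [\text{urea}]$$

As a hypothetical example, a patient with sodium 138 mmol/L, glucose 8 mmol/L, and urea 40 mmol/L has a calculated osmolality of roughly 2(138) + 8 + 40 = 324 mOsm/kg, illustrating how marked uremia alone can raise pre-dialysis osmolality well above the normal range and set the stage for rapid osmotic shifts once urea is cleared.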

Although these data come primarily from ESKD patients on maintenance HD, patients with severe AKI who initiate RRT with increased plasma osmolality due to significant uremia can also be expected to experience rapid osmotic shifts that may contribute to HIRRT. The RRT-related interventions assessed to address this mechanism in critically ill patients have included dialysate sodium profiling with or without UF profiling, with mixed results in this population [32,33,34]. Briefly, sodium profiling involves initiating RRT with a high dialysate sodium concentration that is then decreased in a stepwise manner, reducing osmotic fluid shifts between the intravascular and interstitial compartments [35]. UF profiling refers to varying the UF rate throughout the treatment so that higher rates are used when they are likely to be best tolerated (i.e., at the start rather than the end of treatment, when more extravascular fluid is available for refilling, or according to an online measurement of 'relative blood volume'/hemoconcentration as reflected by real-time hematocrit monitoring). The impact of these interventions on hemodynamics with CRRT remains unexplored. However, serum osmolality changes with CRRT would be expected to be slower than with other RRT modalities and thus less likely to provoke HIRRT through osmotic shifts in the first place.
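A minimal sketch of how a stepwise sodium profile might be generated (the start, end, and step values are illustrative only; actual profiles are prescription- and machine-specific):

```python
def stepwise_sodium_profile(start_na: float = 150.0,
                            end_na: float = 140.0,
                            n_steps: int = 4) -> list[float]:
    """Return a stepwise-decreasing dialysate sodium profile (mmol/L),
    starting high to limit early osmotic shifts and finishing at the
    target concentration."""
    step = (start_na - end_na) / (n_steps - 1)
    return [round(start_na - i * step, 1) for i in range(n_steps)]

print(stepwise_sodium_profile())  # [150.0, 146.7, 143.3, 140.0]
```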

Use of a high dialysate sodium concentration (> 145 mmol/L, without profiling) to limit osmotic shift and promote hemodynamic stability is one component of a regimen shown to limit HIRRT in an important study by Schortgen et al. [5] (discussed in more detail in the section on 'Multimodal approach to the prevention of HIRRT'). Nonetheless, it must be noted that simultaneous co-interventions (such as the use of cool dialysate) may also have contributed to the efficacy of this regimen. A theoretical concern with high sodium dialysate in ESKD patients is that it may leave the patient with an overall positive sodium balance at the end of treatment, driving increased thirst and thereby exacerbating fluid overload in the longer term. This may be less applicable to critically ill patients. Sodium loading may also be somewhat offset if improvements in hemodynamic stability enable greater removal of (sodium-containing) fluid through UF.

Hypoalbuminemia is associated with an increased risk of IDH in cohort studies of maintenance HD patients [36, 37]. Patients with critical illness are often hypoalbuminemic [38] and, since albumin is the primary contributor to plasma oncotic pressure [39], this may be a mechanism that contributes to relative intravascular volume depletion in this population. Consequently, albumin is often proposed to prevent or treat HIRRT in maintenance HD and critically ill populations. Nonetheless, very little evidence exists in this area. A randomized cross-over trial in maintenance HD patients in which IDH was treated with 5% albumin versus saline reported no significant difference in the achievement of ordered fluid removal [40]. In critically ill septic patients, a study evaluating priming of the HD circuit with 17.5% albumin versus saline reported that albumin significantly improved the hemodynamic tolerance of HD. It should be noted, however, that this was a cross-over trial that included only eight patients [41].

Dialysis blood flow (QB)

Although there is a pervasive false belief that higher QB is associated with more hypotension [42], in the modern era QB has no direct effect on HIRRT given the closed extracorporeal circuit, in which blood removal is matched by its return at a nearly identical rate [42, 43]. The only way QB can affect hemodynamics is indirectly, to the extent that it determines the rapidity of small solute clearance and the associated osmolality changes.
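The standard relationship underlying this point is that diffusive solute clearance (K) scales with blood flow via the extraction ratio across the dialyzer (neglecting the small convective contribution of UF):

$$K = Q_B \times \frac{C_{\text{in}} - C_{\text{out}}}{C_{\text{in}}}$$

where $C_{\text{in}}$ and $C_{\text{out}}$ are the solute concentrations in blood entering and leaving the dialyzer. A higher QB therefore hastens small solute (e.g., urea) clearance and the associated decline in plasma osmolality, but it does not itself remove volume from the patient.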

Two cross-over studies of stable ESKD patients on maintenance HD found that QB had no effect on blood pressure [44, 45]. In the context of critical illness and AKI, initiating CRRT at slower blood flow rates with a gradual increase in QB to the target rate of 200 mL/min also showed no impact [46]. There are no studies assessing the impact of QB in critically ill patients with AKI treated with other RRT modalities.

Myocardial stunning

Cardiac mechanisms comprising patient and RRT-related factors implicated in HIRRT are summarized in Fig. 3.

Transient intradialytic cardiac dysfunction in maintenance HD patients, characterized by regional wall motion abnormalities (RWMAs) or 'myocardial stunning', has been reported and appears related to reduced myocardial perfusion in the absence of atherosclerotic coronary artery disease [47]. While it is unclear to what extent this is a cause or consequence of HIRRT, there is evidence that the phenomenon is independent of volume removal, given that LV systolic and diastolic dysfunction frequently occurs early after the start of HD sessions, before much ultrafiltration has occurred. In addition, coronary artery blood flow appears to be maintained in maintenance HD patients while they are experiencing this phenomenon [48]. Inflammatory reactions related to blood contact with the dialyzer surface may contribute to this volume-independent mechanism of myocardial stunning [49, 50]. Irrespective of the underlying etiology, RRT-related myocardial stunning has also been described in critically ill patients: a pilot study of 11 hospitalized patients with AKI treated with HD, none requiring inotropic or ventilator support, found that all patients developed myocardial stunning with RWMAs during dialysis, along with a decline in global LV contractility [51]. This phenomenon has recently also been reported in critically ill patients treated with CRRT: 10 of 11 patients developed new RWMAs within the first 4 h of starting CRRT, despite maintaining stable hemodynamics [52].

Effect of temperature changes with RRT

The use of cool dialysate may improve the hemodynamic tolerance of RRT by increasing systemic vascular resistance (SVR) and centralizing blood volume through peripheral and visceral vasoconstriction [14, 53]. Low dialysate temperature prescriptions most commonly include fixed, programmed, and isothermic cooling, which involve, respectively, empirically decreasing the dialysate temperature to a set value, reducing the dialysate temperature by a standard amount below body temperature, and adjusting the dialysate temperature to maintain the patient's pre-dialysis temperature [53]. The evidence that cool dialysate improves the hemodynamic tolerance of RRT without compromising the adequacy of treatment is summarized in systematic reviews and meta-analyses of studies of chronic ESKD patients on HD, which included these cooling methods in their analyses [54, 55]. An important proof-of-concept study by Selby et al. also demonstrated that, in ESKD patients on HD, dialysis-induced left ventricular dysfunction (myocardial stunning) was ameliorated by the use of cool dialysate [56].

The use of cool dialysate for the prevention of HIRRT in critically ill patients across RRT modalities is less established. Isothermic dialysis with blood volume monitoring, a technique whereby the dialysate temperature is adjusted to maintain the patient's temperature and blood volume is monitored to predict hypotension, did not reduce the incidence of HIRRT with HD in an RCT of 74 critically ill patients with AKI (N = 574 RRT sessions). The standard treatment group in this study was, however, dialyzed with cool dialysate and high dialysate sodium and calcium concentrations [57]. Another RCT, of 39 critically ill patients, evaluated SLED sessions (N = 62 sessions) using cool dialysate combined with sodium and UF modeling, which resulted in improved hemodynamics [33]. A recent prospective randomized cross-over pilot study of patients receiving SLED for AKI (N = 21 patients, 39 RRT sessions per arm) found that cool dialysate alone (without sodium or UF modeling) was protective against HIRRT [58].

In CRRT, the evidence regarding cooling is less robust. Rokyta et al. studied the effects of continuous veno-venous hemofiltration (CVVH)-induced cooling (rather than cool dialysate) on global hemodynamics in nine septic patients, showing that mild core cooling raised SVR and MAP without compromising hepatosplanchnic oxygen and energy balance [59]. One small pilot study (an RCT with cross-over design, N = 30 patients) of lower temperature settings at the time of CRRT initiation showed improved hemodynamic stability [60]. Overall, reducing body temperature by means of RRT appears to be a promising strategy to refine and pursue with the goal of reducing HIRRT across RRT modalities.

Dialysate/RRT fluid composition

Calcium concentration

The dialysate calcium concentration (typically 1.0–1.75 mmol/L) may affect myocardial contractility and the risk of arrhythmia, as well as vascular tone. A lower dialysate calcium concentration, and the consequent increase in the serum-to-dialysate calcium gradient, has been associated with an increased risk of sudden cardiac arrest in maintenance HD patients [61]. Hypocalcemia is common in critically ill patients, and an ionized calcium < 1.0 mmol/L has been found to be an independent predictor of higher all-cause mortality in critically ill patients with AKI requiring RRT [62]. Although Fellner et al. [63] found that total systemic vascular resistance was not significantly affected by changes in blood ionized calcium levels, Scholze et al. [64] showed that during HD with a dialysate calcium concentration of 1.75 mmol/L, extracellular calcium concentrations increased along with increased arterial tone, as assessed from radial artery pulse waves. Mechanistically, physiologically relevant changes in extracellular calcium during RRT could reduce the risk of HIRRT by increasing blood vessel tone through direct activation of calcium-sensing receptors (CaSR) expressed on vascular smooth muscle cells [65]. There have been no studies assessing the hemodynamic impact of high dialysate calcium in critically ill patients with AKI. Rather, several studies employed a dialysate calcium concentration of 1.75 mmol/L in their standard dialysis prescription [5, 33, 57], suggesting standard use of higher dialysate calcium in their ICUs, although there is no consensus on this practice in the current literature [66].

While it is possible to conclude that serum calcium concentration has an impact on hemodynamics as well as other outcomes, and that increasing the dialysate calcium concentration may be a viable strategy to reduce HIRRT, there are important caveats. This approach is not likely to be applicable to patients receiving CRRT with standard citrate anticoagulation, given the requirement for calcium-free solutions (and post-filter calcium infusions) to achieve regional anticoagulation. More importantly, some pre-clinical and clinical studies have shown that the use of calcium to correct hypocalcemia in critically ill patients not requiring RRT may be harmful [67]. The use of higher dialysate calcium to prevent HIRRT has not been subjected to rigorous studies evaluating its safety in critically ill patients. Overall, this highlights the need for trials evaluating meaningful clinical outcomes before interventions to mitigate HIRRT are routinely adopted on the basis of physiologic principles alone.

Potassium concentration

Potassium has vasoactive properties: intra-arterial infusion of potassium induces vasodilation through hyperpolarization of vascular smooth muscle cells due to activation of both the Na+–K+-ATPase and inwardly rectifying potassium (Kir) channels, and lowering the blood potassium concentration can produce vasoconstriction [68]. Dolson et al. [69] compared the effects of different dialysate potassium concentrations (1.0, 2.0, and 3.0 mmol/L) in 11 ESKD patients on HD and, although there were no significant differences in intra-dialytic blood pressure between the groups, found a significant increase in blood pressure 1 h post-dialysis with the lower dialysate potassium concentrations (1.0 and 2.0 mmol/L), which the study termed 'rebound hypertension'. They hypothesized that this effect was mediated in part by systemic arteriolar vasoconstriction [69]. A randomized cross-over study by Gabutti et al. [70] of 24 ESKD patients and 288 HD sessions showed, however, that a low dialysate potassium concentration, inducing a rapid reduction in serum potassium, resulted in decreased blood pressure, which correlated with decreased peripheral resistance. This hemodynamic effect was most pronounced in the initial phase of dialysis, when the difference between dialysate and serum potassium was largest, and progressively diminished during the session as the gap narrowed [70]. The transient decrease in blood pressure associated with lowering serum potassium via low dialysate potassium is inconsistent with the experimental physiology described above. This may be attributable to important differences between the physiology of animal experimental models and that of patients with ESKD requiring RRT, who are prone to cardiac and autonomic dysfunction. Furthermore, there are no studies evaluating the effect of dialysate potassium concentrations in critically ill patients with AKI.

Buffer

Historically, acetate-based dialysate buffers were implicated in HIRRT but are no longer relevant to current clinical practice.

Lactate, which is occasionally used as a dialysate buffer, is normally converted to bicarbonate by the liver. If this process is impaired by liver dysfunction in critical illness, acidosis can result from lactate accumulation [71]. A Cochrane systematic review of four studies comparing bicarbonate-buffered with lactate-buffered solutions for AKI concluded that, although there was no significant difference in mortality, bicarbonate-based dialysate results in less HIRRT and lower serum lactate levels [72]. As a result, bicarbonate-based buffers are standard in current practice [73].

Dialyzer bio-incompatibility

Dialyzer membrane (filter) bio-incompatibility can provoke vasodilatory shock (including anaphylaxis) and HIRRT. Bio-incompatibility refers to the degree of leukocyte and complement activation in the extracorporeal circuit, primarily at the dialyzer membrane where blood comes into contact with non-biological material [73]. Bio-incompatible dialyzers are no longer routinely used, and most problems were related to the historical use of unmodified cellulose-based membranes. Other causes of, and further details pertaining to, HIRRT related to dialyzer bio-incompatibility are reviewed in Table S1.

Endotoxemia

RRT carries a risk of dialysate bacterial contamination, with endotoxin and bacterial DNA fragment release that can activate inflammatory pathways [74], leading to clinical complications [75]. Further, endotoxin permeability may vary across types of synthetic dialyzer membranes [76]. High-flux dialyzer membranes used in HD require ultrapure dialysate to reduce the risk of bacterial or endotoxin contamination via dialysate back-filtration (i.e., convective transfer) [77]. Proper water treatment and water and dialysate quality control are required to reduce the risk of patient exposure to bacteria and their products, which has led to the establishment of water and dialysate standards [78].

The frequency and impact of fluid contamination with CRRT is less clear. One study of 24 CVVH replacement fluid circuits revealed breaches in microbiological integrity evidenced by positive replacement fluid cultures and endotoxin assays, and biofilm formation in the tubing [79]. The authors proposed that circuit contamination may have occurred through non-sterile connections of replacement fluid bags. The clinical significance of replacement fluid contamination was not evaluated, but this study still highlights the need for strict infection control measures.

Diffusive versus convective clearance modes

The modality of solute clearance is a modifiable aspect of the RRT prescription that could theoretically affect vascular tone, particularly in patients with sepsis: the use of convective clearance modes could enable better removal of pro-inflammatory/HIRRT-inducing larger molecular weight mediators than diffusive clearance modes. There is insufficient evidence to support any recommendations in this regard. The evidence that does exist is provided in greater detail in Table S1.

Clearance of vasopressors

A rare potential cause of HIRRT related to changes in vascular tone has been reported to occur when the venous port of the dialysis catheter lies within the same vein and close to the port of a separate central venous catheter that is being used to administer vasopressors [80, 81]. HIRRT is presumed to occur as the vasopressors are cleared almost immediately from the circulation by RRT, prior to exerting their systemic effect.

Multimodal approach to the prevention of HIRRT

Limited research has been conducted on the prevention of HIRRT in critically ill patients [9]. The available studies are summarized in Table 2. The most recent guidelines in this area, published in 2015 by French expert panels in adult and pediatric intensive care, anesthesia, and dialysis, provide recommendations based on expert opinion and variable-quality evidence for the implementation of RRT in critically ill patients [66].

Table 2 Summary of studies evaluating RRT-related interventions for HIRRT in critically ill patients

These recommendations are supported by an older retrospective cohort study by Schortgen et al. comparing the hemodynamic tolerance of HD in ICU patients with AKI before and after the implementation of centre-specific HD practice guidelines, which included recommendations for systematic use in all patients as well as additional recommendations for the most hemodynamically unstable patients [5]. The systematic recommendations included modified cellulosic membranes instead of unmodified cellulose (cuprophane), a dialysate sodium concentration of 145 mmol/L or higher, a maximal QB of 150 mL/min with a minimal session duration of 4 h, and a dialysate temperature of 37 °C or lower. For 'more unstable' patients, they recommended initiating the HD session with dialysis and continuing with UF alone, or initiating the session without UF and then adapting the UF rate to the hemodynamic response. They also recommended further lowering the dialysate temperature to 35 °C and discontinuing any vasodilator therapy [5]. The population treated according to these practice guidelines had improved hemodynamic tolerance relative to historical controls, with less hypotension both at session onset and for the remainder of the session. Further, there was a 10% absolute reduction in HIRRT (defined as a fall in SBP of more than 10% from baseline or the need for a therapeutic intervention) between the two groups (71% vs 61%) [5].
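For clarity, the protocol components described above can be summarized as a configuration sketch (the representation and key names are ours, not from the original study; the values are as reported [5]):

```python
# Schortgen et al. [5]: systematic HD settings for all ICU patients
SYSTEMATIC_SETTINGS = {
    "membrane": "modified cellulosic (not unmodified cuprophane)",
    "min_dialysate_sodium_mmol_per_L": 145,   # 145 mmol/L or higher
    "max_blood_flow_mL_per_min": 150,
    "min_session_duration_h": 4,
    "max_dialysate_temperature_C": 37.0,      # 37 degrees C or lower
}

# Additional measures for the most hemodynamically unstable patients
UNSTABLE_PATIENT_MEASURES = [
    "start with dialysis, then continue with UF alone "
    "(or start without UF and adapt the UF rate to the hemodynamic response)",
    "lower the dialysate temperature further, to 35 degrees C",
    "discontinue any vasodilator therapy",
]
```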

Taking into account the Schortgen et al. study [5] in addition to more recent evidence [9], suggestions for limiting and managing HIRRT in critically ill patients are summarized in Table 3.

Table 3 Suggestions for preventing and managing HIRRT in critically ill patients

Conclusions

HIRRT is common in critically ill patients treated with RRT. It is associated with increased mortality and may impair renal recovery. Excessive UF is an important cause of HIRRT, yet it may not be the predominant mechanism in many cases. Multiple other RRT-related mechanisms can lead to decreased cardiac output, decreased peripheral resistance, or both. This suggests that the appropriate response to HIRRT might not always be cessation of UF, particularly in the context of significant fluid overload. Nonetheless, to our knowledge, no studies have directly sought to define the optimal response to HIRRT with respect to UF goals.

In addition to reviewing RRT-related mechanisms for HIRRT, we have reviewed potential interventions to mitigate it and the evidence for their use (which is limited, in part because much of it comes from studies of patients with ESKD on maintenance HD). Overall, there remains a paucity of evidence for RRT-related interventions to limit the occurrence of HIRRT in the critically ill AKI population. A further understanding and appreciation of both patient- and RRT-related factors that contribute to HIRRT in critically ill patients with AKI is essential for the development of clinical studies and the implementation of new strategies for its prevention and treatment.