
15.1 Potassium Intake Needs

Potassium (K) is an essential nutrient that has been labeled a shortfall nutrient by recent Dietary Guidelines for Americans Advisory Committees (National Academies of Sciences and Medicine 2019; DeSalvo et al. 2016). Physiologically, K is the most abundant cation in intracellular fluid, where it plays a key role in cell function, maintaining intracellular fluid (ICF) volume and transmembrane electrochemical gradients (Stone et al. 2016). Because K is a major intracellular ion, it is widely distributed in foods derived from living tissues. Potassium concentrations are generally highest in fruits and vegetables but can also be quite high in cereals, grains, dairy, and meat (DeSalvo et al. 2016; Stone et al. 2016). Over the last several decades in the USA, and more recently worldwide, dietary practices have shifted toward higher intake of low-nutrient-density convenience foods and decreased consumption of fruits and vegetables, producing a diet lower in K and higher in sodium (Na) (Weaver 2013). The average K intake of US adults participating in the National Health and Nutrition Examination Survey (NHANES) 2013–2014 was 2668 mg K day−1, below the adequate intakes (AIs) of 2600–3400 mg day−1 set forth by the 2019 DRI committee and well below the previous AI of 4700 mg K day−1 (Institute of Medicine 2005; National Academies of Sciences and Medicine 2019). This chapter gives a comprehensive overview of K as a nutrient; the physiology of how it moves through the body, including K bioavailability and excretion; how this movement may affect vascular pressure, glucose metabolism, and the storage of calcium in bone; and the health consequences of these relationships.

15.1.1 Dietary Reference Intakes

The reference values for the intake of any nutrient are referred to as the Dietary Reference Intakes (DRIs) and include: the Estimated Average Requirement (EAR), the intake level estimated to meet the requirements of 50% of the healthy population; the Recommended Dietary Allowance (RDA), derived from the EAR and sufficient to meet the requirements of nearly the entire (~98%) population; the Adequate Intake (AI), used in lieu of an RDA when there is insufficient evidence to set an EAR and thus an RDA; and the Tolerable Upper Intake Level (UL), the estimated maximum intake that poses no health risk, developed from a no-observed-adverse-effect level (NOAEL) with a safety factor applied (Fulgoni 2007; Lupton et al. 2016; Millen et al. 2016; Institute of Medicine 2005). Dietary Reference Intakes are quantitative values established by review committees commissioned by the National Academies of Sciences, Engineering, and Medicine (NASEM), Health and Medicine Division (formerly the Institute of Medicine), after a review of the research surrounding a nutrient's role in eliminating nutritional deficiencies and reducing the risk of chronic disease. Basic principles for establishing the proper level of intake for each nutrient are that the needs of healthy (non-diseased) individuals are met, that nutrients are grouped by physiological functionality, and that age groupings are revised to reflect changes in biological patterns (e.g., gender, growth, pregnancy) (Lupton et al. 2016; Millen et al. 2016; Institute of Medicine 2005). Chronic disease endpoints are considered only when a sufficient body of knowledge has been established. To this point, the recent Dietary Reference Intake report for sodium (Na) and K was the first to establish a chronic disease risk reduction (CDRR) level for Na, a new DRI category intended to help differentiate between nutrient intakes necessary for adequacy and those which may improve health (National Academies of Sciences and Medicine 2019).
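
Where an EAR can be set and requirements are assumed to be approximately normally distributed, DRI committees derive the RDA from the EAR by a standard relation; the 10% coefficient of variation shown here is the conventional default, used for illustration rather than a value from this chapter:

$$\mathrm{RDA} = \mathrm{EAR} + 2\,\mathrm{SD}_{\mathrm{EAR}}, \qquad \mathrm{CV}_{\mathrm{EAR}} = 10\% \;\Rightarrow\; \mathrm{RDA} \approx 1.2 \times \mathrm{EAR}$$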

15.1.2 Potassium Intakes Worldwide

Recommended K intakes in many countries follow the guidelines set by the North American DRIs or the World Health Organization (WHO) (Strohm et al. 2017; World Health Organization 2012). Despite this, few populations meet these recommendations, and large global variation in K consumption exists (Weaver et al. 2018).

The most recent WHO recommendations for K intake come from guidelines published in 2012, which examined key chronic disease endpoints related to blood pressure (BP), stroke, cardiovascular disease (CVD), coronary heart disease (CHD), blood lipids, and catecholamines (World Health Organization 2012). Based primarily on one large systematic review with meta-analysis (Aburto et al. 2013), the WHO set a recommendation to consume at least 90 mmol (~3500 mg) of K day−1 to reduce BP and the risk of CVD, stroke, and CHD (World Health Organization 2012; Weaver et al. 2018).
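
The mmol-to-mg conversions used here and throughout the chapter follow directly from the molar mass of K (≈39.1 g mol−1):

$$90\ \mathrm{mmol\ K\ day^{-1}} \times 39.1\ \mathrm{mg\ mmol^{-1}} \approx 3519\ \mathrm{mg\ day^{-1}} \approx 3500\ \mathrm{mg\ day^{-1}}$$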

Current recommendations for the USA and Canada were recently revised by the National Academies of Sciences, Engineering, and Medicine, Health and Medicine Division. According to the 2019 DRI guidelines for K, the lack of a sensitive biomarker and limitations across K bioavailability and retention studies leave insufficient evidence to establish EAR and RDA levels for adequacy or deficiency (National Academies of Sciences and Medicine 2019). Because of this, the committee set AIs using intake data from two nationally representative surveys, NHANES and the Canadian Community Health Survey (CCHS). The highest median K intake across the two surveys was selected for each DRI group and set as the AI. For adults, the data that informed the K AIs were from healthy, normotensive individuals without a self-reported history of CVD. In contrast to the 2005 DRI report, adult AIs were separated by sex, with a K intake of 3400 mg day−1 for men and 2600 mg day−1 for women (National Academies of Sciences and Medicine 2019). These values are markedly lower than the AI established in 2005, set at 4700 mg day−1 for adults 18 and older (Institute of Medicine 2005). Because observational data examining increased K intakes and the risk of CVD (and associated diseases) are mixed (Newberry et al. 2018; National Academies of Sciences and Medicine 2019), a CDRR intake level for K could not be established. Blood pressure was considered as a surrogate marker for CVD risk reduction, based on findings showing a reduction in BP with increased supplemental K intake (Newberry et al. 2018), but given the lack of clear evidence supporting K intake alone in the reduction of CVD and related mortality, the committee decided against this.

Actual K requirements vary with an individual's genetics, Na intake, and the status of various health-related biomarkers. Potential benefits of increasing K consumption may include decreases in vascular pressure, optimal kidney function, improvement in glucose control, and possible bone benefit (He and MacGregor 2008; Weaver 2013).

15.2 Internal Balance of Potassium

15.2.1 Potassium Tissue Movement

About 90% of dietary K is passively absorbed in the small intestine. In the proximal small intestine (duodenum, jejunum), K+ absorption primarily follows water absorption, while distally (ileum) movement is more influenced by changes in the transepithelial electrical potential difference. In the colon, K is both secreted, in exchange for Na, and reabsorbed via H+/K+ ATPases (Meneton et al. 2004). Total body K is estimated to be approximately 43 mmol K kg−1 in adults, with only 2% of this found in the extracellular fluid. Most of the body K content is found in the intracellular space of skeletal muscle. Potassium is the primary intracellular cation and plays a key role in maintaining cell function, having a marked influence on transmembrane electrochemical gradients (Palmer 2015; Stone et al. 2016). The gradient of K+ across the cell membrane determines cellular membrane potential, which, based on the normal ratio of intracellular to extracellular K+, is approximately −90 mV. This potential difference is maintained in large part by a ubiquitous membrane transporter, the sodium-potassium (Na+/K+) ATPase pump. Transmembrane electrochemical gradients drive the diffusion of sodium (Na+) into the cell and K+ out of the cell. This leak is countered, and the cellular potential difference held constant, by the aforementioned Na+/K+ ATPase pumps. When activated, the Na+/K+ ATPase pump exchanges two extracellular K+ ions for three intracellular Na+ ions, influencing membrane potential in line with physiological excitation or inhibition. These pumps, along with the Na+/K+/Cl symporter and the sodium-calcium exchanger, are also partially responsible for maintaining the potential difference across the resting cell membrane. Both the resting membrane potential and the electrochemical difference across the cell membrane are crucial for normal cell biology, especially in muscle, cardiac, and nervous tissue (Palmer 2015; Unwin et al. 2011; Stone et al. 2016; Stone and Weaver 2018).
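
The ≈−90 mV resting potential follows from the Nernst equation for K+; a worked example at 37 °C, using illustrative textbook concentrations of [K+]i ≈ 140 mM and [K+]o ≈ 4.5 mM (assumed values, not figures from this chapter):

$$E_{\mathrm{K}} = \frac{RT}{zF}\ln\frac{[\mathrm{K}^{+}]_{o}}{[\mathrm{K}^{+}]_{i}} \approx 26.7\ \mathrm{mV} \times \ln\frac{4.5}{140} \approx -92\ \mathrm{mV}$$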

Distribution of K under normal physiological conditions is referred to as internal balance. In healthy individuals, the blood K concentration ranges between 3.5 and 5.5 mM, with numerous homeostatic mechanisms in place to maintain it within this narrow margin. Changes in plasma concentrations of K+ alter the electrochemical gradient and can lead to physiological dysfunction. In hyperkalemia, when K concentrations exceed 5.5 mM, membrane depolarization can lead to muscle weakness, paralysis, and cardiac dysrhythmias (e.g., sinus bradycardia, ventricular tachycardia, ventricular fibrillation). Conversely, hypokalemia, when plasma K concentration falls below 3.5 mM, can cause membrane hyperpolarization, interfering with normal nerve and muscle function and leading to muscle weakness and decreases in smooth muscle contraction (Stipanuk 2006). Hypokalemia can also cause both atrial and ventricular cardiac dysrhythmias, as well as paralysis and, if left untreated, death. Total body K is found intracellularly (98%), primarily in muscle (70%) and to some extent in all other tissues. Distribution and metabolism of K are influenced by hormones (insulin, aldosterone, catecholamines), acidemia, and fluid balance.

In response to the dietary consumption of a high K meal, insulin enhances the cellular uptake of K+. Insulin, released from pancreatic beta-cells, increases K uptake via the stimulation of Na+/K+ ATPase activity in skeletal and cardiac muscle, fat tissue, liver, bone, and red blood cells, attenuating the rise in plasma K+ following consumption (Greenlee et al. 2009). Potassium uptake is also influenced by the stimulation of both α and β2 adrenergic receptors by the circulating stress hormones, the catecholamines (epinephrine, norepinephrine) (Palmer 2015; Unwin et al. 2011). Mechanistically, the insulin-mediated regulatory pathway leads to Na+/K+ ATPase activation via stimulation of cell surface tyrosine kinase receptors and insulin receptor substrate-1 (IRS1), which also stimulates the translocation of intracellular glucose transport proteins (GLUT4 in muscle), facilitating the influx of glucose into the cell. Downstream activation of signaling cascades involving IRS1-phosphatidylinositide-3-kinase (PI3-K) and protein kinase A (PKA) facilitates both K and glucose uptake (Unwin et al. 2011; Stone et al. 2016). Catecholamine binding to β2 adrenergic receptors activates pathways mediated by cyclic adenosine monophosphate (cAMP) and PKA to increase Na+/K+ ATPase activity and cellular K+ uptake. In contrast, stimulation of α1 and α2 adrenergic receptors, primarily through increased circulating levels of the stress hormone norepinephrine, leads to activation of hepatic calcium-dependent K+ channels and increased plasma K concentration via K release from the liver. Aldosterone, which has a marked effect on renal handling of K, may also influence the transmembrane distribution of K+ via stimulation of cellular Na+ uptake through activation of Na+/H+ or Na+/K+/Cl transporters and subsequently Na+/K+ ATPases (Unwin et al. 2011; Stipanuk 2006). While hormones play an important role in the movement of K+ within the body, the concentration of other ions (inorganic and organic) is also influential in maintaining proper internal balance (Stone et al. 2016).

Metabolic acidosis caused by inorganic anions (mineral acidosis) can also stimulate K+ movement. The effect of acidemia on enhancing cellular K loss is not related to direct K+-H+ ion exchange, but rather to action on transporters which normally regulate skeletal muscle pH (Aronson and Giebisch 2011; Stone et al. 2016). The decrease in extracellular pH reduces the rate of Na+/H+ exchange and inhibits Na+/bicarbonate (HCO3) cotransport. The fall in intracellular Na+ reduces Na+/K+ ATPase activity, leading to decreased K+ influx, cellular K+ losses, and possible hyperkalemia (Palmer 2015; Stone et al. 2016). Additionally, a fall in extracellular HCO3 increases the inward flux of Cl via upregulation of Cl/HCO3 exchange, increasing K+/Cl cotransport and subsequent K+ efflux. In metabolic acidosis via organic anion (e.g., lactic acid) accumulation, loss of K from the cell is much smaller. Here, the movement of both anions and H+ through monocarboxylate transporters (MCT; MCT1, MCT4) lowers intracellular pH, stimulating the movement of Na+ via Na+/H+ and Na+/HCO3 transporters. The increase in intracellular Na+ maintains Na+/K+ ATPase activity, limiting the efflux of K+. Generally, metabolic acidosis (inorganic or organic) causes greater K+ efflux than respiratory acidosis, in which HCO3 is the primary anion accumulating in the cell to balance the influx of hydrogen ions (Perez et al. 1981; Stone et al. 2016). Movement of cellular K varies similarly in response to different types of physiological alkalosis as well. In respiratory alkalosis, K+ influx is reduced compared to metabolic alkalosis, due to the efflux of cellular HCO3 (Stone et al. 2016).

15.2.2 Renal Potassium Handling

The majority of consumed K is excreted in the urine, with the remainder excreted in the stool and, under homeostatic conditions, a variable amount in sweat (Shils and Shike 2006). Potassium has a high ratio of dietary intake to extracellular pool size; recall that only 2% of total body K+ is distributed in the extracellular fluid (ECF), with the remainder distributed in the intracellular fluid (ICF) of various tissues. To meet the challenge of a high K meal, the K homeostatic system is very efficient at clearing plasma K via an increase in renal K excretion. When dietary K intake increases or decreases, the kidneys modulate excretion accordingly, ensuring the maintenance of plasma K+ concentration (Stone et al. 2016). In addition, with the administration of acute K loads, only approximately half of the dose appears in the urine after 4–6 h, suggesting that extrarenal tissues (e.g., muscle, liver, adipose) also play an important role in K homeostasis via insulin- and catecholamine-mediated uptake (Youn 2013; Bia and DeFronzo 1981; Stone et al. 2016). Excessive extrarenal K losses are usually small but can occur in individuals with diarrhea, severe burns, or excessive and prolonged sweating (Stone et al. 2016; Stone and Weaver 2018).

Potassium is freely filtered by the glomerulus of the kidney, with most of it (70–80%) reabsorbed in the proximal convoluted tubule (PCT) and loop of Henle. Under physiological homeostasis, the delivery of K to the nephron remains constant. Conversely, the secretion of K by the distal nephron is variable and depends on intracellular K concentration, luminal K concentration, and cellular permeability (Palmer 2015; Stone et al. 2016). Two major factors in K secretion/loss are the renal handling of Na and mineralocorticoid activity. Reabsorption in the proximal tubule is primarily passive and proportional to the reabsorption of solute and water, accounting for ~60% of filtered K (Penton et al. 2015; Ludlow 1993; Stone et al. 2016). Within the descending limb of Henle's loop, a small amount of K+ is secreted into the luminal fluid, while in the thick ascending limb (TAL), reabsorption occurs together with Na+ and Cl, both trans- and paracellularly. This leaves the K concentration of the fluid entering the distal convoluted tubule lower than plasma levels (~2 mM), facilitating eventual secretion (Ludlow 1993; Stone et al. 2016). Similar to reabsorption in the proximal tubule, paracellular diffusion in Henle's loop is mediated via solvent drag, while transcellular movement occurs primarily through the apical sodium-potassium-chloride (Na+/K+/2Cl) cotransporter (Ludlow 1993; Stone et al. 2016). The renal outer medullary K channel (ROMK), also located on the apical membrane, mediates the recycling of K from the cell to the lumen, sustaining the activation of the Na+/K+/2Cl cotransporter and K reabsorption in the ascending limb. The movement of K through ROMK induces a positive lumen voltage potential, increasing the driving force for paracellular cation (e.g., Ca2+, Mg2+, K+) reabsorption as well. Na+/K+ ATPase pumps located basolaterally throughout the loop maintain low levels of intracellular Na+ and further provide a favorable gradient for K+ reabsorption (Palmer 2015; Unwin et al. 2011; Stone et al. 2016; Stone and Weaver 2018).

Major regulation of K excretion begins in the late distal convoluted tubule (DCT) and progressively increases through the connecting tubule and cortical collecting duct. In the early DCT, luminal Na+ influx is mediated by the apical sodium chloride cotransporter (NCC), and in the late DCT by the epithelial Na+ channel (ENaC) (Meneton et al. 2004; Stone et al. 2016). Both are expressed apically and are the primary means of Na reabsorption from the luminal fluid. Sodium reabsorption leads to a luminal electrochemical potential that is more negative than that of the peritubular capillary fluid. This charge imbalance is matched by an increase in the aforementioned paracellular reabsorption of Cl from the lumen, as well as increases in Na+/K+ ATPase and ROMK activity. Increased distal delivery of Na increases Na reabsorption, leading to a more negative luminal/plasma potential gradient and an increase in K secretion (Stone et al. 2016).

Most K excretion is mediated by principal cells in the collecting duct. Principal cells possess basolateral Na+/K+ ATPases, which facilitate the movement of K from the blood into the cell. The high cellular concentration of K provides a favorable gradient not only for the movement of K into the tubular lumen but also for the reabsorption of Na. Movements of K and Na occur through the ROMK and ENaC channels, respectively. In conditions of K depletion, reabsorption of K occurs through H+/K+ ATPases located on the apical membrane of α-intercalated cells in the collecting duct, thus providing a mechanism by which K depletion increases K reabsorption (Meneton et al. 2004; Stone and Weaver 2018).

Two primary types of K channels have been identified in the cortical collecting duct: the aforementioned ROMK, and the maxi-K channel (also known as the BK large-conductance K+ channel). The ROMK, a low-conductance channel active under basal conditions, is considered the major K secretory pathway during normal physiological renal fluid excretion. Conversely, the maxi-K channel is quiescent in basal conditions and becomes activated during periods of increased tubular flow, increasing K secretion in a flow-dependent manner (e.g., hypervolemia, high arterial pressure) (Palmer 2015).

15.2.3 Interactions with Sodium Balance

Sodium and K+ are the primary electrolytes found in body fluids and work in concert to maintain normal fluid balance. There are no known receptors capable of detecting fluctuations of Na+ within the body; however, the physiological mechanisms that control extracellular fluid volume effectively control Na+ balance, influencing K+ movement as well. Perturbations in extracellular fluid volume lead to the recruitment of mechanisms that influence both the volume and pressure of circulation (cardiac and arterial pressure). Vascular pressure receptors (baroreceptors) sense changes in stretch or tension in vascular beds. Low-pressure receptors, found in the central venous portion of the vascular tree, respond to changes in blood volume, while high-pressure receptors located in the arterial circulation respond to changes in blood pressure (Stipanuk 2006). With hypovolemia (low fluid volume), baroreceptors in the vasculature of the pulmonary vein and/or walls of the cardiac atria are activated and send afferent signals to the central nervous system (CNS) to induce both a sympathetic and a hormonal response. Hormonally, this causes increased release of arginine vasopressin (AVP; antidiuretic hormone) from the posterior pituitary gland, which increases the permeability of the collecting ducts of the kidneys to water, facilitating water reabsorption and increased fluid volume. AVP also increases the reabsorption of Na+ and Cl in the TAL and collecting duct, overall decreasing Na and water loss. As part of a reflex response to a fall in systemic pressure, sympathetic neurons that innervate the afferent/efferent arterioles of the glomerulus release the neurotransmitter norepinephrine, causing an increase in renal vascular resistance and a decrease in fluid filtration. The decrease in renal blood flow leads to an overall reduction in filtration and Na loss. Stimulated α1 and α2 adrenergic receptors in the proximal tubular cells of the kidney also increase the activity of the basolaterally located Na+/K+ ATPase and the apical Na+/H+ exchanger, respectively, increasing reabsorption of Na+ from the PCT luminal fluid.

In addition to affecting renal hemodynamics, stimulation of α1 adrenergic receptors induces the release of renin from the juxtaglomerular cells of the kidney's afferent and efferent arterioles. Renin, part of the renin–angiotensin–aldosterone hormonal axis, is a proteolytic enzyme that, when released into circulation, cleaves the hepatically produced protein angiotensinogen into angiotensin 1. Angiotensin 1 undergoes further cleavage into angiotensin 2 (ANG-2), catalyzed by angiotensin-converting enzyme (ACE), which is produced primarily by the epithelial cells of the lungs. Angiotensin 2 is a vasoactive hormone, increasing total peripheral vascular resistance in response to low blood volume and thus normalizing total pressure. In the CNS, ANG-2 stimulates the release of AVP from the posterior pituitary and increases thirst and salt appetite. Angiotensin 2 also has direct and indirect effects on renal Na loss. Directly, ANG-2 increases vascular resistance of the efferent arterioles, decreasing renal plasma flow. Angiotensin 2 also acts directly on the tubular transport system, increasing expression of the Na+/K+ ATPase and Na+/HCO3 exchanger in the basolateral and apical membranes of the proximal kidney, respectively, and the Na+/H+ exchanger and ENaC in the distal tubules (Gumz et al. 2015), overall decreasing loss and increasing Na reabsorption. Indirectly, circulating ANG-2 stimulates the release of the mineralocorticoid aldosterone from the adrenal cortex. Aldosterone is secreted in response to low plasma Na (hypovolemia), high plasma K, and increases in ANG-2. Aldosterone increases K secretion by stimulating an increase in luminal Na reabsorption: it directly increases renal cellular uptake of Na via apical stimulation of ENaC and ROMK expression and increased activity of basolateral Na+/K+ ATPases (Shils and Shike 2006). Increased reabsorption of Na+ also increases the potential difference across the tubular cell, enhancing the secretion of K+ from the cell into the more electronegative lumen.

15.3 Potassium Bioavailability

Potassium is found in most plant and animal tissues, with fruits and vegetables having a higher nutrient density than cereals and animal foods. Potassium is intrinsically soluble and quickly dispersed in the luminal water of the upper digestive tract. The small intestine is the primary site of K absorption, with approximately 90% of dietary K being absorbed by passive diffusion (Demigne et al. 2004; Stone et al. 2016). Little is known about the bioavailability of K, with the majority of work being centered on the assessment of urinary K losses after K salt supplementation (Melikian et al. 1988; Bechgaard and Shephard 1981; Betlach et al. 1987; Stone et al. 2016).

15.3.1 Kinetic Modeling and Potassium Bioavailability

Many different models of K movement within the body have been proposed, each developed to fit a particular area of biological interest. The complexity of these models varies, from early recommendations by the International Commission on Radiological Protection for evaluating radiopotassium exposure, which treated the body as one large mixed pool of K, to more complex, anatomically related compartmentalization (ICRP 1975, 2007; Valentin 2002; Stone et al. 2016). In one of the earliest schemes, Ginsburg and Wilde (1954) constructed a five-compartment model, mathematically derived from murine data, examining tissue groupings (muscle/testes, brain/RBC, bone, lung/kidney/intestine, liver/skin/spleen) and their K exchange with a common compartment of ECF (Ginsburg 1962; Ginsburg and Wilde 1954; Stone et al. 2016). Utilizing 42K+ intravenous (IV) injections, researchers noted a wide spectrum of tracer exchange rates between tissues, with kidneys being the fastest (equilibrium with plasma at 2 min) and muscle and brain the slowest (≥600 min) (Ginsburg 1962). Based on this model, the total K mass of the four primary tissue compartments should be equivalent to total body K. However, this was not the case, the total sum accounting for only 73% of K mass. Investigators concluded that exchange rates/pools may be heterogeneous across both organs and organ groups, making the idea of grouping tissue compartments even more complex, and the internal movement of K more nuanced. Later, Leggett and Williams (1986) proposed a more anatomically specific model based on the quantitative movement of K through mathematically derived compartments within a physiologically relevant framework (Stone et al. 2016). Their model, similar to previous depictions, identifies plasma/ECF as the primary feeding compartment, with the equilibrium distribution of K, regional blood flow rates, and K tissue extraction fractions all influencing K exchange. The model also describes K exchange from plasma/ECF to tissues as a relatively rapid and uniform process, skeletal muscle being the only exception, with slower exchange due to its role as the main site of K storage. This concept is confirmed by earlier studies examining exchange rates of total body K using whole-body counting of radioactivity, IV administration of 42K, and 40K/42K ratios (Edmonds and Jasani 1972; Jasani and Edmonds 1971; Stone et al. 2016). These early works revealed that after absorption, most body K exchanges rapidly, with a half-life of less than 7 h, while a small portion, thought to be contained primarily in skeletal muscle, exchanges more slowly (~70 h) (Jasani and Edmonds 1971; Surveyor and Hughes 1968). A better understanding of kinetic modeling and K movement throughout the body may help to reveal how specific tissues influence K bioavailability and further our understanding of its role in health (Stone et al. 2016).
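
The fast/slow exchange behavior described above can be illustrated with a minimal biexponential sketch. The half-lives follow the text; the pool split is an assumption chosen purely for illustration, not a value from the cited studies:

```python
import numpy as np

# Biexponential sketch of whole-body 42K tracer exchange after an IV dose.
HALF_LIFE_FAST_H = 7.0    # rapidly exchanging body K, t1/2 < ~7 h (per text)
HALF_LIFE_SLOW_H = 70.0   # slowly exchanging pool (skeletal muscle), ~70 h
F_FAST = 0.8              # assumed fraction of body K in the fast pool
F_SLOW = 1.0 - F_FAST

k_fast = np.log(2) / HALF_LIFE_FAST_H   # first-order rate constants (h^-1)
k_slow = np.log(2) / HALF_LIFE_SLOW_H

def unexchanged_fraction(t_h: float) -> float:
    """Fraction of body K that has not yet exchanged with the tracer at t hours."""
    return F_FAST * np.exp(-k_fast * t_h) + F_SLOW * np.exp(-k_slow * t_h)

for t in (1, 7, 24, 70):
    print(f"t = {t:3d} h: {unexchanged_fraction(t):.2f} of body K unexchanged")
```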

In a recent study, Macdonald and colleagues (2016) assessed and compared the bioavailability of K from potato (Solanum tuberosum L.) sources (non-fried white potatoes, French fries) and a K supplement (potassium gluconate). Thirty-five healthy men and women (age 29.7 ± 11.2 year; body mass index 24.3 ± 4.4 kg m−2) were randomized to nine five-day interventions of additional K equaling: 0 mmol (control at phase 1, repeated at phase 5); 20 mmol (~780 mg), 40 mmol (~1560 mg), or 60 mmol (~2350 mg) K day−1 consumed as K+ gluconate or potato; and 40 mmol K+ day−1 from French fries. Bioavailability of K was determined from the area under the curve (AUC) of serial blood draws and from 24-h urinary excretion assessed after a test meal of varying K dose given on the fourth day. Investigators found increases in serum K AUC with increasing dose regardless of source, while 24-h urine K concentration also increased with dose but was greater with potato than with the supplement. Blood pressure was also assessed throughout the study, with no significant findings. These outcomes reveal the need for a full K balance study, examining intakes from a variety of dietary sources and complete losses (urine and feces), to fully understand K bioavailability differences between dietary K and supplements and their subsequent health effects (Stone et al. 2016).
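
Serum AUC in studies of this kind is typically computed with the trapezoid rule over the serial draws; a minimal sketch (the times and serum values below are made-up illustrative numbers, not data from Macdonald and colleagues):

```python
# Baseline-corrected (incremental) AUC for serial serum K draws.
times_h = [0.0, 0.5, 1.0, 2.0, 3.0, 4.0]   # hours after the test meal
serum_k = [4.0, 4.3, 4.5, 4.4, 4.2, 4.1]   # serum K, mmol/L (illustrative)

def auc_trapezoid(t, y, baseline=None):
    """Area under y(t) above a baseline (defaults to the first measurement)."""
    base = y[0] if baseline is None else baseline
    return sum((t[i + 1] - t[i]) * ((y[i] - base) + (y[i + 1] - base)) / 2
               for i in range(len(t) - 1))

print(f"incremental AUC = {auc_trapezoid(times_h, serum_k):.3f} mmol h L^-1")
```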

15.4 Potassium and Hypertension

Hypertension (HTN), or high blood pressure, is the leading cause of cardiovascular disease and a major contributing risk factor for the development of stroke, coronary heart disease, myocardial infarction, heart failure, and end-stage renal disease, amounting to a US public health financial burden of $53.2 billion (Roger et al. 2012; Benjamin et al. 2018). Approximately one in three American adults ≥20 years (~86 million) are estimated to have HTN, while nearly 60 million are at risk for developing HTN (BP greater than 120/80 mmHg) (Benjamin et al. 2018). Approximately 90% of US adults older than 50 year are at risk for the development of HTN, with systolic rises being the most prevalent (Svetkey et al. 2004). Hypertension is a leading cause of morbidity and mortality worldwide and is second only to smoking as a preventable cause of death in the US (Lopez and Mathers 2006; Stone et al. 2016; Stone and Weaver 2018).

15.4.1 Mechanisms of Arterial Pressure Control

Regulation of systemic arterial pressure is the most important role of the cardiovascular system. Arterial pressure is the product of cardiac output (heart rate × stroke volume), or the blood pumped from the heart into the systemic circulation, and total peripheral vascular resistance, or the degree to which the systemic vasculature is in a state of constriction or dilation (Mohrman and Heller 2010). Blood pressure is regulated by both short-term and long-term mechanisms. In the short term, arterial baroreceptors, located predominantly in the walls of the aorta and the carotid arteries, respond to increased stretch in the vasculature by sending afferent signals to the medullary cardiovascular center in the CNS. The CNS integration process is such that increased input from the arterial baroreceptor reflex, caused by increases in arterial pressure, decreases the tonic activity of cardiovascular sympathetic nerves and increases cardiac parasympathetic nerve activity. The result of this negative feedback system is an overall decrease in BP. Conversely, a decrease in mean arterial pressure increases sympathetic and decreases parasympathetic neural activity (Ekmekcioglu et al. 2016). If arterial pressure remains elevated for several days, the baroreceptor reflex gradually adjusts to this new pressure set point and ceases firing; because of this, it is not considered a good mechanism for long-term control. Long-term pressure regulation is closely tied to the prevalence and potential causes of hypertension. Long-term regulation is theorized to be primarily dependent on the way the kidneys handle Na (e.g., extracellular osmolarity) and regulate blood volume. Arterial pressure has a marked effect on urinary output rate and total body fluid volume. A disturbance that increases arterial pressure will in turn increase urinary output, decreasing total fluid volume and bringing arterial pressure back to a homeostatic level. Conversely, a decrease in arterial pressure leads to fluid volume expansion. Similar to short-term regulation, long-term regulation works as a negative feedback loop, utilizing modulation of fluid volume as a means of pressure regulation.
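
These relations reduce to two standard hemodynamic identities; the worked numbers below are illustrative resting values, not data from this chapter:

$$\mathrm{MAP} \approx \mathrm{CO} \times \mathrm{TPR}, \qquad \mathrm{CO} = \mathrm{HR} \times \mathrm{SV}$$

For example, HR = 70 beats min−1 and SV = 70 mL beat−1 give CO ≈ 4.9 L min−1; at constant CO, any rise in TPR raises mean arterial pressure proportionally.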

As discussed previously, the kidneys play a major role in regulating electrolyte balance and the osmolarity of blood plasma. Plasma is filtered within the glomerular capillaries before entering the renal tubules of the nephron. The rate at which this process occurs is referred to as the glomerular filtration rate (GFR) and is influenced by both the hydrostatic and oncotic components of arterial pressure. Increased blood volume and pressure will increase GFR, and when the body is at physiological steady state, arterial pressure must remain at a level that ensures urinary output equals fluid intake (Mohrman and Heller 2010). Filtered fluid enters the renal tubules, where it is either reabsorbed and reenters the cardiovascular system or is excreted as urine. As stated earlier, the kidneys regulate blood osmolarity primarily via modulation of total body water rather than total solutes, although some fluid reabsorption occurs because Na+ is actively pumped out of the renal tubules. The previously discussed hormonal influences of arginine vasopressin (antidiuretic hormone; AVP) and the renin–angiotensin–aldosterone axis stimulate both water and Na+ reabsorption in response to low fluid volume/low blood pressure. The resulting increase in BP, and overall dysregulation of this long-term control mechanism, may explain the incidence of hypertension to some degree, although the majority of primary HTN remains idiopathic.

Systemic HTN is defined as an elevation of systolic BP (vascular pressure during cardiac muscle contraction) above 140 mm Hg and/or diastolic BP (vascular pressure during cardiac muscle relaxation) above 90 mm Hg. Secondary HTN can be traced to a preexisting comorbidity such as kidney disease, obesity, diabetes, or various forms of cancer. Primary or essential HTN (historically, the pressure considered "essential" to drive blood through the vasculature) often has no diagnosable cause, leaving only the symptom of high BP to be treated either pharmacologically or through lifestyle modification (e.g., exercise and diet).

15.4.2 Potassium and Arterial Pressure

Theorized mechanisms by which K+ influences vascular health include effects on the renin–angiotensin–aldosterone system, reduction in adrenergic tone, increased Na+ excretion (natriuresis), and increased vasodilation. In the short term, increased consumption of K+ may improve the function of endothelial cells, the monolayer of cells within the vasculature that controls the tone of the underlying vascular smooth muscle. Elevated serum K+, within the physiological range, may induce endothelial hyperpolarization via stimulation of Na+/K+ ATPase pumps and activation of plasma membrane K+ channels, leading to subsequent vasodilation via efflux of Ca2+ from vascular smooth muscle cells (Haddy et al. 2006; Ekmekcioglu et al. 2016). Increased K+ intake may also enhance vasodilation and improve BP regulation via inhibition of sympathetic neural transmission and reduced sensitivity to catecholamine-induced vasoconstriction, increased endothelial nitric oxide release, alteration of baroreceptor sensitivity, and increased Na+ excretion (Haddy et al. 2006; He et al. 2010; Stone and Weaver 2018).

In relation to Na+, increased K+ intake can lead to increased Na+ excretion, which may improve overall fluid volume and BP control. As described previously, active Na+ and K+ reabsorption and excretion are primarily regulated by the epithelial Na+ channel (ENaC; Na reabsorption) and the renal outer medullary K+ channel (ROMK; K excretion) transporters of the kidney. Na+ is also actively reabsorbed in the DCT by the Na+/Cl cotransporter (NCC), which determines the delivery of Na+ to the downstream ENaC and ROMK and directly influences the reabsorption of Na+ and excretion of K+. In animal models, increased K+ feeding increases extracellular K+ concentration, leading to a decrease in NCC activity (via a phosphorylation-dephosphorylation mechanism), reduced Na+ reabsorption, and increased urinary loss (Veiras et al. 2016). Prospective human population studies show that higher fruit and vegetable intake (and assumed increased dietary K+) increases Na+ excretion, which may lead to improvements in fluid balance and BP control (Cogswell et al. 2016). In contrast, low K+ intake may lead to excessive Na+ retention independent of fluid dynamics. In animal models, inadequate K+ upregulates the Na+/H+ exchanger in the PCT, leading to increased Na+ reabsorption and fluid expansion (Soleimani et al. 1990). Potassium depletion may also increase the activity of the NCC, increasing Na+ and fluid reabsorption in the distal kidney and promoting arterial pressure dysregulation. While the influences of both K+ and Na+, and the complex physiological relationship between the two, are intimately tied to fluid balance and arterial pressure, the mechanisms behind this are still not fully understood (Stone and Weaver 2018).

15.4.3 Epidemiological Data

Numerous epidemiological studies suggest diet is a key component of BP control, with some studies showing lower BP in populations consuming higher amounts of fruits and vegetables (INTERSALT 1988; Young et al. 1995; Elford et al. 1990). Dietary patterns shown to lower BP include increased K and reduced Na intake and increased consumption of fruits, vegetables, and other foods rich in antioxidants (Appel et al. 1997; Svetkey et al. 1999). A population study conducted by Khaw and Rose in St. Lucia, West Indies, suggested that an increase in K of ~700–1200 mg/day (20–30 mmol/day) resulted in a 2–3 mmHg reduction in systolic blood pressure (SBP) (Khaw and Rose 1982). In adults, a 2-mmHg reduction in BP can reduce CHD and stroke mortality rates by 4 and 6%, respectively (Stamler 1991). The INTERSALT study, a worldwide epidemiologic study (n = 10,079 men and women aged 20–59 year from 32 countries) that examined the relationship between 24-h Na excretion and BP, provided evidence of K intake as an important factor affecting population BP, independent of Na, among diverse population groups (Stamler 1991). The American Heart Association has estimated that increasing K intake may decrease HTN incidence in Americans by 17% and lengthen life span by 5.1 years (Roger et al. 2012; Stone et al. 2016; Stone and Weaver 2018).

15.4.4 Potassium Supplementation Studies

Epidemiological studies have evaluated the effects of K from foods, while clinical intervention trials have primarily used K supplements. Several meta-analyses show a significant reduction in BP with increasing K supplementation (Beyer et al. 2006; Whelton et al. 1997; Cappuccio and MacGregor 1991; Geleijnse et al. 2003). In an early meta-analysis, Cappuccio and MacGregor reviewed 19 clinical trials examining the effect of K supplementation on BP in primarily hypertensive individuals (412 of 586 participants). With an average K dose of 86 mmol day−1 (~3300 mg day−1; primarily as KCl) given for an average duration of 39 days, researchers found that K supplementation significantly reduced SBP by 5.9 mm Hg and diastolic blood pressure (DBP) by 3.4 mm Hg. Greater reductions were found in individuals who were on supplementation for longer periods of time (Cappuccio and MacGregor 1991). Another regression analysis examined the effect of K supplementation in both normotensive and hypertensive individuals. Researchers found that an average K dose of 60–120 mmol day−1 (~2350–4700 mg day−1) reduced SBP and DBP by 4.4 and 2.5 mm Hg, respectively, in hypertensive patients, and by 1.8 and 1.0 mm Hg, respectively, in normotensive individuals (Whelton et al. 1997). As is evident, the effect of K supplementation on BP reduction is generally positive, but not consistent. In a more recent meta-analysis by Dickinson et al. (2006), K supplementation did not significantly reduce BP in those with hypertension, although this analysis was based on only five trials, and its findings, while not statistically significant, did reveal reductions in both SBP and DBP (Beyer et al. 2006; Dickinson et al. 2006). In general, these outcomes show that the BP-lowering effects of K supplementation are greater in those with HTN and more pronounced in blacks compared to whites. Other factors that may influence the effects of K supplementation on BP include pre-treatment BP, age, gender, intake of Na and other ions (magnesium, calcium), weight, physical activity level, and concomitant medications. In addition, these analyses suggest an optimal K dose range of 1900–3700 mg day−1, lowering SBP by approximately 2–6 mm Hg and DBP by 2–4 mm Hg (Houston 2011; Stone et al. 2016; Stone and Weaver 2018).

15.4.5 Dietary Intake Clinical Trials

Findings from the recent Agency for Healthcare Research and Quality (AHRQ) report on K intake and chronic disease concluded, with a moderate strength of evidence, that increasing K intake decreases BP, particularly in those with HTN (Newberry et al. 2018). However, of the 18 randomized controlled trials assessed by the AHRQ, only 4 were dietary interventions; the rest involved K supplementation as described above.

Evidence from dietary interventions is extremely limited, with the majority of findings extrapolated from the Dietary Approaches to Stop Hypertension (DASH) study (Appel et al. 1997). The DASH interventions determined that a diet higher in fruits and vegetables, fiber, and low-fat dairy products, and lower in saturated and total fat and Na, could improve BP outcomes compared to the average American diet (Sacks and Campos 2010). Although the DASH diet does lead to a dramatic increase in K consumption (4100–4400 mg day−1), due to its other dietary modifications the beneficial effects on arterial pressure cannot be attributed to K alone. In an earlier study conducted in an Australian cohort, Chalmers and colleagues assessed the effects of both reducing dietary Na and increasing dietary K on BP (Chalmers et al. 1986). Two hundred twelve hypertensive adults (DBP between 90 and 100 mm Hg; age 52.3 ± 0.8 year; 181 males and 31 females) were recruited and placed in one of four diet groups: a normal diet (control), a high K diet (>100 mmol K day−1 or >3900 mg day−1), a reduced Na diet (50–75 mmol Na+ day−1 or 1150–1725 mg day−1), or a high K/low Na diet. The duration of the diet intervention for this parallel-design study was 12 weeks, during which subjects were given nutrition coaching on how to adjust their diet choices based on their group (e.g., increasing fruit/vegetable intake, avoiding table salt and foods high in Na). Investigators found significant reductions in both SBP and DBP in each intervention group compared to controls, but no significant differences between diet manipulation groups, with reductions in the high K group of −7.7 ± 1.1 and −4.7 ± 0.7 mm Hg for SBP and DBP, respectively. Although high K intake did appear to reduce BP, the lack of differences between groups points to the possibility of an overall diet effect. In a more recent study conducted in a UK cohort, Berry et al. assessed the effects of increased K intake from both dietary and supplement sources on BP in untreated pre-hypertensive individuals (DBP between 80 and 100 mm Hg) (Berry et al. 2010). In a cross-over design, subjects (n = 48, age 22–65 year) completed four 6-week dietary interventions: a control diet, an additional 20 or 40 mmol K day−1 (780 or 1560 mg day−1) from increased fruit and/or vegetable intake, and 40 mmol K citrate day−1 as capsules. Each treatment was followed by a washout of at least 5 weeks. Similar to the Chalmers study, subjects were counseled by nutrition professionals on how to regulate their food choices during each dietary intervention, with a primary focus on increasing fruit and vegetable intake. Findings revealed no significant changes in the primary outcome measure of ambulatory BP between the control group and any of the interventions. The limited dietary control in these K interventions is the primary factor restricting their ability to assess the true effect of increased dietary K intake on BP outcomes. A complete balance study with a controlled diet is necessary to accurately assess K retention and its acute and prolonged effects on BP and related outcomes. Currently, the lack of evidence from clinical trials looking specifically at dietary K intake and its effect on BP points to a large gap in the K literature. More research is needed in this area to completely understand the effects of dietary K intake on the regulation of arterial pressure and the potential for health benefit (Stone et al.
2016; Stone and Weaver 2018; Weaver et al. 2018) (Table 15.1).

Table 15.1 Published studies that showed an effect of additional dietary potassium (K) intake on blood pressure (BP) outcomes

15.5 Potassium, Diabetes, and Glucose Control

Blood glucose levels are tightly regulated within a range of 70–100 mg dL−1. After ingestion of a meal, the rise in circulating glucose, along with other factors, stimulates the release of the hormone insulin from the pancreas. Insulin is secreted from the β-cells of the pancreatic islets of Langerhans and acts on target tissues (e.g., skeletal muscle, liver, adipose) to facilitate cellular glucose uptake. Antagonistically, the hormone glucagon is secreted from the α-cells of the pancreas in response to low blood glucose, stimulating the release of glucose from tissues and glucose production (gluconeogenesis) in the liver (Stipanuk 2006). Continually changing levels of insulin and glucagon are important signals informing various physiological systems of the body's nutritional state (Stone et al. 2016).

Diabetes mellitus (DM) is a degenerative disease associated with a lack of, or insufficient secretion of, insulin, or an insensitivity to insulin stimulation in the cells of target tissues. DM comes in two forms: type 1 (insulin-dependent) DM and type 2 (non-insulin-dependent) DM. Type 1 DM is primarily characterized as an autoimmune disease in which the immune system attacks the cells of the pancreas, leading to nearly complete β-cell destruction or extreme dysfunction. This results in an essentially complete inability to produce insulin and a requirement for daily insulin injections to control blood glucose. Type 2 DM (T2DM) is more complex and is often the result of obesity coupled with poor dietary and lifestyle choices. In T2DM the pancreas may still produce insulin, often in increasing amounts in response to increases in glucose load, but this is often insufficient to maintain homeostatic glucose levels if intake becomes too high and too frequent. Eventually, β-cell insulin granules become depleted and target tissues exhibit resistance to insulin stimulation, leaving blood glucose levels unchecked. Prolonged elevated blood glucose can damage small vessels, especially in the brain, kidneys, eyes, and extremities, and can eventually lead to nerve damage and tissue death. While drugs that increase insulin secretion and improve tissue insulin sensitivity can be effective, lifestyle changes, including better dietary choices and increased physical activity, will often lead to control of the disease (Delli and Lernmark 2016; Hupfeld and Olefsky 2016; Stone et al. 2016).

Potassium plays a role in blood glucose control by modulating the secretion of insulin from the pancreas. At the cellular level, K+ efflux through ATP-sensitive K+ (K+/ATP) channels influences β-cell excitability and holds the membrane potential at low levels (~−60 mV) (Ekmekcioglu et al. 2016). Increases in blood glucose lead to increased β-cell glucose uptake and subsequent ATP generation, which in turn inhibits K+/ATP channels. The decreased K+ efflux depolarizes the cell, stimulating voltage-gated Ca2+ (Ca2+V) channels; the resulting Ca2+ influx triggers increased insulin secretion. Potassium efflux through voltage-gated K+ channels then leads to repolarization and inhibition of Ca2+V channels, inhibiting insulin release. While supraphysiological concentrations of K+ (≥10 mM) experimentally induce a depolarizing effect on β-cell membrane potential, the effects of extracellular K+ at the upper end of the physiological range (5.5 mM) are unknown (Meissner et al. 1978; Stone et al. 2016).
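
A minimal Nernst calculation quantifies why a supraphysiological extracellular K+ concentration depolarizes the K+ equilibrium potential; the intracellular concentration of ~140 mM is an assumed textbook value, not a figure from this chapter:

```python
import math

def nernst_ek_mv(k_out_mm: float, k_in_mm: float = 140.0, temp_c: float = 37.0) -> float:
    """Nernst equilibrium potential for K+ in mV ([K+]i assumed ~140 mM)."""
    R, F, z = 8.314, 96485.0, 1           # J mol^-1 K^-1, C mol^-1, valence
    t_kelvin = temp_c + 273.15
    return 1000.0 * (R * t_kelvin / (z * F)) * math.log(k_out_mm / k_in_mm)

for k_out in (4.5, 5.5, 10.0):            # normal, high-normal, supraphysiological (mM)
    print(f"[K+]o = {k_out:4.1f} mM -> E_K = {nernst_ek_mv(k_out):6.1f} mV")
# E_K rises from about -92 mV to about -70 mV as [K+]o goes from 4.5 to 10 mM,
# a depolarizing shift consistent with the experimental effect described above.
```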

15.5.1 Potassium and Glucose Control

Glucose intolerance can often result from severe hypokalemia due to a deficit in K balance, as may occur in primary or secondary aldosteronism or with prolonged diuretic treatment (He and MacGregor 2008). Thiazide diuretics are widely considered the preferred initial pharmacological treatment for hypertension (Haddy et al. 2006), and their tendency to negatively influence glucose tolerance and increase the incidence of new-onset diabetes is well known. In a recent systematic quantitative review of 59 clinical trials, researchers found a strong relationship between the use of thiazide diuretics, hypokalemia, and glucose intolerance (Zillich et al. 2006). Thiazide diuretics commonly lower serum K, and evidence shows that diuretic-induced hypokalemia may impair glucose tolerance by reducing insulin secretion in response to glucose loads (Chatterjee et al. 2012). In healthy individuals, there is also evidence to support the role of K in glucose control. Studies involving K depletion (e.g., low K diets) show that low levels of K can lead to glucose intolerance via impaired insulin secretion (Rowe et al. 1980; Sagild et al. 1961). In addition, when patients with thiazide-induced hypokalemia are given K supplements, the defects in insulin release in response to glucose loads are corrected, indicating that hypokalemia may be a significant contributing factor to the glucose abnormality (Helderman et al. 1983; Stone et al. 2016).

15.5.2 Potassium and Diabetes

The relationship between K intake and diabetes was examined in a prospective cohort study conducted by Colditz et al. (1992) in women (n = 84,360; age 34–59 year) from the Nurses' Health Study. After a six-year follow-up, investigators found that high K+ intake may be associated with a decreased risk of developing T2DM in women with a body mass index (BMI) of 29 kg m−2 or less (Colditz et al. 1992). When compared with women in the lowest quintile, women in the highest quintile of K+ intake had a relative risk of 0.62 (p for trend = 0.008) for T2DM. More recently, Chatterjee et al. assessed the association between K+ and T2DM using data from the Coronary Artery Risk Development in Young Adults (CARDIA) study (Chatterjee et al. 2012). Researchers examined the relationship between urinary K+ and diabetes risk in 1066 participants. Multivariate models adjusted for potential confounders, including BMI, fruit and vegetable intake, and other dietary factors, revealed that those in the lowest quintile of urinary K+ were more than twice as likely to develop diabetes as those in the highest quintile (HR 2.45; 95% CI 1.08, 5.59; p for trend = 0.04). Investigators also found that those in the lowest quintile of dietary K intake were significantly more likely to develop diabetes than those in the highest quintile (p = 0.008). Of the 4754 participants with dietary intake data, 373 (7.8%) developed T2DM during the 20-year follow-up, and, overall, the mean K intake of those who developed diabetes was significantly lower than that of those who did not (3393 vs. 3684 mg day−1; p = 0.002). This same research group examined data from 12,209 individuals participating in the Atherosclerosis Risk in Communities (ARIC) cohort and found serum K+ to be independently associated with diabetes risk. Using multivariate cross-sectional analyses, a significant inverse relationship between serum K+ and fasting insulin levels was identified (p < 0.01) (Chatterjee et al. 2010). Dietary K+ intake was significantly associated with diabetes risk in unadjusted models, with adults having serum K+ levels lower than 4.0 mM at highest risk for DM incidence. This relationship held after covariate adjustment (e.g., age, sex, race, BMI, serum magnesium, serum calcium, physical activity, hypertension) in multivariate models, with lower K+ levels also associated with higher BMI, larger waist circumference, lower serum magnesium levels, and higher fasting insulin levels (Chatterjee et al. 2010; Stone et al. 2016).

The relationship between K and T2DM also extends to the kalemic effects of insulin. Higher plasma insulin levels are associated with increased K+ uptake into cells (DeFronzo et al. 1980), and, unlike the glycemic response, these kalemic effects show no threshold, continuing to increase as insulin levels rise. DeFronzo et al. examined this relationship using the insulin clamp technique and graded doses of insulin. Investigators found a dose-dependent decline in plasma K+ concentration with increasing insulin dose, independent of glucose uptake. This effect is likely mediated by an increased sensitivity to intracellular Na, activation of the Na+/K+ ATPase, and inhibition of K efflux (DeFronzo et al. 1980; Stone et al. 2016).

15.6 Potassium and Bone

Osteoporosis, or a severe reduction in bone mass leading to decreased bone health and increased fracture risk, is a global health problem with great financial impact. Over 200 million people worldwide suffer from osteoporosis, including 30% of postmenopausal women in both the US and Europe (Sözen et al. 2017). Peak bone mass is achieved by the third decade of life, after which bone loss begins, accelerating with aging in both men and women (Weaver and Fuchs 2014). The bone mass present at any given point during life is determined by factors that influence the acquisition, maintenance, or loss of bone throughout the lifespan, many of which are modifiable lifestyle factors (Weaver et al. 2018).

Adequate K intake may benefit overall bone health and has been proposed to do so through its effect on acid-base balance (Barzel 1995; Brandao-Burch et al. 2005). Support for the acid-base bone theory stems from the idea that the Western diet is high in meats and cereal grains and low in fruits and vegetables, creating an environment of low-grade metabolic acidosis (net acid excretion (NAE) = 75–100 mEq acid/day) (Barzel 1995). Buffering of this increased acid load via bone-derived Ca salts is proposed to lead to bone loss. Alkaline K salts produced from metabolizing fruits and vegetables, or from K supplements (potassium bicarbonate or citrate), are thought to provide bicarbonate precursors and help maintain pH homeostasis (~7.35–7.45). The impact of excess systemic acid on bone is suggested to be mediated by two mechanisms: physicochemical buffering through dissolution of the bone matrix, and cell-based mechanisms (e.g., upregulation of bone-resorbing cell (osteoclast) activity) (Barzel 1995; Brandao-Burch et al. 2005). However, opposition to the acid-base balance theory exists. In a rat model examining the relationship between the inhibitory effect of vegetables on bone resorption and base excess, the addition of potassium citrate at levels that neutralized urinary acid excretion from an acidogenic diet had no effect on bone turnover (Muhlbauer et al. 2002). Researchers measured bone turnover via urinary excretion of a tracer from prelabeled bone and concluded that reductions in bone resorption from increased vegetable intake and the accompanying base excess were not causally related. The authors suggested that bioactive compounds (e.g., flavonoids) in fruits and vegetables may be responsible for benefits to bone. Despite this, there is some consistency in the literature that increased K intake benefits bone, though the mechanisms remain unclear (Weaver 2013; Stone et al. 2016; Weaver et al. 2018).
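
Net acid excretion, as used above, is conventionally defined from urinary components (a standard renal physiology relation, not specific to this chapter):

$$\mathrm{NAE} = \mathrm{TA} + \mathrm{NH_4^{+}} - \mathrm{HCO_3^{-}}$$

where TA is urinary titratable acid; a Western diet yielding an NAE of 75–100 mEq day−1 thus reflects daily acid production well in excess of excreted base.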

15.6.1 Potassium and Calcium Balance

Potassium intake has been associated with reduced urinary Ca2+ excretion. Clinical trials show persistently reduced calciuria in both men and women given K+ supplements (bicarbonate or citrate) vs. similar Na salts, suggesting K may have a role in bone benefit beyond acid balance (Lemann et al. 1989; Frassetto et al. 2005). In the kidney, Ca is reabsorbed via solvent drag in the PCT (60–70%) and the TAL (20%). Active reabsorption of Ca takes place in the DCT via specific transport proteins. Calcium is reabsorbed from the tubular fluid into the cell via the Ca2+ channel TRPV5, binds to the transfer protein calbindin-D28K, and is shuttled across and out of the cell via the plasma membrane Ca2+ ATPase (PMCA) and the Na+/Ca2+ exchanger (NCX). High Na intakes have been shown to increase urinary Ca losses, with a loss of approximately 24–40 mg of Ca2+ for a Na+ intake of ≈2.3 g (Shils and Shike 2006). The mechanism for this is not well defined but most likely involves Ca following Na excretion via solvent drag. Increased intracellular Na within the kidney tubular cells may also affect the dynamics of the NCX (which exchanges 3 Na+ for 1 Ca2+), leading to its dysregulation and possible reversal. Increased intakes of K may have the opposite effect on Ca: paracellular reabsorption in the TAL is facilitated by movement of Na+, K+, and Cl across the Na+/K+/2Cl cotransporter (NKCC) on the apical membrane, and K shuttled into the cell via NKCC is subsequently re-secreted into the lumen via ROMK, maintaining an electropositive lumen, facilitating the passive reabsorption of Ca, decreasing urinary loss, and improving Ca balance.

15.6.2 Potassium, Bone Turnover, and Bone Mineral Density

Beyond the effect of K on Ca balance, several studies have assessed the influence of K on biochemical markers of bone turnover. Studies have shown decreases in the bone turnover markers C- and N-telopeptide (resorption) and procollagen type I N-terminal propeptide (formation) with K supplementation (Dawson-Hughes et al. 2009; Marangella et al. 2004). In postmenopausal women, K bicarbonate at 60–120 mmol day−1 decreased urinary hydroxyproline excretion by 10% while increasing serum osteocalcin, a marker of bone formation (Sebastian et al. 1994; Weaver 2013).

The relationship between increased K intake and bone mineral density (BMD) shows conflicting results as well. Only three clinical trials have been reported, all in populations of postmenopausal women or the elderly (>60 year). One trial showed protection from BMD loss in the spine, hip, and femoral neck with a 30 mmol K day−1 dose given as K citrate compared to KCl, but lacked a placebo control (Jehle et al. 2006). A second trial revealed no BMD benefit with increased intake from K citrate (55 or 18.5 mmol day−1), fruits and vegetables (18.5 alkali mmol day−1), or placebo (Macdonald et al. 2008). The third, and strongest, trial reported a 1.7% increase in spine BMD with K citrate supplementation (60 mmol K day−1) compared to placebo (Jehle et al. 2013). While generally inconclusive, these findings may point to the significance of K form and dose in any potential benefit for BMD.

15.7 Opportunities for Future Interdisciplinary Efforts to Improve Potassium Recommendations for Agricultural Crops

There is still much to learn about the role of K in overall human health. The importance of K in normal physiology is clear, but how adequate to greater-than-adequate intakes can benefit these systems is not well understood. Increasing dietary K has potential benefit in lowering the risk of hypertension and may also benefit normal kidney function, glucose control, and bone (Fig. 15.1). We need to understand more about the bioavailability and retention of K from foods as well as other sources. Are there unidentified inhibitors of K absorption or food matrix effects? Do some anions that accompany K in foods have differential functional advantages? Organic salts of K appear to have more benefit for bone, perhaps through effects on acid-base balance; the form seems less important for controlling blood pressure. Research on dietary K intake is likely to increase because K is an identified shortfall nutrient, and increasing K consumption may have a marked influence on arterial pressure and hypertension, an important risk factor for all cardiovascular and related chronic diseases (Stone et al. 2016; Stone and Weaver 2018).

Fig. 15.1 A better understanding of how K consumed from the environment moves through the body, including its storage and excretion, and how this movement affects various physiological systems, will in turn improve outcomes related to the health consequences of these relationships