Abstract
Biochemical chain reactions are signal transduction cascades that transmit biological information about the intracellular environment. In this study, we modelled a chain reaction as a code string to which information theory can be applied. We assumed that cell signal transduction adopts a strategy that maximizes the transduced signal per signal event duration. Under this assumption, we calculated the information transmission capacity of the reaction chain by maximizing the average entropy production rate per reaction time, which corresponds to the idea of entropy coding. Moreover, we defined a signal cascade trajectory and found that the logarithm of the ratio of forward to reverse transitions per reaction time equals the entropy production rate, which yields a form of the fluctuation theorem for signal transduction. Our findings suggest that information entropy theory can be applied to analyse signal transduction.
1 Introduction
Signal transduction is one of the frameworks for understanding cellular reaction networks. In this study, we modelled signal transduction as the flux of biochemical information [1, 2]. A typical signal transduction is signalling-molecule modification. For example, an environmental change, such as increased ligand levels in the extracellular area, can trigger chemical modification of a receptor protein on the cell membrane. The modification allows recruitment of an adaptor protein to the receptor. The receptor–adaptor complex further modifies proteins in the cytoplasm, and the modified proteins catalyze further modification of other molecules [3]. The modification includes phosphorylation or cofactor binding, for example, of GTP (guanosine triphosphate). The final modified protein in the reaction cascade is translocated to the nucleus, where it binds to DNA, alters its structure, and subsequently promotes the transcription of genetic information, followed by protein translation. The EGFR (epidermal growth factor receptor) signal cascade is a well-known example [4].
In summary, an environmental change is transduced into the expression of genetic information through signal-molecule modification. Conventionally, this information transduction has been termed cell signal transduction, and we will call each modification a signal step. To date, signal transduction analysis has rarely been quantified in terms of information science. This makes it challenging to compare changes in gene expression levels caused by signal transduction or to determine the ligand dose for receptor stimulation.
Previously, we reported that the signal reaction could be modelled as a string of code representing the modified signalling molecule [1], one of the source coding methods in information science [5]. In this case, the logarithm of the ratio of the concentration of a signalling molecule to that of the total signalling molecules provides the information entropy, and the code length is given by the reaction time. Furthermore, the cascade crosstalks with other cascades to create a signal network, in which the reaction steps form the network nodes. Because information is transduced in the direction in which the successive modification steps proceed, the signal cascade network is directional. In this way, the signal transduction phenomenon can be reconceptualized based on information science.
On the modelled cell signal network, the amount of information can be calculated [4, 6]. First, we define information entropy in cell signal transduction and consider a network of signal cascades [3]. Next, to evaluate signal transduction efficiency, the average entropy rate, i.e., the capacity, is formulated when signalling efficiency is maximised. In terms of information science, this formulation corresponds to entropy coding. Finally, signal transduction thermodynamics is linked to the fluctuation theorem (FT), a major recent breakthrough in nonequilibrium thermodynamics, which gives the ratio of the probability distribution function of an event (information gain; signal molecule modification in most cases) to that of the reverse event (information loss; de-modification in most cases) [6].
2 Results
2.1 A model chain reaction for information transmission
Consider a biochemical chain-reaction cascade of n biochemical species Xj (1 ≤ j ≤ n). Herein, we considered biochemical species that transmit information via their modification (i.e. phosphorylation or mediator binding).
In the model, the cell chemostat supplies an information signal mediator, such as adenosine triphosphate (ATP). ATP is hydrolyzed into adenosine diphosphate (ADP) and phosphate to modify Xj into Xj*. The asterisk represents the modified form of Xj, and Xj* can modify another species Xj+1 into Xj+1*. Afterwards, Xj* is de-modified to Xj. These reactions form a reaction cycle, and the cascade proceeds from the jth to the j + 1th step (Fig. 1). For example, an increase in X1* can be transmitted as an increase in the final species X4* through the following four-step chain-reaction cascade (1 ≤ j ≤ 4):
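The displayed reaction scheme is not reproduced in this version. A sketch of the jth step of the cycle, consistent with the description above (ATP hydrolysis driving modification, followed by de-modification), would be the following reconstruction, not the paper's displayed scheme:

```latex
% Sketch of the jth step of the cycle (a reconstruction, not the displayed scheme)
\begin{align*}
X_j + \mathrm{ATP} &\longrightarrow X_j^{*} + \mathrm{ADP}
   && \text{(modification of } X_j\text{)}\\
X_j^{*} + X_{j+1} &\longrightarrow X_j^{*} + X_{j+1}^{*}
   && \text{(}X_j^{*}\text{ modifies the next species)}\\
X_j^{*} &\longrightarrow X_j + \mathrm{P_i}
   && \text{(de-modification)}
\end{align*}
```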
In this case, the code sequence for the forward chain reaction can be written as (Fig. 2A):
The reverse chain-reaction cascade can be described as (Fig. 2B):
The code sequence for the reverse chain reaction can be written as:
Furthermore, the appearance of the species coding can repeat as:
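The displayed code strings are likewise missing here. Following the code-string picture of [1], the forward message for the four-step cascade would read, schematically (a reconstruction):

```latex
\underbrace{X_1^{*}\,X_2^{*}\,X_3^{*}\,X_4^{*}}_{\text{forward message}}\;
\underbrace{X_1^{*}\,X_2^{*}\,X_3^{*}\,X_4^{*}}_{\text{repeat}}\;\cdots
```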
We interpreted the reaction time of the jth step as the jth code length (1 ≤ j ≤ n), which corresponded to the code length in information source coding theory. We then introduced X, the total concentration of the signalling molecules:
The concentration ratios pj and pj* were defined as
where
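The displayed definitions are missing from this version; from the surrounding text they would read (a reconstruction):

```latex
X = \sum_{j=1}^{n}\bigl(X_j + X_j^{*}\bigr),\qquad
p_j = \frac{X_j}{X},\qquad p_j^{*} = \frac{X_j^{*}}{X},\qquad
\sum_{j=1}^{n}\bigl(p_j + p_j^{*}\bigr) = 1 .
```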
Next, we considered the duration corresponding to the reaction time of the jth step, during which modification and de-modification occur. The total duration of the message, τ, was given as:
where τj signifies the duration of the Xj to Xj* conversion per one Xj molecule, and τj* signifies the duration of the Xj* to Xj conversion per one Xj* molecule. Here we set τj > 0 and τj* < 0, determining the direction of the signal transduction.
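The displayed expression for τ is missing here. A form consistent with the per-step duration Xjτj − Xj*τj* used later in Eq. (27) would be (an assumption, not the displayed equation):

```latex
\tau = \sum_{j=1}^{n}\bigl(X_j\tau_j - X_j^{*}\tau_j^{*}\bigr),
\qquad \tau_j > 0,\;\; \tau_j^{*} < 0 .
```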
2.2 Channel capacity of the signal transduction
Subsequently, the total number of signal transduction events, Ψ, was defined for the entire cascade as follows:
Taking the logarithm of Ψ, Shannon's entropy S was given by Stirling's formula as follows [1]:
where
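The displayed equations for Ψ and S are missing from this version. From the description (a multinomial count of signal events, followed by Stirling's approximation), they would read (a reconstruction):

```latex
\Psi = \frac{X!}{\prod_{j=1}^{n} X_j!\,X_j^{*}!},\qquad
S = K\ln\Psi \simeq -K X \sum_{j=1}^{n}
     \bigl(p_j \ln p_j + p_j^{*}\ln p_j^{*}\bigr).
```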
Here, as in previous studies [3], we assumed that cells transmit the maximum signal amount at each step in a given time. The maximisation implies that the signal cascade does not allow for signal redundancy and that the signal transduction system adopts a strategy that transduces as many signals as possible. Because ATP and GTP, the mediators of phosphorylation of signalling molecules, are also involved in essential cellular activities such as the synthesis of metabolites and nucleic acids, their amounts in the cell are limited. This maximum amount of information transmission is defined here as the transmission capacity, using the language of information science.
To obtain the capacity, we defined a function G and applied Lagrange’s method to maximise entropy S under the constraints of (6), (7), (8) and (9).
In the above, a and b are undetermined multipliers. Differentiating G gave us:
Setting the left-hand sides of Eqs. (14), (15), and (16) as zero gave us
From Eqs. (18) and (19), b could be considered the average “entropy (production) rate”. Furthermore, substituting Eqs. (18) and (19) into the right-hand side of Eq. (11) gave us
In the above, the “max” suffix denotes the maximum value of the entropy. Therefore, the channel capacity of the signal transduction cascade, i.e., the maximum average rate of the entropy, was given by Eq. (20) as follows [1]:
Here, if entropy units are used, we take K = kB, Boltzmann’s constant. In contrast, in information science, K is equivalent to log2e. Therefore, the negative average entropy production rate was equal to the channel capacity of the signal transduction cascade. The channel capacity was one of the conserved quantities in the transduction cascade.
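Although the paper's Eqs. (18)–(21) are not reproduced in this version, the structure described here mirrors the classical source-coding result for symbols of unequal duration: the capacity C is the unique root of Σj 2^(−C τj) = 1, attained by pj = 2^(−C τj). A minimal numerical sketch (the durations are illustrative, not taken from the paper):

```python
import math
import random

def capacity(durations, lo=1e-9, hi=100.0, iters=200):
    """Bisection solve of sum_j 2^(-C * tau_j) = 1 for C, the classical
    channel capacity for symbols of unequal duration."""
    f = lambda c: sum(2.0 ** (-c * t) for t in durations) - 1.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            lo = mid        # C too small: sum still exceeds 1
        else:
            hi = mid
    return 0.5 * (lo + hi)

def entropy_rate(p, durations):
    """Average entropy per unit time, S/tau (bits per unit time)."""
    s = -sum(pi * math.log2(pi) for pi in p if pi > 0.0)
    tau = sum(pi * ti for pi, ti in zip(p, durations))
    return s / tau

durations = [1.0, 2.0, 3.0, 4.0]   # hypothetical step durations
C = capacity(durations)
p_opt = [2.0 ** (-C * t) for t in durations]
print(round(C, 4), [round(pi, 3) for pi in p_opt])

# The optimal distribution attains the capacity exactly ...
assert abs(entropy_rate(p_opt, durations) - C) < 1e-6

# ... and randomly perturbed distributions never beat it.
random.seed(0)
for _ in range(1000):
    qs = [max(pi + random.uniform(-0.05, 0.05), 1e-6) for pi in p_opt]
    total = sum(qs)
    assert entropy_rate([q / total for q in qs], durations) <= C + 1e-9
```

This illustrates why maximizing the entropy per unit time (rather than the entropy itself) singles out an exponential dependence of pj on the reaction time τj, as in Eqs. (18) and (19).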
2.3 Fluctuation theorem holds in a single signal cascade
Thereafter, the transition probability of the j + 1th step given the jth step could be defined as P(j + 1|j), while the probability of the jth step given the j + 1th step was defined as P(j|j + 1), using the duration ratios as follows:
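The displayed definitions are missing here. One form consistent with the total step duration Xjτj − Xj*τj* used in Eq. (27) would be (an assumption, not the paper's displayed equation):

```latex
P(j+1\mid j) = \frac{X_j\tau_j}{X_j\tau_j - X_j^{*}\tau_j^{*}},\qquad
P(j\mid j+1) = \frac{-X_j^{*}\tau_j^{*}}{X_j\tau_j - X_j^{*}\tau_j^{*}},
```

so that the two probabilities sum to one (recall τj* < 0).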
We then introduced a logarithmic function of the jth step as:
Substituting Eqs. (18), (19), (23) and (24) into (25), gave a function zj:
We then took the average of zj over the reaction time Xjτj − Xj*τj*, which represented the total duration of the jth step:
In many biochemical reactions, such as the information transmission reactions involving EGFR-related cascades and mitogen-activated protein kinases (MAPKs), tj = τj − τj* was anticipated to be sufficiently long [7,8,9,10]. Based on experimental data, |τj*| is longer than several hours, whereas τj is a few minutes; therefore, τj/|τj*| < 0.05, and 1/(Xjτj − Xj*τj*) is sufficiently small [11, 12]. Consequently, the second term on the right-hand side of Eq. (27) is, in the limit operation, of magnitude 0.01 or smaller and can be neglected. Accordingly,
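An order-of-magnitude check of this approximation, using values chosen inside the ranges quoted above (modification times of a few minutes, de-modification times of several hours; the specific numbers are hypothetical):

```python
# Order-of-magnitude check of the approximation used for Eq. (27).
# tau_j* is negative by the sign convention adopted in the model.
tau_fwd = 3.0            # tau_j in minutes (hypothetical)
tau_rev = -4.0 * 60.0    # tau_j* in minutes (hypothetical)

ratio = tau_fwd / abs(tau_rev)
print(ratio)  # 0.0125, well below the 0.05 bound quoted in the text
```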
Therefore, we obtained:
tj = τj −τj* represents the duration of the sum of the modification and de-modification duration of a signal cascade trajectory, and the sum of zj gave:
with
Therefore, b equals the entropy per time t taken from the beginning to the end of a single cascade. We then obtained Eq. (32) from Eqs. (25), (29), and (30):
In the above, we used |τj/τj*| < 0.05. Therefore, Eq. (33) could be rewritten as:
And finally:
In the above, we replaced K with kB. Here, we set Smax /X* to smax. In this way, the logarithm of the forward-to-reverse transition ratio per reaction time is equal to the entropy. Considering that sj max represents the maximum entropy production at the jth step per signal transduction event in producing a single molecule of the active form X*, we obtained the following Eq. (37) by identifying sj max with the heat production ΔQj (1 ≤ j ≤ n) in the modification at the jth signal transduction step:
Here, T represents the system temperature. Equation (37) shows that, once enough time has passed, the signalling amount approaches its maximum value. This equation satisfies the detailed balance condition [13].
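Equation (37) itself is not reproduced in this version. A form consistent with the surrounding statements (the transition-ratio logarithm per reaction time equals the maximum entropy production, identified with the heat production) would be (a reconstruction):

```latex
\frac{k_B}{t_j}\,\ln\frac{P(j+1\mid j)}{P(j\mid j+1)}
   \simeq s_{j,\max} = \frac{\Delta Q_j}{T},
\qquad 1 \le j \le n .
```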
2.4 Path integral of signal transduction
The time-course scheme of signal transduction along the step-by-step trajectory may include forward and backward fluctuations. Therefore, the path and reverse path of signal transduction were introduced, respectively, as:
where π(+) and π(−) denote the probabilities that the signal transduction and the reverse signal transduction occur in the given signal transduction system, respectively. π(t0) and π(tn) denote the probabilities at t = t0 (the start of the transduction) and t = tn (the start of the reverse transduction). Taking the logarithms of Eqs. (38) and (39), we had:
Suppose that the entropy production Δs follows the probability distribution P(Δs) while taking values close to the maximum value smax. In the above, we regarded the negative logarithms of π(t0) and π(tn) as entropies s′ and set log π(t0) − log π(tn) = Δs′/kB. By taking the difference between the right- and left-hand sides of (36), together with the transition probabilities π(+) and π(−) from Eqs. (40) and (41), we obtained:
In the above, we set Δs/kB = Δs′/kB + ∑j log P(j + 1|j) − ∑j log P(j|j + 1). When integrating the product along the transduction cascade path, the relationship between the probability distribution P(Δs) for the transduction trajectory path and P(−Δs) for the reverse trajectory path (taking the opposite entropy −Δs) was given by the following equation [14,15,16,17]:
Therefore, we obtained a form of the FT in terms of signal transduction.
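The FT form P(Δs)/P(−Δs) = exp(Δs/kB) can be illustrated with a minimal Monte Carlo sketch. The biased-walk model below is purely illustrative (it is not the paper's signal cascade): each forward step produces entropy s1 = ln(p/(1−p)), each backward step −s1, so a trajectory's entropy production is Δs = (n_fwd − n_bwd)·s1, and the histogram ratio should follow the FT with kB = 1:

```python
import math
import random
from collections import Counter

# Toy Monte Carlo check of P(+ds)/P(-ds) = exp(ds/kB), with kB = 1.
# A trajectory is n independent steps; forward with probability p.
p = 0.6
s1 = math.log(p / (1.0 - p))   # entropy produced by one forward step
n, trials = 10, 400_000

random.seed(1)
hist = Counter()
for _ in range(trials):
    k = sum(1 if random.random() < p else -1 for _ in range(n))
    hist[k] += 1               # trajectory entropy production is ds = k * s1

# Empirical log[P(ds)/P(-ds)] should track ds = k * s1.
for k in (2, 4):
    print(k, round(math.log(hist[k] / hist[-k]), 3), round(k * s1, 3))
```

For this model the identity is exact in distribution, so the empirical and predicted values agree up to sampling noise.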
In conclusion, Eq. (44) formulates the ratio of the probability distribution function of a signal transduction event (information gain) and the rare reverse signal transduction event (information loss).
3 Discussion
This study considered a chain reaction in signal transduction as a model of a code sequence in terms of information science [2]. We modelled the chain reaction using two types of signal molecules, an inactive form Xj and an active form Xj*, which is a type of binary coding [18]. First, Eqs. (18) and (19) were derived from the viewpoint of source coding in information theory, and both equations describe entropy coding. Second, the channel capacity in Eq. (22) was given as a form of the entropy-time average, which is essential in quantifying signal transduction. Third, we obtained a new form of the FT in Eq. (44). Thus, the chain-reaction model of signal transduction provides a unified understanding of thermodynamic and informational entropy. Below is an overview of the features of this model.
3.1 Unidirectionality of signalling
So far, the duration and direction of signal transduction have rarely been considered in systems biology studies. One of the novel points of the current study is that it accounts for the code length and direction of signal transduction. In this study, the unidirectionality of signalling was introduced into the framework through the significantly longer reverse time compared with the forward time. The irreversibility of signal transduction could be estimated from Eqs. (25–28). Besides, we assigned a negative sign to the duration of the reverse transduction, τj* < 0. This negative duration expresses the loss of information carried by signal transduction. As a result, we succeeded in expressing this irreversibility in terms of information science. In addition, the mobile flow and oscillation waves of slime moulds and bacteria are well-known models of information transmission by biological populations and of natural computing. The presented model may be adapted to the interpretation of such models in the future [19].
3.2 Application of the fluctuation theorem (FT) to cell biology and information science
FT is a significant achievement in thermodynamics and has been applied to the study of nonequilibrium systems [20, 21], membrane transport [22], and molecular machines [23]. FT provides a general thermodynamic framework that encompasses the second law of thermodynamics, the dissipation theorem, and Onsager's reciprocity relations [24, 25]. Recently, biophysical applications have been further developed [26, 27]. This study aimed to interpret the FT in terms of information theory. It is not necessarily obvious whether our formulation can be extended to other biological systems, and more detailed analyses based on information thermodynamics are still required. As for other quantification frameworks of signal transduction, we have proposed several quantifications based on information entropy [3, 4], queueing theory [28], and nonlinear thermodynamics [29]. These theoretical frameworks may be closely linked, and their relationship will be the object of future biophysical work (Table 1).
In conclusion, a code string model of biochemical chain reaction can be used to analyse information transmission. Our model suggests measuring cell information transmission or signal transduction capacity and presents a possible application of FT for analysing biochemical information transmission.
Data availability
Not applicable.
References
L. Brillouin, Science and information theory, 2nd edn. Dover Publication Inc, New York, 2013.
C. Waltermann, E. Klipp, Information theory based approaches to cellular signaling. Biochim Biophys Acta 1810, 924–932 (2011)
T. Tsuruyama, The conservation of average entropy production rate in a model of signal transduction. Information thermodynamics based on the fluctuation theorem. Entropy, 20 (2018).
K. Kiso-Farnè, T. Tsuruyama, Epidermal growth factor receptor cascade prioritizes the maximization of signal transduction. Sci. Rep. 12, 16950 (2022).
C.E. Shannon, A mathematical theory of communication. Bell Syst Tech J 27, 379–423 (1948).
S. Ito, T. Sagawa, Information thermodynamics on causal networks. Phys. Rev. Lett. 111, 180603 (2013).
H. Wang, J. J. Ubl, R. Stricker, G. Reiser, Thrombin (PAR-1)-induced proliferation in astrocytes via MAPK involves multiple signaling pathways. Am. J. Physiol. Cell. Physiol. 283, C1351–64 (2002).
X.S. Yang, S. Liu, Y. J. Liu, J. W. Liu, T. J. Liu, X. Q. Wang, Q. Yan, Overexpression of fucosyltransferase IV promotes A431 cell proliferation through activating MAPK and PI3K/Akt signaling pathways. J. Cell. Physiol. 225, 612–619 (2010).
M. Zumsande, T. Gross, Bifurcations and chaos in the MAPK signaling cascade. J. Theor. Biol. 265, 481–491 (2010).
X. Xin, L. Zhou, C.M. Reyes, F. Liu, L. Q. Dong, APPL1 mediates adiponectin-stimulated p38 MAPK activation by scaffolding the TAK1-MKK3-p38 MAPK pathway. Am. J. Physiol-Endoc M. 300, E103–E10 (2011).
D.R. Newman, C.M. Li, R. Simmons, J. Khosla, P.L. Sannes, Heparin affects signaling pathways stimulated by fibroblast growth factor-1 and-2 in type II cells. Am. J. Physiol-Lung C 287, L191–L200 (2004).
M. Petropavlovskaia, J. Daoud, J. Zhu, M. Moosavi, J. Ding, J. Makhlin, B. Assouline-Thomas, L. Rosenberg, Mechanisms of action of islet neogenesis-associated protein. comparison of the full-length recombinant protein and a bioactive peptide. Am. J. Physiol. Endocrinol. Metab. 303, E917–E927 (2012).
T. Van Vu, V.T. Vo, Y. Hasegawa, Entropy production estimation with optimal current. Phys. Rev. E 101, 042138 (2020).
U. Seifert, Entropy Production along a stochastic trajectory and an integral fluctuation theorem. Phys. Rev. Lett. 95(4), 040602 (2005)
N. Shiraishi, K. Funo, K. Saito, Speed limit for classical stochastic processes. Phys. Rev. Lett. 121, 070601 (2018).
S. Yamamoto, S. Ito, N. Shiraishi, T. Sagawa, Linear irreversible thermodynamics and Onsager reciprocity for information-driven engines. Phys. Rev. E. 94, 052121 (2016).
Y. Hasegawa, T. Van Vu, Fluctuation theorem uncertainty relation. Phys. Rev. Lett. 123, 110602 (2019).
P. Dasgupta, S. Rastogi, S. Pillai, D. Ordonez-Ercan, M. Morris, E. Haura, S. Chellappan, Nicotine induces cell proliferation by beta-arrestin-mediated activation of Src and Rb-Raf-1 pathways. J Clin Invest 116, 2208–2217 (2006)
Z. Asghar, W. Shatanawi, S. Hussain, Biomechanics of bacterial gliding motion with Oldroyd-4 constant slime. Euro Phys J Spec Topics (2022). https://doi.org/10.1140/epjs/s11734-022-00723-2
M. Ponmurugan, Generalized detailed fluctuation theorem under nonequilibrium feedback control. Phys Rev E Stat Nonlin Soft Matter Phys 82, 031129 (2010)
G.M. Wang, J.C. Reid, D.M. Carberry, D.R. Williams, E.M. Sevick, D.J. Evans, Experimental study of the fluctuation theorem in a nonequilibrium steady state. Phys Rev E Stat Nonlin Soft Matter Phys 71, 046142 (2005)
A.M. Berezhkovskii, S.M. Bezrukov, Fluctuation theorem for channel-facilitated membrane transport of interacting and noninteracting solutes. J Phys Chem B 112, 6228–6232 (2008)
U. Seifert, Stochastic thermodynamics, fluctuation theorems and molecular machines. Reports Progress Phys Phys Soc 75, 126001 (2012)
G.E. Crooks, Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics 60, 2721–2726 (1999)
P. Gaspard, Fluctuation theorem for nonequilibrium reactions. J Chem Phys 120, 8898–8905 (2004)
D. Collin, F. Ritort, C. Jarzynski, S.B. Smith, I. Tinoco Jr., C. Bustamante, Verification of the Crooks fluctuation theorem and recovery of RNA folding free energies. Nature 437, 231–234 (2005)
T. Sagawa, Y. Kikuchi, Y. Inoue, H. Takahashi, T. Muraoka, K. Kinbara, A. Ishijima, H. Fukuoka, Single-cell E. coli response to an instantaneously applied chemotactic signal. Biophys J. 107:730–739 (2014).
T. Tsuruyama, Kullback–Leibler divergence of an open-queuing network of a cell-signal-transduction cascade. Entropy 25(2), 326 (2023)
T. Tsuruyama, Nonlinear thermodynamics of biological signal transduction for predicting conservation of entropy production rate. J Theor Biol 523, 110716 (2021)
Acknowledgements
This research was funded by a Grant-in-Aid from the Ministry of Education, Culture, Sports, Science, and Technology of Japan (Synergy of Fluctuation and Structure: Quest for Universal Laws in Nonequilibrium Systems, P2013-201 Grant-in-Aid for Scientific Research on Innovative Areas, MEXT, Japan). This study was supported by the Kitano Medical Institute and the Radiation Effects Research Foundation, a public interest foundation funded by the Japanese Ministry of Health, Labour, and Welfare and the US Department of Energy. The views of the author do not necessarily reflect those of the two governments.
Author information
Contributions
TT: Conceptualization, Methodology, Visualization, and Writing—original draft.
Ethics declarations
Conflicts of interest
Not applicable.
Ethical approval
Not applicable.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Tsuruyama, T. Thermodynamics of signal transduction systems and fluctuation theorem in a signal cascade. Eur. Phys. J. Plus 138, 269 (2023). https://doi.org/10.1140/epjp/s13360-023-03850-4