
Minimum relative entropy theory for streamflow forecasting with frequency as a random variable

  • Original Paper
  • Published in: Stochastic Environmental Research and Risk Assessment

Abstract

This paper develops a minimum relative entropy theory with frequency as a random variable, henceforth called MREF, for streamflow forecasting. The MREF theory consists of three main components: (1) determination of spectral density, (2) determination of parameters by cepstrum analysis, and (3) extension of the autocorrelation function. MREF is robust at determining the main periodicity and provides higher-resolution spectral density. The theory is evaluated using monthly streamflow observed at 20 stations in the Mississippi River basin, where forecasted monthly streamflows show a coefficient of determination (r²) of 0.876, which is slightly higher in the Upper Mississippi (r² = 0.932) than in the Lower Mississippi (r² = 0.806). Comparison of different priors shows that a prior with a background spectral density peaking at the 1/12 frequency provides satisfactory accuracy and can be used to forecast monthly streamflow with limited information. Four entropy theories are compared, and it is found that minimum relative entropy has an advantage over maximum entropy (ME) for both spectral estimation and streamflow forecasting when additional information is given as a prior. Moreover, parameter estimation by cepstrum analysis is found to be more convenient in MREF than in minimum relative entropy with spectral power as the random variable (MRES), and less information is needed to assume the prior. In general, the reliability of monthly streamflow forecasting ranks, from highest to lowest: MREF, MRES, configuration entropy (CE), Burg entropy (BE), and the autoregressive method (AR).


References

  • Alsaka YA, Tzannes NS, Marinelli WA (1988) An efficient algorithm for implementing the relative entropy method. Acoust Speech Signal Process 11–14(1988):2384–2387. doi:10.1109/ICASSP.1988.197120

  • Box GEP, Jenkins GM (1970) Time series analysis: forecasting and control. Holden-Day series in time series analysis. Holden-Day, San Francisco

  • Burr RL, Lytle DW (1986) A general method of minimum cross-entropy spectral estimation: comments. IEEE Trans Acoust Speech 34:1324–1326

  • Cui H, Singh VP (2015) Configurational entropy theory for streamflow forecasting. J Hydrol 521:1–17. doi:10.1016/j.jhydrol.2014.11.065

  • Cui H, Singh VP (2016a) Application of minimum relative entropy for streamflow forecasting. Stoch Environ Res Risk Assess (under review)

  • Cui H, Singh VP (2016b) Maximum entropy spectral analysis for streamflow forecasting. Phys A 442:91–99. doi:10.1016/j.physa.2015.08.060

  • Feng X, Porporato A, Rodriguez-Iturbe I (2013) Changes in rainfall seasonality in the tropics. Nat Clim Change 3:811–815. doi:10.1038/Nclimate1907

  • Frieden BR (1972) Restoring with maximum likelihood and maximum entropy. J Opt Soc Am 62:511

  • Girardin V (2001) Relative entropy and covariance type constraints yielding ARMA models. In: Bayesian inference and maximum entropy methods in science and engineering: 20th international workshop. AIP Publishing, Melville, vol 1, pp 318–327

  • Gull SF, Daniell GJ (1978) Image reconstruction from incomplete and noisy data. Nature 272:686–690

  • Hipel KW, McLeod AI (1994) Time series modelling of water resources and environmental systems. Develop Water Sci 45:555–572

  • Johnson RW, Shore JE (1983) Which is better entropy expression for speech processing: SLogS or LogS? Naval Research Laboratory, Washington, DC

  • Katsakos-Mavromichalis NA, Tzannes MA, Tzannes NS (1985) Frequency resolution: a comparative study of four entropy methods. Kybernetes 15:25–32

  • Krstanovic PF, Singh VP (1991) A univariate model for long-term streamflow forecasting. 2. Application. Stoch Hydrol Hydraul 5:189–205

  • Kullback S (1959) Information theory and statistics. Wiley, New York

  • Liefhebber F, Boekee DE (1987) Minimum information spectral analysis. Signal Process 12:243–255

  • Nadeu C (1992) Finite length cepstrum modeling: a simple spectrum estimation technique. Signal Process 26:49–59

  • Nadeu C, Sanvicente E, Bertran MA (1981) A new algorithm for spectral estimation. In: International conference on digital signal processing, Florence, Italy, pp 463–470

  • Oppenheim AV, Schafer RW (1975) Digital signal processing. Prentice-Hall, Englewood Cliffs, NJ

  • Oppenheim AV, Schafer RW (2004) From frequency to quefrency: a history of the cepstrum. IEEE Signal Proc Mag 21:95

  • Papademetriou RC (1998) Experimental comparison of two information-theoretic spectral estimators. In: Signal processing proceedings, ICSP '98, fourth international conference, pp 141–144. doi:10.1109/ICOSP.1998.770170

  • Shore JE (1979) Minimum cross-entropy spectral analysis. Naval Research Laboratory, Washington, DC

  • Shore JE (1981) Minimum cross-entropy spectral analysis. IEEE Trans Acoust Speech 29:230–237

  • Tzannes MA, Politis D, Tzannes NS (1985) A general method of minimum cross-entropy spectral estimation. IEEE Trans Acoust Speech 33:748–752

  • Wu NL (1983) An explicit solution and data extension in the maximum-entropy method. IEEE Trans Acoust Speech 31:486–491
Author information

Correspondence to Huijuan Cui.

Appendices

Appendix 1: Cepstrum analysis

The cepstrum is defined as the inverse Fourier transform of the log-magnitude of the Fourier spectrum. For a given streamflow time series y(t), the cepstrum can be computed in the following steps.

First, taking the Fourier transform of the original series y(t), one obtains

$$ Y(f) = \sum\limits_{n = - \infty }^{\infty } {y(n)e^{ - 2\pi nif} } $$
(26)

where Y(f) is the Fourier transform of y(t). Taking the inverse Fourier transform of the log-magnitude of Eq. (26) one obtains the cepstrum of the Fourier transform as

$$ C(n) = \frac{1}{2\pi }\int\limits_{ - \pi }^{\pi } {\log \left| {Y(f)} \right|e^{2\pi nif} df} $$
(27)

It is known that the Fourier transform of the autocorrelation function gives the spectral density:

$$ q(f) = \sum\limits_{n = - \infty }^{\infty } {\rho (n)e^{ - 2\pi nif} } $$
(28)

Thus, similar to Eq. (26), the cepstrum of the autocorrelation can be defined by the inverse Fourier transform of the log-magnitude of q(f), which yields

$$ e(n) = \frac{1}{2\pi }\int\limits_{ - \pi }^{\pi } {\log \left| {q(f)} \right|e^{2\pi nif} df} $$
(29)

By definition, the spectral density can also be written as

$$ q(f) = \{ FT[y(t)]\}^{2} $$
(30)

Thus, the following relationship between the cepstrum and the cepstrum of the autocorrelation can be obtained:

$$ e(n) = \frac{1}{2\pi }\int\limits_{ - \pi }^{\pi } {\log \left| {FT[y(t)]} \right|^{2} e^{2\pi nif} df} = \frac{1}{2\pi }\int\limits_{ - \pi }^{\pi } {2\log \left| {Y(f)} \right|e^{2\pi nif} df} = 2C(n) $$
(31)
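The chain of transforms above can be sketched numerically. The following is a minimal illustration (a hypothetical monthly-like series with an annual cycle, not the authors' data), using numpy's FFT as the discrete analogue of Eqs. (26)–(31), and checking that the cepstrum of the spectral density equals twice the cepstrum of the series:

```python
import numpy as np

# Hypothetical streamflow-like series: mean level, annual cycle, noise
# (illustration only; not the data used in the paper).
rng = np.random.default_rng(0)
N = 240
t = np.arange(N)
y = 10.0 + np.cos(2 * np.pi * t / 12) + 0.3 * rng.standard_normal(N)

# Eq. (26): Fourier transform of the series
Y = np.fft.fft(y)

# Eq. (27): cepstrum = inverse FT of the log-magnitude of the spectrum
C = np.fft.ifft(np.log(np.abs(Y))).real

# Eq. (30): spectral density as the squared magnitude of the transform
q = np.abs(Y) ** 2

# Eq. (29): cepstrum of the autocorrelation, via the spectral density
e = np.fft.ifft(np.log(q)).real

# Eq. (31): the two cepstra differ exactly by a factor of two
assert np.allclose(e, 2 * C)
```

The factor of two falls out because log q(f) = 2 log|Y(f)|, so the inverse transform is linear in that factor.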

Appendix 2: Cepstrum of data of finite length

Consider only the positive part of the autocorrelation function \( \rho (n) \), n > 0, and let e(n) be the cepstrum estimated from it, which is

$$ e(n) = \frac{1}{2\pi }\int\limits_{ - \pi }^{\pi } {\log \left| {p*(f)} \right|e^{2\pi nif} df} $$
(32)

where the spectral density p*(f) is obtained by the Fourier transform of the positive half of \( \rho (n) \), n > 0. Note that p*(f) is analytic. Using only the positive part of \( \rho (n) \) ensures that \( \rho (n) \) is a minimum-phase function, and for a minimum-phase system the input and output are uniquely determined; hence e(n) can be uniquely determined from \( \rho (n) \). Let us define a two-sided output such that

$$ \hat{\rho }(n) = \begin{cases} 2e(n), & n > 0 \\ e(0), & n = 0 \\ 2e( - n), & n < 0 \end{cases} $$
(33)

In such a way, \( \hat{\rho }(n) \) can also be uniquely determined by \( \rho (n) \) and vice versa.

Since p*(f) is analytic, \( \log p*(f) \) can also be considered analytic. In this case, following Oppenheim and Schafer (1975), the derivatives of the z transforms of \( \hat{\rho }(n) \) and ρ(n) satisfy

$$ \hat{\rho }^{\prime}(z) = \frac{{\rho^{\prime}(z)}}{\rho (z)} $$
(34)

which is equivalent to

$$ z\hat{\rho }^{\prime}(z) = \sum\limits_{n = - \infty }^{\infty } {[ - n\hat{\rho }(n)]z^{ - n} } = \frac{{z\rho^{\prime}(z)}}{\rho (z)} $$
(35)

The following difference equation can be obtained from Eq. (35):

$$ z\rho^{\prime}(z) = z\hat{\rho }'(z)\rho (z) $$
(36)

Taking the inverse z transform of Eq. (36), one obtains

$$ n\rho (n) = \sum\limits_{k = - \infty }^{\infty } {k\hat{\rho }(k)\rho (n - k)} $$
(37)

Dividing Eq. (37) by n, the relationship between input and output becomes

$$ \rho (n) = \sum\limits_{k = - \infty }^{\infty } {\left( {\frac{k}{n}} \right)\hat{\rho }(k)\rho (n - k)} $$
(38)

Transforming Eq. (38) with the use of Eq. (33), the autocorrelation function can be obtained from the following recursive formula (Oppenheim and Schafer 1975):

$$ \rho (n) = \begin{cases} \dfrac{\rho (0)}{2}e(n) + \sum\limits_{k = 1}^{n - 1} {\dfrac{k}{n}e(k)\rho (n - k)}, & n > 0 \\ \exp e(n), & n = 0 \\ \rho ( - n), & n < 0 \end{cases} $$
(39)

On the other hand, the cepstrum e(n) can be obtained from the reverse relation of Eq. (39) as:

$$ e(n) = \begin{cases} \dfrac{2}{\rho (0)}\left[ {\rho (n) - \sum\limits_{k = 1}^{n - 1} {\dfrac{k}{n}e(k)\rho (n - k)} } \right], & n > 0 \\ \log \rho (n), & n = 0 \\ 0, & n < 0 \end{cases} $$
(40)
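The pair of recursions in Eqs. (39) and (40) are exact inverses of each other, which can be checked numerically. The following is a minimal sketch using a hypothetical AR(1)-type autocorrelation (an assumption for illustration; any positive-definite autocorrelation with ρ(0) > 0 would do):

```python
import numpy as np

def cepstrum_from_acf(rho):
    """Eq. (40): cepstrum e(n) from the autocorrelation rho(n), n >= 0."""
    N = len(rho)
    e = np.zeros(N)
    e[0] = np.log(rho[0])
    for n in range(1, N):
        s = sum((k / n) * e[k] * rho[n - k] for k in range(1, n))
        e[n] = (2.0 / rho[0]) * (rho[n] - s)
    return e

def acf_from_cepstrum(e):
    """Eq. (39): reconstruct rho(n) recursively from the cepstrum e(n)."""
    N = len(e)
    rho = np.zeros(N)
    rho[0] = np.exp(e[0])
    for n in range(1, N):
        s = sum((k / n) * e[k] * rho[n - k] for k in range(1, n))
        rho[n] = 0.5 * rho[0] * e[n] + s
    return rho

# Hypothetical AR(1)-like autocorrelation, rho(n) = 0.8**n
rho = 0.8 ** np.arange(10)
e = cepstrum_from_acf(rho)
assert np.allclose(acf_from_cepstrum(e), rho)  # the recursions are inverses
```

Because each recursion only uses e(k) and ρ(n − k) for k < n, both directions run in a single forward pass; this is what makes the cepstrum convenient for extending the autocorrelation function in MREF.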

Cite this article

Cui, H., Singh, V.P. Minimum relative entropy theory for streamflow forecasting with frequency as a random variable. Stoch Environ Res Risk Assess 30, 1545–1563 (2016). https://doi.org/10.1007/s00477-016-1281-z