
Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review

  • Original Paper
  • Published:
Theoretical and Applied Climatology

Abstract

Since the middle of the twentieth century, artificial intelligence (AI) models have been widely used in engineering and science problems. Modeling and predicting water resource variables are among the most challenging issues in water engineering. The artificial neural network (ANN) is a common approach for tackling this problem with viable and efficient models, and numerous ANN models have been developed to achieve more accurate results. In the current review, ANN models applied to water resources and hydrological variable prediction are reviewed and outlined. In addition, recent hybrid models, including their structures, input preprocessing, and optimization techniques, are discussed, and the results are compared with those of similar previous studies. Moreover, to provide a comprehensive view of the literature, many articles that apply ANN models together with other techniques are included. Consequently, the coupling procedure, model evaluation, and performance of hybrid models relative to conventional ANN models are assessed, as well as the taxonomy and structure of hybrid ANN models. Finally, current challenges and recommendations for future research are indicated, and new hybrid approaches are proposed.
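The coupling procedure the abstract refers to (preprocess the input series, model each component with a submodel, recombine the component predictions) can be sketched minimally. This is an illustrative assumption, not code from the review: least-squares autoregressive submodels stand in for ANNs, and a moving-average decomposition stands in for more elaborate transforms such as wavelets.

```python
import numpy as np

def moving_average(x, w=5):
    """Centered moving average; x minus this trend is the 'detail' component."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def fit_ar(series, lags=3):
    """Least-squares autoregressive model (a stand-in for an ANN submodel)."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    coef, *_ = np.linalg.lstsq(X, series[lags:], rcond=None)
    return coef

def predict_ar(series, coef, lags=3):
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    return X @ coef  # predictions aligned with series[lags:]

# Synthetic "streamflow": seasonal signal plus noise.
rng = np.random.default_rng(0)
t = np.arange(200)
flow = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(200)

# Hybrid scheme: decompose, model each component separately, recombine.
trend = moving_average(flow)
resid = flow - trend
pred = predict_ar(trend, fit_ar(trend)) + predict_ar(resid, fit_ar(resid))
rmse_hybrid = np.sqrt(np.mean((pred - flow[3:]) ** 2))

# Conventional baseline: one model fitted to the raw series.
pred_raw = predict_ar(flow, fit_ar(flow))
rmse_raw = np.sqrt(np.mean((pred_raw - flow[3:]) ** 2))
```

In the studies reviewed, the decomposition would typically be a wavelet transform or clustering step and the submodels would be ANNs, but the serial coupling structure is the same.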


Figures 1–6 appear in the full text.


Abbreviations

AAE: Average absolute error
AARE: Average absolute relative error
ACCDIFF: Accumulated difference
AIC: Akaike information criterion
ANFIS: Adaptive neuro-fuzzy inference system
ANN: Artificial neural network
APE: Average percentage error
AR: Autoregressive
ARE: Absolute relative error
ARIMA: Autoregressive integrated moving average
ARIMAX: Autoregressive integrated moving average with exogenous input
ARMA: Autoregressive moving average
ARV: Average relative variance
ATNN: Adaptive time-delay neural network
ATp: Error of the time of arrival of the peak
BANN: Bootstrapped artificial neural network
BIC: Bayesian information criterion
BP: Back-propagation
BPNN: Back-propagation neural network
C: Coefficient
CANN: Cluster-based ANN
CG: Conjugate gradient
CM: Confusion matrix
CNNs: Computational neural networks
d: Index of agreement
d1: Adjusted index of agreement
DBP: Division-based BP
E: Nash-Sutcliffe efficiency
E1: Modified coefficient of efficiency
EANN: Evolutionary artificial neural network
EKFQ: Extended Kalman filtering
ENN: Ensemble neural networks
EQp%: Error of peak discharge
Fc: Fuzzy partition coefficient
FCM: Fuzzy c-means
FE: Forecasting error
FFBP: Feed-forward back-propagation
FFNN: Feed-forward neural networks
GA: Genetic algorithm
GEP: Gene expression programming
GP: Genetic programming
GRNN: Generalized regression neural networks
IDNN: Input delay neural network
IGANFIS: Integrated geomorphological adaptive neuro-fuzzy inference system
IVF: Index of volumetric fit
JNN: Jordan recurrent neural network
KNNs/K-nn: K-nearest neighbor
LLR: Local linear regression model
LM: Levenberg-Marquardt
LMBP: Levenberg-Marquardt back-propagation
MA: Moving average
MAE: Mean absolute error
MA-ANN: Moving average artificial neural networks
MANN: Modular artificial neural network
MAPE: Mean absolute percentage error
MARE: Mean absolute relative error
MAXAE: Maximum absolute error
MBE: Mean bias error
ME: Mean error
MINAE: Minimum absolute error
MLP: Multilayer perceptron
MLPNN: Multilayer perceptron neural network
MRE: Mean relative error
MSE: Mean squared error
NormBIC: Normalized Bayesian information criterion
NMBE: Normalized mean bias error
NMSE: Normalized mean squared error
NRMSE: Normalized root mean squared error
NSC: Nash-Sutcliffe coefficient
PANN: Periodic ANN
PCA: Principal component analysis
PE: Relative peak error
PI: Coefficient of persistence index
r: Pearson coefficient of correlation
R: Correlation coefficient
R²: Coefficient of determination
RAEp(%): Ratio of absolute error of peak flow
RBF: Radial basis function
RBFNN: Radial basis function neural network
R-Bias: Relative bias
RE: Maximum relative error
RME: Relative mean error
RMSE: Root mean squared error
R-RMSE: Relative root mean squared error
RT: Regression trees
RTRL: Real-time recurrent learning
S: Slope
S2d: Variance of the distribution of differences with MBE
SARIMAX: Seasonal autoregressive integrated moving average with exogenous input
SE: Standard error
SEE: Standard error of estimate
%SEP: Percent standard error of prediction
SI: Scatter index
SOFNN: Self-organizing fuzzy neural networks
SOM: Self-organizing map
SONO: Self-organizing nonlinear output map
SORB: Self-organizing radial basis
SS: Skill score
SSA: Singular spectrum analysis
SSE: Sum of squared errors
SSNN: State space neural network
SVM: Support vector machine
SWMM: Storm water management model
TANN: Threshold-based ANN
TDRNN: Time-delay recurrent neural network
TNN: Tevere neural network
TS: Threshold statistic
VER%: Error of total discharge volume
WNN: Wavelet neural network
WT: Wavelet transform
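Many of the abbreviations above are standard goodness-of-fit metrics used to compare hybrid and conventional ANN models. As a hedged illustration (using the metrics' standard textbook definitions, not code from the review), the most common ones can be computed as:

```python
import numpy as np

def evaluate(obs, sim):
    """Common hydrological goodness-of-fit metrics (standard definitions)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    err = sim - obs
    metrics = {
        "ME":   err.mean(),                  # mean error (bias)
        "MAE":  np.abs(err).mean(),          # mean absolute error
        "MSE":  (err ** 2).mean(),           # mean squared error
        "RMSE": np.sqrt((err ** 2).mean()),  # root mean squared error
        # Nash-Sutcliffe efficiency: 1 means perfect fit, 0 means no
        # better than predicting the observed mean.
        "E":    1 - (err ** 2).sum() / ((obs - obs.mean()) ** 2).sum(),
        "R":    np.corrcoef(obs, sim)[0, 1], # correlation coefficient
    }
    metrics["R2"] = metrics["R"] ** 2        # coefficient of determination
    return metrics
```

For example, `evaluate(observed_flow, simulated_flow)` applied to the hybrid and the conventional model's outputs gives directly comparable RMSE and Nash-Sutcliffe scores.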


Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Farzad Fahimi.

About this article


Cite this article

Fahimi, F., Yaseen, Z.M. & El-shafie, A. Application of soft computing based hybrid models in hydrological variables modeling: a comprehensive review. Theor Appl Climatol 128, 875–903 (2017). https://doi.org/10.1007/s00704-016-1735-8

