
A benchmark-based method for evaluating hyperparameter optimization techniques of neural networks for surface water quality prediction

  • Research Article
  • Published in: Frontiers of Environmental Science & Engineering

Abstract

Neural networks (NNs) have been used extensively in surface water quality prediction tasks, driven by improvements in computing algorithms and the accumulation of data. An essential step in developing an NN is hyperparameter selection. In practice, hyperparameters are commonly determined manually in studies applying NNs to water resources tasks, which can introduce considerable randomness and require substantial computation time; hyperparameter optimization (HPO) is therefore essential. This study applied five representative HPO techniques to surface water quality prediction tasks: grid sampling (GS), random search (RS), the genetic algorithm (GA), and Bayesian optimization (BO) based on the Gaussian process (GP) and on the tree-structured Parzen estimator (TPE). To evaluate these techniques, this study proposed a benchmark-based method: the optimal hyperparameter value sets obtained by GS were taken as the benchmark, and the other HPO techniques were then evaluated against this benchmark in terms of convergence, optimization orientation, and consistency of the optimized values. The results indicated that the TPE-based BO algorithm is recommended because it yielded stable convergence, a reasonable optimization orientation, and the highest consistency rates with the benchmark values. The consistency rates achieved via TPE for the hyperparameters number of hidden layers, hidden dimension, learning rate, and batch size were 86.7%, 73.3%, 73.3%, and 80.0%, respectively. Unlike evaluations of HPO techniques based directly on the prediction performance of the optimized NN in a single HPO test, the proposed benchmark-based HPO evaluation approach is feasible and robust.
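As an illustration of how such an HPO run can be set up, the following is a minimal sketch, not the authors' implementation: it tunes the four hyperparameters named in the abstract (hidden layers, hidden dimension, learning rate, batch size) for a small feedforward regressor using TPE via the hyperopt library. The synthetic data, the candidate ranges, and the use of scikit-learn's MLPRegressor as a stand-in NN are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of TPE-based hyperparameter optimization (not the authors' code).
# Placeholder synthetic data stands in for surface water quality predictors/target.
import numpy as np
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))                              # placeholder predictors
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=500)    # placeholder target
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)

# Search space over the four hyperparameters named in the abstract;
# the candidate values below are assumptions, not the paper's grid.
space = {
    "hidden_layers": hp.choice("hidden_layers", [1, 2, 3]),
    "hidden_dim": hp.choice("hidden_dim", [16, 32, 64, 128]),
    "learning_rate": hp.loguniform("learning_rate", np.log(1e-4), np.log(1e-1)),
    "batch_size": hp.choice("batch_size", [16, 32, 64]),
}

def objective(params):
    # Train a small feedforward regressor with the sampled hyperparameters
    # and report validation MSE as the loss that TPE minimizes.
    model = MLPRegressor(
        hidden_layer_sizes=(params["hidden_dim"],) * params["hidden_layers"],
        learning_rate_init=params["learning_rate"],
        batch_size=params["batch_size"],
        max_iter=300,
        random_state=0,
    )
    model.fit(X_tr, y_tr)
    mse = mean_squared_error(y_va, model.predict(X_va))
    return {"loss": mse, "status": STATUS_OK}

trials = Trials()
best = fmin(objective, space, algo=tpe.suggest, max_evals=50, trials=trials)
print("Best hyperparameters found by TPE (choice values reported as indices):", best)
```

A GS benchmark in the paper's sense would instead exhaustively evaluate a fixed grid of such candidate values and keep the best-performing combination, which the other HPO techniques are then compared against.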



Acknowledgements

This study was financially supported by the National Key R&D Project (No. 2022YFC3203203) and the Shaanxi Province Science Fund for Distinguished Young Scholars (No. S2023-JC-JQ-0036).

Author information

Corresponding author

Correspondence to Jinsuo Lu.

Ethics declarations

Conflict of Interest: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Additional information

Highlights

• Manual adjustment of hyperparameters is highly random and computationally expensive.

• Five HPO techniques were implemented in surface water quality prediction NN models.

• The proposed benchmark-based method for HPO evaluation is feasible and robust (see the sketch after this list).

• TPE-based BO was the recommended HPO method for its satisfactory performance.
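The following is a minimal sketch of the benchmark-based consistency check, an assumed reconstruction rather than the authors' code: the per-hyperparameter optimum found by GS is treated as the benchmark, and the consistency rate of another HPO technique is computed as the fraction of repeated optimization runs that recover the benchmark value. The example benchmark and the hypothetical TPE runs below are placeholders, and exact-match comparison assumes a shared discrete candidate grid.

```python
# Sketch of the benchmark-based consistency rate (assumed, not the authors' code).
from typing import Dict, List

def consistency_rates(
    benchmark: Dict[str, object],
    repeated_runs: List[Dict[str, object]],
) -> Dict[str, float]:
    """Fraction of repeated HPO runs whose optimized value equals the GS benchmark value."""
    rates = {}
    for name, bench_value in benchmark.items():
        hits = sum(1 for run in repeated_runs if run.get(name) == bench_value)
        rates[name] = hits / len(repeated_runs)
    return rates

# Hypothetical GS benchmark and five hypothetical repeated TPE runs (values are made up).
gs_benchmark = {"hidden_layers": 2, "hidden_dim": 64, "learning_rate": 1e-3, "batch_size": 32}
tpe_runs = [
    {"hidden_layers": 2, "hidden_dim": 64, "learning_rate": 1e-3, "batch_size": 32},
    {"hidden_layers": 2, "hidden_dim": 32, "learning_rate": 1e-3, "batch_size": 32},
    {"hidden_layers": 2, "hidden_dim": 64, "learning_rate": 1e-2, "batch_size": 32},
    {"hidden_layers": 3, "hidden_dim": 64, "learning_rate": 1e-3, "batch_size": 16},
    {"hidden_layers": 2, "hidden_dim": 64, "learning_rate": 1e-3, "batch_size": 32},
]
print(consistency_rates(gs_benchmark, tpe_runs))
```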

Electronic supplementary material

11783_2024_1814_MOESM1_ESM.pdf

A benchmark-based method for evaluating hyperparameter optimization techniques of neural networks for surface water quality prediction


About this article


Cite this article

Wang, X., Dong, Y., Yang, J. et al. A benchmark-based method for evaluating hyperparameter optimization techniques of neural networks for surface water quality prediction. Front. Environ. Sci. Eng. 18, 54 (2024). https://doi.org/10.1007/s11783-024-1814-5

