A cooperative genetic algorithm based on extreme learning machine for data classification

Soft Computing

Abstract

Simultaneously optimizing the network structure and connection parameters of a single hidden layer feedforward neural network (SLFN) is a challenging task. The extreme learning machine (ELM) is a popular non-iterative learning method that often provides good generalization performance for an SLFN at extremely fast learning speed, yet only for a fixed network structure. In this work, a cooperative binary-real genetic algorithm (CGA) based on the ELM, called CGA-ELM, is proposed to adjust the structure and parameters of an SLFN simultaneously, so as to achieve a compact network with good generalization performance. In CGA-ELM, a hybrid coding scheme is designed to evolve the network structure and the input parameters, i.e., the input weights between input nodes and hidden nodes as well as the biases of hidden nodes. The output parameters, i.e., the output weights between hidden nodes and output nodes, are then determined analytically by the ELM. A combination of training error and network complexity serves as the fitness function to evaluate the performance of an SLFN. A binary GA is responsible for optimizing the network structure, while a real-coded GA and the ELM collaboratively optimize the network parameters. Experimental results on classification applications demonstrate that CGA-ELM significantly outperforms both CGA and ELM in terms of generalization ability, and that it is competitive with other state-of-the-art algorithms.
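To make the ELM step concrete: the abstract describes fixing random input weights and hidden biases, computing output weights analytically (via the Moore-Penrose pseudoinverse), and scoring a candidate network by a combination of training error and complexity. The sketch below illustrates these two ingredients only; it is not the authors' CGA-ELM implementation, and the sigmoid activation, one-hot target encoding, function names, and the fitness weight `alpha` are illustrative assumptions.

```python
import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    """ELM-style training of an SLFN: random input weights and hidden
    biases, output weights solved by least squares (pseudoinverse)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                             # output weights, H @ beta ~= T
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Predict class labels by the largest network output."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)

def fitness(train_error, n_active_hidden, max_hidden, alpha=0.9):
    """Hypothetical fitness: weighted sum of training error and a
    normalized network-complexity term (fraction of active hidden nodes)."""
    return alpha * train_error + (1.0 - alpha) * (n_active_hidden / max_hidden)
```

In the cooperative scheme the abstract outlines, a binary chromosome would switch hidden nodes on or off (setting `n_active_hidden`), a real-coded chromosome would supply `W` and `b` in place of the random draw, and the ELM least-squares step would fill in `beta` before each fitness evaluation.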


Data Availability

The data used to support the findings are cited within the article. The datasets generated and/or analysed during the current study are also available from the corresponding author on reasonable request.


Author information


Corresponding author

Correspondence to Hong Li.

Ethics declarations

Funding

This work was supported in part by the Natural Science Basic Research Program of Shaanxi (Program Nos. 2022JM-372 and 2022JQ-670), in part by the National Natural Science Foundation of China (Grant Nos. 61966030, 61772391, and 62106186), and in part by the Fundamental Research Funds for the Central Universities (Grant No. JB210701).

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Bai, L., Li, H., Gao, W. et al. A cooperative genetic algorithm based on extreme learning machine for data classification. Soft Comput 26, 8585–8601 (2022). https://doi.org/10.1007/s00500-022-07202-9
