Abstract
This paper presents a comprehensive survey of a recent population-based algorithm, the gradient-based optimizer (GBO), and analyzes its major features. GBO is regarded as one of the most effective optimization algorithms and has been applied successfully to problems in many domains. This review organizes the related work on GBO into GBO variants and GBO applications, and evaluates the efficiency of GBO compared with other metaheuristic algorithms. Finally, the conclusions summarize the existing work on GBO, discuss its disadvantages, and propose future work. The review will be helpful to researchers and practitioners of GBO from a wide range of fields, including optimization, engineering, medicine, data mining, and clustering, as well as research on health, the environment, and public safety. It will also aid interested readers by pointing them to potential future research.
1 Introduction
Optimization algorithms have been introduced based on the behaviors of various organisms [1, 2]; in other words, their ideas are nature-inspired. Figure 1 shows the main categories of optimization algorithms: (i) heuristic approaches, also known as single-solution-based methods, which include special heuristic methods such as Simulated Annealing (SA) [3] and Hill-Climbing (HC) [4]; and (ii) metaheuristic algorithms, also known as population-based methods [5, 6], which can easily adapt to different kinds of optimization problems through parameter tuning and modified operators. Metaheuristic algorithms are divided into four classes: (1) evolutionary algorithms (EAs), such as the Genetic Algorithm (GA) [7], Genetic Programming (GP) [8], and Differential Evolution (DE) [9]; (2) human-based algorithms, such as Tabu Search (TS) [10], Teaching-Learning-Based Optimization (TLBO) [11], and Socio-Evolution and Learning Optimization (SELO) [12]; (3) physics-based algorithms, such as Central Force Optimization (CFO) [13], the Gravitational Search Algorithm (GSA) [14], and Big Bang Big Crunch (BBBC) [15]; and (4) swarm-based algorithms, such as the Cuckoo Search Algorithm (CSA) [16], Moth Flame Optimization (MFO) [17], the gradient-based optimizer (GBO) [18], and others [19, 20].
Population-based algorithms are usually inspired by social insect colonies and animal societies [21]. They emulate the behavior of swarming social insects in seasonal migration and in searching for food or safety [22]. The main features of these methods are their robustness in reaching solutions and their flexibility in adapting to problems [23]. The GBO algorithm, proposed by Ahmadianfar et al. [18] in 2020, is an example of a modern metaheuristic population-based algorithm. GBO is inspired by the gradient-based Newton's method.
GBO uses two operators related to the gradient-based Newton's method: the local escaping operator (LEO), which focuses on the exploitation search, and the gradient search rule (GSR), which is employed to enhance the exploration search. Consequently, GBO can deal efficiently with various research problems in health, the environment, and public safety, as well as problems in different fields such as image processing [24, 25], power energy [26], engineering [27, 28], and medicine [29].
The aim of this review is to provide a comprehensive synopsis of the related works on GBO applications in different fields and of how they find optimal solutions to various problems. Moreover, it highlights the challenges and possible directions for future work. The review is thus divided into two main sections: (i) the variants of GBO, which include binary, discrete, multi-objective, modified, and hybridized GBO; and (ii) applications of GBO, which include economic, energy, engineering, and medical applications. The following points summarize the content of this review.
-
An overview of the main concepts of the basic GBO to highlight the main strengths, weaknesses, and procedures of the algorithm.
-
An illustration of the related works on GBO, classified into variants (e.g., binary GBO), enhancements (e.g., modified GBO), and recent applications of GBO (e.g., in machine learning and networks).
-
An evaluation of the efficiency of GBO compared with other algorithms in the literature.
-
Potential guidelines for using GBO in future works.
Based on the above, this review will aid interested researchers and students by identifying the major advantages and weaknesses of GBO, especially given the many swarm-based algorithms that have been introduced recently [30].
The rest of this review is organized as follows. Section 2 presents the framework and procedures of the basic GBO algorithm. The research methodology is introduced in Sect. 3. Section 4 reviews the different variants of the GBO algorithm. The usage of GBO in various fields is discussed in Sect. 5. Section 6 presents the evaluation of the GBO algorithm. Comparisons between GBO variants are shown in Sect. 7. Finally, the conclusion and possible future directions are presented in Sect. 8.
2 Gradient-Based Optimizer
Various gradient-based methods have been used to solve different optimization problems [31, 32]. For a gradient-based method to find an optimal solution successfully, the extreme point at which the gradient equals zero must be determined. One such method is Newton's method. Generally, the mechanism of gradient methods is similar to that of other optimization methods: the optimal solution is sought by determining a search direction and then starting the search process based on that direction and the objective function [33]. However, this type of search suffers from slow convergence, which may lose the chance of reaching the optimal solution [34]. The other way of determining the optimal solution is based on generating random solutions (i.e., a population). In each iteration, each solution creates a unique search direction toward a new solution, and the direction is updated based on the new solution; this process continues until the stopping criterion is met [35]. This type of search is therefore widely used to solve various engineering problems. However, its time complexity increases with the dimensionality of the search space because it requires higher computational ability.
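As a concrete illustration of the first mechanism, a single Newton step moves a solution toward the stationary point where the gradient vanishes. The one-dimensional sketch below (with hypothetical `grad` and `hess` callables, not part of the original text) converges in a single step for a quadratic objective:

```python
def newton_step(x, grad, hess):
    """One Newton update toward a stationary point (gradient = 0):
    x_new = x - f'(x) / f''(x) for a one-dimensional objective f."""
    return x - grad(x) / hess(x)

# For f(x) = (x - 3)^2, the gradient is 2(x - 3) and the Hessian is 2,
# so a single step from x = 0 lands exactly on the minimizer x = 3.
x_new = newton_step(0.0, lambda x: 2.0 * (x - 3.0), lambda x: 2.0)
```

For non-quadratic objectives the step must be iterated, and this is where the slow or fragile convergence noted above can appear.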
Based on the above, it can be noticed that the first search mechanism looks for local optima (i.e., an exploitation search), while the second looks for the global optimum (i.e., an exploration search). Both techniques have strengths and weaknesses. GBO combines the benefits of both search mechanisms (i.e., population-based and gradient-based methods), which yields an efficient and powerful algorithm. The following sections illustrate the main procedures of GBO.
2.1 Initialization
Like many optimization algorithms, GBO has control parameters. For instance, \(\alpha\) is used to switch from the exploitation search to the exploration search, and the probability rate is applied over a given range of parameters. Both the population size and the number of iterations are determined based on the complexity of the problem. It is worth mentioning that each element (i.e., solution) in the generated population is called a vector. Equation 1 illustrates the representation of a vector among the N vectors in the population search space.
As mentioned earlier, GBO generates the vectors randomly in the search space, as shown in the following equation:
where rand is a random number in [0, 1], and \(X_{min}\) and \(X_{max}\) indicate the lower and upper boundaries of the search domain.
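The random initialization described above can be sketched as follows; the function name and the use of NumPy are illustrative choices, not part of the original paper:

```python
import numpy as np

def initialize_population(N, D, x_min, x_max, rng=None):
    """Generate N candidate vectors of dimension D uniformly inside the
    search bounds, i.e. X_n = X_min + rand * (X_max - X_min)."""
    rng = np.random.default_rng() if rng is None else rng
    # rand is drawn independently per vector and per dimension in [0, 1]
    return x_min + rng.random((N, D)) * (x_max - x_min)

population = initialize_population(30, 5, -10.0, 10.0)
```

Every row of the returned array is one vector of the population, guaranteed to lie inside \([X_{min}, X_{max}]\).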
2.2 Gradient Search Rule (Exploration)
This section shows how GBO uses the gradient search rule (GSR) to improve the exploration search, raise the convergence rate, and avoid local optima. GSR follows the process of direction movement (DM), which is used to update the vector location, as expressed in Eq. 3:
where \(X1^{m}_{n}\) is the new vector, \(x^{m}_{n}\) is the current vector, randn is a normally distributed random number, \(\varepsilon\) is a small number in [0, 0.1], and \(\rho _{1}\) and \(\rho _{2}\) are defined by the following equations:
where \(\alpha\) is an adaptive parameter defined in terms of \(\beta\), with \(\beta _{min}\) = 0.2 and \(\beta _{max}\) = 1.2, and m and M denoting the current and total number of iterations, respectively.
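For illustration, a plausible implementation of this adaptive schedule is sketched below. Since the equations themselves are not reproduced in this section, the sinusoidal expression for \(\alpha\) follows the form commonly cited for GBO and should be treated as an assumption; only \(\beta _{min}\) = 0.2, \(\beta _{max}\) = 1.2, m, and M come from the text:

```python
import numpy as np

BETA_MIN, BETA_MAX = 0.2, 1.2  # values stated in the text

def alpha_schedule(m, M):
    """Adaptive transition parameter at iteration m of M total iterations.
    beta shrinks from BETA_MAX toward BETA_MIN as m -> M, and alpha
    (commonly cited sinusoidal form; an illustrative assumption here)
    decays with it, shifting GBO from exploration toward exploitation."""
    beta = BETA_MIN + (BETA_MAX - BETA_MIN) * (1.0 - (m / M) ** 3) ** 2
    return abs(beta * np.sin(1.5 * np.pi + np.sin(1.5 * np.pi * beta)))
```

With this form, \(\alpha\) starts near 1 and decays toward a small value by the final iteration, matching the behavior depicted in Fig. 2.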
As shown in the above equations, \(\alpha\) is used to change the value of \(\rho\), which is responsible for balancing exploration and exploitation by helping GBO enhance the diversity of the neighbor vectors of the current vector in the search space, thus avoiding local optima. The expression is shown in Eq. 7. Figure 2 shows an example of how \(\alpha\) changes in each iteration.
where the value of \(\Delta x\) depends on the difference between the best solution \(x_{best}\) and a randomly selected neighbor position \(x^{m}_{r1}\), as shown in the following equation:
where \(\delta\) ensures that the value of \(\Delta x\) changes at each iteration. The values r1 to r4 are distinct random integers selected from \(\left( 1:N \right)\), i.e., indices between 1 and N. Moreover, Eq. 11 shows another random parameter used to enhance the exploration technique in GBO.
Based on the above equations, the best solution (i.e., vector) \(x_{best}\) can easily be substituted for the current vector \(x^{m}_{n}\) in Eq. 3. The next step is to find the new vector based on the current replaced vector; the following expression shows the process of producing the new solution:
where \(yp_{n}\) and \(yq_{n}\) indicate two positions initialized based on \(z_{n+1}\) and \(x_{n}\), as shown in the following equations:
Using Eqs. 3 and 12, which determine \(X1^{m}_{n}\) and \(X2^{m}_{n}\), respectively, together with the current vector \(X^{m}_{n}\), Eq. 15 determines the new vector at the next iteration, \(x^{m+1}_{n}\):
where \(r_{a}\) and \(r_{b}\) are random numbers in [0, 1], and \(X3^{m}_{n}\) can be defined as:
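The blending of the two GSR-derived candidates with a third vector can be sketched as follows. The definition of X3 used here (\(X3 = x - \rho_1 (X2 - X1)\)) and the blending weights follow the commonly cited GBO formulation and should be read as an assumption, since Eqs. 15-16 are not reproduced in this section:

```python
import numpy as np

def combine_candidates(x_current, x1, x2, rho1, rng=None):
    """Blend the GSR candidates X1 and X2 with a third vector X3 to form
    the next-iteration solution, as in Eqs. 15-16 (assumed form):
      x_next = ra * (rb * X1 + (1 - rb) * X2) + (1 - ra) * X3."""
    rng = np.random.default_rng() if rng is None else rng
    ra, rb = rng.random(), rng.random()   # r_a, r_b in [0, 1]
    x3 = x_current - rho1 * (x2 - x1)     # third candidate X3
    return ra * (rb * x1 + (1.0 - rb) * x2) + (1.0 - ra) * x3
```

Note that when X1, X2, and the current vector coincide, the update leaves the vector unchanged, as expected from a convex combination of identical points.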
2.3 Local Escaping Operator (Exploitation)
This section presents the other operator of GBO, the local escaping operator (LEO), which is responsible for improving the exploitation search while avoiding local optima. LEO helps GBO deal successfully with complex problems. It starts by utilizing the solutions obtained from the GSR (i.e., \(X1^{m}_{n}\), \(X2^{m}_{n}\), and \(x_{best}\)) to generate a high-performance solution \(X^{m}_{LEO}\), as shown in the following scheme:
\(if ~~ rand< pr\) and \(rand <0.5\)
else
where \(f_{1}\) is a random number in [−1, 1] and \(f_{2}\) is a normally distributed random number with mean 0 and standard deviation 1. \(u_{1}\), \(u_{2}\), and \(u_{3}\) are random numbers determined as shown in the following formulas:
where rand and \(\mu _{1}\) are random numbers in [0, 1]. Equation 17 can be represented as follows:
where \(L_{1}\) equals 0 if \(\mu _{1} <0.5\); otherwise, \(L_{1}\) equals 1. Based on the above equations, the solution \(x^{m}_{k}\) can be determined as shown in Eq. 21:
where \(x_{rand}\) is a new solution defined as shown in the following equation:
where \(x^{m}_{p}\) is a randomly selected solution from the population \([1,2,\ldots ,N]\). Moreover, \(x_{rand}\) can be defined by a simpler equation, as shown below:
where \(L_{2}\) equals 0 if \(\mu _{1} <0.5\); otherwise, \(L_{2}\) equals 1. More details of the GBO procedures are shown in Algorithm 1, followed by the flowchart (Fig. 3).
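The role of the binary switch in drawing the LEO control parameters can be sketched as follows. The linear form of each \(u_{i}\) is an illustrative assumption based on the commonly cited GBO formulation; only the behavior of the switch (fresh random numbers versus the fixed value 1) is taken from the text:

```python
import numpy as np

def leo_random_parameters(rng=None):
    """Draw the LEO control parameters u1, u2, u3 (cf. Eqs. 18-20).
    With L1 = 1 each u is a fresh random number; with L1 = 0 they all
    collapse to 1, reproducing the two branches described in the text.
    The linear form u = L1 * k * rand + (1 - L1) is an assumption."""
    rng = np.random.default_rng() if rng is None else rng
    mu1 = rng.random()
    L1 = 0 if mu1 < 0.5 else 1            # binary switch on mu_1
    u1 = L1 * 2.0 * rng.random() + (1 - L1)
    u2 = L1 * rng.random() + (1 - L1)
    u3 = L1 * rng.random() + (1 - L1)
    return u1, u2, u3
```

Randomizing the parameters only about half the time lets LEO alternate between aggressive random escapes and conservative moves around the current solutions.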
3 Research Methodology
This section presents the mechanisms used to collect the published articles on GBO and their classification based on various criteria. The search operation used keywords consisting of the algorithm's full name and abbreviation, such as "gradient-based optimizer" and "GBO"; in other words, the collected articles included these keywords in their titles. In addition, the selected articles were published between November 2020 and December 2022. The search process drew on various resources and databases, such as Google Scholar, IEEE Xplore, Elsevier, Springer, MDPI, and Taylor & Francis. Figure 4 illustrates the number of articles from each resource, with a total of around 60 articles.
The second stage classifies the selected articles into three groups: (i) journals, conferences, or book chapters (as shown in Fig. 5); (ii) variants of GBO (Fig. 6); and (iii) applications of GBO (Fig. 7). Consequently, interested researchers will be able to follow the research development of GBO and take full advantage of it.
It can be noticed that the hybridization category outperforms the other categories in the number of articles, which indicates that researchers are interested in increasing the capability and efficiency of GBO. In addition, GBO has a simple structure that makes it easy to merge with other techniques and algorithms (Table 1).
4 Different Variants of Gradient-Based Optimizer
Although the GBO algorithm is only around two years old, researchers have introduced many variants (i.e., versions) by developing it to be compatible with different kinds of problems. The following subsections review the various variants of GBO. At the end of this section, a summary of the GBO variants is given in Table 2.
4.1 Binary
The original GBO algorithm is tailored to continuous optimization problems; to deal with binary optimization problems, the GBO operators must be reformulated to be applicable to such problems. A binary version of GBO (BGBO) was introduced by Jiang et al. [36] for feature selection in unsupervised learning, with BGBO and the sum of squared errors of the K-means algorithm applied as the fitness function. Two groups of transfer functions, V-shaped and S-shaped, were used to verify eight proposed versions of BGBO in independent search spaces. The experimental results show that BGBO outperformed other metaheuristic algorithms.
4.2 Discrete
Dwight and Brezillon [37] introduced a discrete adjoint on GBO to study the effects of different approximations. A linearly dynamic pulse-rate selection method was designed to avoid exponential increases in the pulse rate and to keep it constant in most generations, leading to a fixed selection pressure as well as improved accuracy of the final solution and convergence rate.
4.3 Multi-objective
In general, multi-objective problems arise when there is more than one competing objective. The benefit of multi-objective methods is their ability to deal with such difficult problems.
Premkumar et al. [38] introduced a new method based on a multi-objective GBO (MOGBO) to solve real-world optimization problems. MOGBO includes two factors, the gradient search mechanism and the local escaping factor, which are used to determine optimal solutions from a search space containing various sets of vectors. In other words, MOGBO aims to decrease the distance to maintain convergence and increase the chance of selecting the best solution. The results illustrate that MOGBO obtained solutions with high accuracy and minimal run time.
In [39], the authors proposed a many-objective GBO (MaOGBO) incorporating fuzzy logic to solve optimal power flow problems by keeping control of the increased energy demand. MaOGBO aims to increase the diversity of solutions in the search space and speed up the convergence rate. MaOGBO proved its efficiency on 22 benchmark functions and a large-scale Algerian 59-bus real-time system.
Applications of the Internet of Things (IoT) have become essential in various fields, which has increased the traffic in mobile networks. Therefore, Kesavan et al. [40] introduced a multi-objective GBO (MOGABO) to improve the processes, decrease the complexity, and thereby increase productivity. The results proved the efficiency of MOGABO, which achieved a 1.2% increase in productivity compared with other methods in the literature [41].
Ouadfel et al. [42] proposed a new method that combines multi-objective GBO with weighted multi-view clustering. The main objective of the proposed method is to determine the optimal clustering while respecting the importance of the features in the views and the differences between views [43]. Ten multi-view methods and six common multi-objective methods were used to evaluate the performance of the proposed method.
4.4 Modification
Although GBO has successfully proved its efficiency in solving various problems, like many metaheuristic algorithms it cannot deal with all problems with the same effectiveness. Thus, it needs modification according to the difficulty of the problem.
Hassan et al. [44] introduced a novel method based on a modified GBO (MGBO) for dealing with photovoltaic models. The goal of MGBO is to speed up convergence, which helps avoid local optima. MGBO is thus able to estimate the parameters of various photovoltaic models, such as the PV module, double-diode, and single-diode models. The results show that MGBO outperformed other recent algorithms.
A chaotic GBO (CGBO) was developed by Abd Elminaam et al. [45], in which chaos-based strategies were combined with GBO to minimize the possibility of getting stuck in local optima and to mitigate the premature convergence problem [46]. CGBO has been used with the k-nearest neighbor classifier to find the optimal feature subset [47]. Various chaotic strategies were experimentally examined on different optimization functions to determine the most appropriate one for GBO. CGBO significantly outperformed the original GBO, particle swarm optimization (PSO), the salp swarm algorithm (SSA), the sine cosine algorithm (SCA), and the moth flame optimizer (MFO).
Jiang et al. [48] improved GBO (IGBO) using two mechanisms: chaotic behavior to speed up the convergence rate, and adaptive weights to avoid falling into local optima. The authors applied IGBO to extract the parameters of photovoltaic models, and IGBO proved its robustness and accuracy on the PV module, double-diode, and single-diode models.
In [49], the authors addressed the weakness of GBO (i.e., its local search mechanism) by introducing a new improved version of GBO, called GOMGBO, which includes an opposition-based learning strategy, a Gaussian bare-bones strategy, and a moth spiral strategy. Thirty benchmark functions were used to evaluate the performance of GOMGBO. The results illustrate that GOMGBO achieved the best solutions on most of the benchmark functions compared with other metaheuristic algorithms.
Montoya et al. [50] proposed a new method to modify GBO, called MGbMO, which aims to determine the optimal size and placement of PV sources. The IEEE 34-bus system was used to evaluate the efficiency of MGbMO. The experimental results show that MGbMO outperformed other metaheuristic algorithms, such as the Newton metaheuristic algorithm, the genetic algorithm, and the basic GBO.
Ahmadianfar [51] proposed a new metaphor-free enhanced version of GBO, called EGBO, to determine the optimal PV parameters. PV, double-diode, and single-diode models were used to evaluate the performance of EGBO, which achieved the optimal parameters with high accuracy, reliability, and convergence speed [52]. It also achieved highly accurate solutions at different temperatures and irradiances using the manufacturer's datasheet.
Premkumar et al. [53] introduced a new method that improves GBO using an opposition-based learning strategy, called OBGBO, to determine the optimal parameters of solar photovoltaic models. Double-diode and single-diode models were used to validate the performance of OBGBO. The results illustrate that OBGBO obtained accurate parameters compared with various algorithms.
Zhou et al. [54] improved GBO using a random learning strategy, called RLGBO. The aim of the random learning strategy is to enhance the accuracy of the solutions selected by GBO by tuning the convergence rate and avoiding local optima. The authors evaluated the efficiency and robustness of RLGBO using three different PV models (KC200GT, ST40, and SM55). The results confirmed the ability of RLGBO to estimate the optimal parameters of all the mentioned PV models.
Jin et al. [55] proposed a gradient-based differential neural solution for solving time-dependent nonlinear optimization problems. The authors converted the optimization problem into a non-homogeneous linear equation with a dynamic parameter. The proposed model achieves higher accuracy in minimizing large solution errors, with exponential convergence, compared with the gradient-based neural network and the dual neural network [56]. Its performance was validated on real-world optimization problems in robot motion planning and in data dimension reduction and reconstruction.
4.5 Hybridization
This section reviews the related works on hybridizing GBO with other algorithms. It is worth mentioning that hybridization is used in metaheuristic algorithms to merge the advantages of two or more algorithms, leading to an enhanced version that can deal with various complicated problems [57].
Elsheikh et al. [58] proposed a GBO with an ensemble random vector functional link model (ERVFL), called ERVFL-GBO, for modeling the ultrasonic welding of polymers. Five statistical tools were used to evaluate the performance of ERVFL-GBO. The results illustrate that ERVFL-GBO achieved the best solutions compared with other well-known models.
Although GBO can deal with several problems, it still struggles to find the optimal solution in the global search. Thus, Yu et al. [59] proposed a new hybrid method that merges the efficiency of chaotic local search (CLS) with the original GBO (CGBO). The results proved the competence of CGBO in the exploration mechanism and its improved diversity compared with the original GBO and other state-of-the-art algorithms.
Ahmadianfar et al. [60] introduced a hybrid of GBO with weighted exponential regression (WER), called WER-GBO, to develop a data-intelligence model. WER-GBO aims to determine the optimal sodium concentration in surface water. Various statistical methods, such as scatter plots, Taylor diagrams, and error distributions, were utilized to evaluate the efficiency of WER-GBO. The experimental results showed that WER-GBO outperformed BLR, ANFIS, RSR, and LSSVM.
In [61], the authors proposed a new hybrid method that combines GBO with the elephant herding optimization algorithm (EHO), namely GBEHO. The main objective of GBEHO is to determine the best cluster centers in the search space. GBEHO obtained the optimal solution compared with 10 common optimization algorithms using different criteria, such as F-measure, accuracy rate, and detection rate [62].
Applications based on human–computer interaction have experienced rapid development, especially applications related to healthcare and smartphones. However, the efficiency of these applications is affected by the amount of data used. Therefore, Helmi et al. [63] proposed a hybrid method that merges GBO with the Grey Wolf Optimizer (GWO) as a feature selection method (GBOGWO) to improve the classification of human activity recognition using smartphone sensors. The WISDM and UCI-HAR datasets were employed to evaluate the efficiency of GBOGWO. The experimental results illustrate that GBOGWO achieved a high accuracy of 98%.
Mostafa et al. [64] proposed a method that integrates GBO and k-nearest neighbor (k-NN), called GBO-kNN, to attain the best classification accuracy. A set of benchmark functions was used to prove the performance of GBO-kNN, and the results outperformed other metaheuristic algorithms in the literature.
In [65], the authors used a sine-cosine and differential GBO (SDGBO) to determine the optimal parameters of the photovoltaic model. The results illustrate that SDGBO achieved better solutions than other algorithms on various photovoltaic cells.
Kadkhodazadeh and Farzin [66] integrated GBO with the least squares support vector machine (LSSVM) to introduce a novel method, called LSSVM-GBO, whose main objective is to estimate water quality parameters. The authors employed two experimental levels: first, they applied a set of benchmark functions and compared the results of LSSVM-GBO with other optimization algorithms; then they applied LSSVM-GBO to estimate water quality parameters using electrical conductivity (EC) [67]. In both experiments, LSSVM-GBO proved its efficiency in dealing with these problems.
A hybrid of GBO with Moth Flame Optimization (MFO), called GBO-MFO, was proposed by Mohamed et al. [68] to determine the best power flow based on the optimal position and volume of wind energy. The proposed algorithm aims to reduce the cost associated with intermittent renewable sources.
The directional overcurrent relay problem requires determining the optimal settings efficiently. Thus, in [69], the authors proposed a hybrid GBO that includes the advantages of the linear population size reduction technique of success-history-based adaptive differential evolution (LSHADE). The proposed algorithm aims to improve the exploitation technique of the original GBO and avoid getting stuck in local optima [70]. The algorithm proved its efficiency on three test systems; moreover, it outperformed the other optimization algorithms in the literature on a set of benchmark functions.
Ramadan et al. [71] introduced a new version of GBO, combined with a chaotic eagle strategy and called ESCGBO, to estimate the parameters of static and dynamic photovoltaic models. The authors used two levels of evaluation to prove the efficiency of ESCGBO: (i) 23 benchmark functions, and (ii) single- and double-diode PV static models. In both experiments, ESCGBO achieved the best results with the highest accuracy.
5 Gradient-Based Optimizer Applications
5.1 Economic
The manufacturing of high-quality and credible products is a main objective for increasing competitiveness in different fields; thus, applications in these fields need robust strategies to realize their requirements. This section illustrates some studies that employ GBO to enhance different strategies in the economic field.
Wang et al. [74] used GBO to aid the design of workflow scheduling by achieving the lowest scheduling cost under limitations such as deadlines. The particle swarm optimization (PSO) algorithm and the genetic algorithm were compared with GBO using various types and sizes of workflows. The results illustrate that GBO outperformed these algorithms in dealing with workflow scheduling problems.
In [75], the authors used GBO to improve system reliability under limitations such as weight, volume, and cost. The work aims to achieve the best solutions in the search space and avoid infeasible solutions. The reliability redundancy allocation problem (RRAP) was used to evaluate the performance of GBO, and the results proved its efficiency compared with the PSO algorithm.
Shahid et al. [76] utilized the selection strategy of GBO to maximize the portfolio's Sharpe ratio. The proposed method was evaluated using the S&P BSE Sensex database, which includes 30 stocks from the Indian stock exchange. The experimental results showed that GBO obtained better solutions than the PSO algorithm.
In cloud computing, task scheduling consumes large amounts of storage with high operating costs. Therefore, Hung et al. [27] used GBO to utilize storage space efficiently, thereby decreasing operational cost. The experiments included a set of benchmark functions and a variable number of virtual machines to validate the performance of GBO. The results showed that GBO outperformed recent heuristic algorithms.
5.2 Energy and Power Flow
This section presents the efficiency of GBO in dealing with various energy and power flow problems, such as renewable energy, photovoltaic (PV) systems, and solar energy [77].
Ismaeel et al. [78] used GBO to determine the optimal parameters of PV and solar cell (SC) modules. To prove the capability of GBO in dealing with these modules, the authors used three popular SC models: three-diode models (TDMs), double-diode models (DDMs), and single-diode models (SDMs). The experiments showed that GBO achieved the minimum error value, as well as high proximity between the I–V and P–V curves.

In [79], the authors sought the optimal design of the automatic voltage regulator (AVR) using GBO. To achieve this goal, GBO was employed to determine the optimal fractional-order proportional-integral-derivative (FOPID) controller by minimizing the fitness function. The results illustrate that GBO helped determine an optimal, robust, and stable AVR design compared with recent metaheuristic algorithms.

Hydropower is considered one of the most important renewable energy resources. Its mechanism is based on regularly storing capacity for peak times, so it can support resource allocation in the power system. Therefore, Fang et al. [80] used an accelerated version of GBO (AGBO) that includes the sequential quadratic programming (SQP) technique to deal with a complicated multi-reservoir hydropower system. AGBO was employed on a 10-reservoir hydropower system and then on 23 benchmark functions to evaluate its performance. The experimental results proved the efficiency of AGBO, which achieved the maximum load requests in the hydropower system; it also obtained the optimal solutions on most benchmark functions compared with recent optimization algorithms.

Said et al. [81] introduced a novel GBO to solve the unit commitment problem, which is considered one of the most complicated optimization tasks in power plants. To evaluate the performance of GBO, the authors used power system networks containing 4, 20, 40, and 100 units, independently. The results show that GBO achieved the optimal cost function with minimum run time compared with other metaheuristic algorithms.
An effective resource management platform for smart grids utilizing GBO was introduced by Mohanty et al. [82]. To manage the resources of smart grids, the authors introduced a fog-aided cloud-based model. The aim of using this model was to test GBO's ability to improve the model's performance by maintaining load balancing. In the experimental results, GBO outperformed the artificial bee colony (ABC), ant colony optimization (ACO), and PSO algorithms.
Premkumar et al. [83] and Khelifa et al. [84] proposed an improved GBO using chaotic drifts (CGBO) to locate the optimal parameters of the solar photovoltaic model. Five case studies were used to validate the efficiency of CGBO. The results proved the ability of CGBO to deal with this problem, obtaining the most accurate PV parameter solutions with the lowest run time compared with state-of-the-art algorithms. Also in [85], the authors solved the solar photovoltaic model, but this time orthogonal learning (OL) was used with GBO (OLGBO) to enhance its convergence speed. RMSE and STD under given light provision and particular temperatures were utilized to evaluate the performance of the algorithm.
To solve the charger placement problem, Houssein et al. [86] used GBO, which can successfully manage the orders in charging stations by receiving and accepting electric vehicles. The results proved the robustness and efficacy of GBO in addressing the charger placement problem compared with GA, DE, and PSO.
Priyadarshani and Satapathy [87] introduced GBO to reduce frequency oscillation by tuning the gains of a fuzzy-PIDD2 controller; in other words, the authors used GBO to maintain load-frequency stability. Two models were utilized to evaluate the performance of GBO: (i) a linear multi-source topology and (ii) a two-area linear thermal system. The results show that GBO achieved the optimal cost function with minimal overshoot and settling time compared with other metaheuristic algorithms.
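Overshoot and settling time, the metrics reported in such controller-tuning studies, can be computed directly from a simulated step response. A small sketch, using a generic second-order underdamped response rather than the actual fuzzy-PIDD2 loop of [87]:

```python
import numpy as np

def step_metrics(t, y, target=1.0, tol=0.02):
    """Percent overshoot and settling time (last exit from the +/- tol band)."""
    overshoot = max(0.0, (y.max() - target) / target * 100.0)
    outside = np.abs(y - target) > tol * target
    if not outside.any():
        settling = t[0]
    elif outside[-1]:
        settling = t[-1]          # never settles within the horizon
    else:
        settling = t[np.where(outside)[0][-1] + 1]
    return overshoot, settling

# generic second-order underdamped step response (zeta = 0.5, wn = 2 rad/s)
t = np.linspace(0.0, 10.0, 2001)
zeta, wn = 0.5, 2.0
wd = wn * np.sqrt(1.0 - zeta ** 2)
y = 1.0 - np.exp(-zeta * wn * t) * (np.cos(wd * t)
    + zeta / np.sqrt(1.0 - zeta ** 2) * np.sin(wd * t))
os_pct, ts = step_metrics(t, y)   # roughly 16% overshoot, ~4 s settling
```

An optimizer tuning controller gains would simulate the closed loop for each candidate gain set and score it with metrics like these (often combined into a single cost such as ITAE).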
In [88], the authors took advantage of GBO's performance to identify the optimal design of the wind cube. It is worth mentioning that the wind cube helps maximize the power of the wind turbine and minimize the power function. Consequently, the work aims to determine the optimal design variables of the wind cube and the wind turbine. The experiments were conducted by comparing the energy generated with and without the wind cube. The results illustrate that the amount of energy generated using the optimized cube is greater than that of the turbine without a wind cube.
Economic load dispatch (ELD) is a power system problem based on decreasing the cost of the constrained system by scheduling the power-producing units. Thus, Deb et al. [89] used GBO to solve the combined economic and emission dispatch (CEED) and ELD problems. The experimental results prove the efficiency of GBO compared with eight metaheuristic algorithms.
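The core of an ELD formulation is a fuel-cost objective constrained by the total demand. A minimal penalty-based sketch; the unit coefficients and demand below are made up for illustration, and transmission losses are ignored:

```python
import numpy as np

def eld_cost(P, a, b, c, demand, penalty=1e3):
    """Quadratic fuel cost sum(a + b*P + c*P^2) plus a penalty term for
    violating the power-balance constraint sum(P) == demand (losses ignored)."""
    fuel = np.sum(a + b * P + c * P ** 2)
    return fuel + penalty * abs(np.sum(P) - demand)

# three hypothetical generating units and a 300 MW demand
a = np.array([100.0, 120.0, 90.0])   # $/h
b = np.array([10.0, 9.0, 11.0])      # $/MWh
c = np.array([0.010, 0.012, 0.008])  # $/MW^2 h
cost = eld_cost(np.array([100.0, 100.0, 100.0]), a, b, c, demand=300.0)
```

A metaheuristic such as GBO would minimize `eld_cost` over the power outputs P, optionally with generator limit constraints handled by clipping or further penalties; CEED adds a weighted emission term to the same objective.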
Rezk et al. [90] presented a gradient-based optimizer to determine optimal parameters for polymer electrolyte membrane (PEM) fuel cells. The unknown parameters of the PEM fuel cells are employed as decision variables during the search iterations, and the sum of squared errors between the observed and estimated values is utilized as the objective function. The authors claimed the superiority of the proposed algorithm in determining the best parameters of various PEM fuel cells compared to state-of-the-art algorithms, including the sine cosine algorithm (SCA), differential evolution (DE), heap-based optimizer (HBO), moth-flame optimization algorithm (MFO), whale optimization algorithm (WOA), and salp swarm algorithm (SSA).
5.3 Engineering
The engineering design process is a set of steps that engineers follow to reach a solution to a problem; often, the solution involves designing a product (e.g., a computer code or a machine) that meets certain criteria and/or accomplishes a particular task [91]. This section highlights the related works using GBO in the engineering field. Mehta et al. [92] used GBO to deal with one of the most complicated engineering problems, heat exchangers (HEs). The aim of using GBO is to find the lowest cost of a fin-and-tube HE (FTHE). GBO proved its ability to deal with different design variables and restrictions, achieving the best success rate in the experiments conducted. Welded beam design is a common design optimization problem in the engineering field. Chen [93] used GBO to deal with this problem (i.e., to determine the optimal design of a welded beam). Thirty IEEE CEC2014 test functions were employed to evaluate the efficiency of GBO. The results proved the effectiveness of GBO compared with advanced algorithms.
Seibert et al. [94] proposed a framework for reconstructing 3D microstructures using generally defined high-dimensional descriptors. The proposed 3D formulation satisfies the practical requirement that genuine microstructure samples, in the form of microscope images, are typically only available as 2D sections [95]. The noise that results from reconstructing 3D microstructures from descriptors of 2D slices is managed by reinterpreting a denoising approach as a physical microstructure descriptor and modifying the formulation. The authors reported less noise, improved hyper-parameter robustness, and better reconstruction results.
Keramat et al. [96] presented a steepest-descent optimization approach to detect multiple leaks in the frequency domain. A multi-dimensional optimization problem over the leak locations is constructed using maximum likelihood estimation. The gradient of the resulting objective function is then stated using closed-form formulations. Lastly, the steepest-descent algorithm is applied to the optimization problem from various initializations. The proposed approach was tested on three leak cases with varying numbers of resonant frequencies, as well as verified against a laboratory case study involving two leaks. The results demonstrate the precision and effectiveness of the proposed approach and show how flexible it is for locating leaks using a single spatial measurement vector.
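Restarting a local gradient method from several initializations, as done in [96], is a standard way to cope with multiple local minima. A generic sketch; the toy objective below merely stands in for the leak-detection likelihood, which it does not resemble:

```python
import numpy as np

def multistart_descent(f, grad, starts, lr=0.1, iters=500):
    """Steepest descent from several initializations; keep the best result."""
    best_x, best_f = None, np.inf
    for x0 in starts:
        x = np.array(x0, dtype=float)
        for _ in range(iters):
            x -= lr * grad(x)          # move against the gradient
        fx = f(x)
        if fx < best_f:                # retain the best local minimum found
            best_x, best_f = x, fx
    return best_x, best_f

# toy bimodal objective with minima near x = -1 and x = +1; the left one is global
f = lambda x: (x[0] ** 2 - 1.0) ** 2 + 0.1 * x[0]
grad = lambda x: np.array([4.0 * x[0] * (x[0] ** 2 - 1.0) + 0.1])

x_best, f_best = multistart_descent(f, grad, starts=[[-2.0], [0.5], [2.0]])
```

With a single start, the descent may stop at whichever basin it began in; the multi-start loop keeps only the lowest objective value across all runs.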
Wechsung et al. [97] presented a gradient-based descent method that uses analytical derivatives to address coil design challenges. The authors considered additive perturbations represented by Gaussian processes to decouple the discretization of the coils from the modeling of the perturbations. Experimental results demonstrated that the stochastic optimization methods are more efficient than the deterministic method in terms of stable configurations, fewer local minima, quasi-symmetry near and far from the magnetic axis, and particle loss fraction.
5.4 Medical
Researchers have long been interested in solving medical challenges using computers at early stages. GBO is widely used in solving medical challenges due to its simplicity and efficiency. Zheng et al. [98] proposed a new algorithm that combines GBO with Cauchy and Levy mutation (CLGBO) to construct highly robust coding sets for DNA storage. The experimental results illustrated that CLGBO increased the lower bounds of DNA storage by 4.3–13.5%. Moreover, CLGBO proved its efficiency using non-adjacent subsequence chains.
In [99], the authors used GBO for mammogram images (i.e., early detection of breast cancer). The aim of using GBO is to aid wrapper-based feature extraction by determining the optimal parameters of a convolutional neural network. In the experiments, a mammogram image dataset was used. The results showed that GBO achieved higher recall and higher F-measure compared with similar approaches in the literature.
To solve the penalty factor and k-value selection problems, Li et al. [100] utilized GBO to achieve self-adaptive selection of the k-value, which leads to an improved cumulative COVID-19 estimation model. The work starts by decomposing the cumulative COVID-19 data into IMFr and IMFs components; GBO is then used to reconstruct the predicted results from the IMFr and IMFs. To evaluate the efficiency of GBO, cumulative COVID-19 data from India, the USA, and Russia were used. The results showed that GBO outperformed other similar methods.
Kiziloluk et al. [101] used GBO to identify the optimal parameters of a convolutional neural network, aiming to classify viral pneumonia and COVID-19 accurately. The COVID-19 and Epistroma datasets were employed in the experiments. The results illustrated that GBO clearly improved the classification accuracy.
Concentric tube robots (CTRs) have special potential for surgical operations requiring minimally invasive surgery. Making these robots' designs patient-specific is one of the challenges. Lin et al. [102] introduced a scalable framework that can optimize robot design and motion plans for safe navigation, providing a scalable solution for optimizing continuous variables, even across multiple anatomical structures. The authors used the clinical examples of laryngoscopy and cardiac biopsy to demonstrate how the optimization problems can be solved for both a single patient and a group of patients.
6 Evaluation of GBO
Since it was proposed in 2020, GBO has been used to deal with various real problems in different fields, as shown in the above sections. Like many metaheuristic algorithms, GBO has a simple structure, is flexible to adapt, and yields robust solutions. These features have made GBO a focus of researchers' attention. Furthermore, GBO has its own merits: it combines the benefits of population-based and gradient-based methods (i.e., balancing exploitation and exploration), maintains solution diversity, and avoids local optima.
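To make this combination concrete, the following is a simplified population loop in which the difference between the best and worst agents acts as a crude gradient surrogate. This is only an illustrative sketch, not the actual gradient search rule and local escaping operator of the original GBO [18]:

```python
import numpy as np

def gbo_like_search(f, bounds, pop=30, iters=200, seed=0):
    """Illustrative population loop: each agent moves toward the current best,
    using the best-worst difference as a crude gradient surrogate. NOT the
    exact update equations of the original GBO paper."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(pop, len(lo)))
    fit = np.array([f(x) for x in X])
    for _ in range(iters):
        best, worst = X[fit.argmin()], X[fit.argmax()]
        for i in range(pop):
            r = rng.random(len(lo))
            step = r * (best - X[i]) + rng.normal(0.0, 0.1, len(lo)) * (best - worst)
            cand = np.clip(X[i] + step, lo, hi)
            fc = f(cand)
            if fc < fit[i]:            # greedy replacement preserves the best
                X[i], fit[i] = cand, fc
    return X[fit.argmin()], fit.min()

# minimize the 3-D sphere function on [-5, 5]^3
x, fx = gbo_like_search(lambda v: float(np.sum(v ** 2)), bounds=[(-5, 5)] * 3)
```

The deterministic pull toward the best solution plays the role of exploitation, while the noise term scaled by the population spread provides exploration that decays as the population converges.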
No free lunch is one of the best-known theorems in the optimization field. Its main idea is that no single metaheuristic algorithm can deal with all kinds of problems successfully [104]. Consequently, GBO requires enhancement to deal with various types of optimization problems. GBO suffers from premature convergence, which can leave it stuck in a local optimum, and its computational time increases with the dimensionality of the search space [53, 54, 58].
Table 3 illustrates some of the main characteristics of GBO compared with those of other optimization algorithms, such as advantages and disadvantages, time complexity, and number of parameters. To conduct a comprehensive comparison, various algorithms from different categories were selected: GA and HS from evolutionary algorithms, CSA and PSO from swarm-based algorithms, and TS from trajectory-based algorithms [105].
7 Results and Discussion
As the literature reviewed above shows, GBO can deal with various fields and successfully determine optimal solutions to complicated problems. Moreover, researchers have enhanced GBO and introduced new versions to increase the search capability of the basic GBO. In other words, like many other optimization algorithms, GBO cannot maintain its efficacy across all problems (i.e., the no-free-lunch theorem [137]). Thus, researchers made modifications to fix the weaknesses of GBO, such as speeding up convergence [44], avoiding getting stuck in local optima [45], improving performance [49], and balancing exploration and exploitation [138]. These new versions can be used to solve various problems. Table 4 compares the versions of GBO introduced in the literature.
Based on the results in Table 4, most of the introduced variants (55%) focused on enhancing the performance of GBO (i.e., speeding up processing, improving solution quality, and so on). About 48% aimed to improve the local search mechanism of GBO and avoid trapping in local minima, and around 30% targeted the convergence rate. It is worth mentioning that some of the introduced variants combined more than one improvement, such as improving both the local search and the convergence rate. These results support the properties of GBO shown in Table 3.
8 Conclusion and Possible Future Directions
This review paper covers more than fifty articles related to the GBO algorithm from November 2020 to November 2022. The collected articles were classified into different fields: Economic, Energy and Power Flow, Engineering, and Medical. Furthermore, the rest of the articles were classified based on the variants of GBO: Binary, Discrete, Multi-objective, Modification, and Hybridization. These articles were reviewed to extract the strengths, weaknesses, and features of GBO, so this review paper can lead and guide interested students and researchers in this area.
Although the GBO algorithm was proposed only recently compared with other metaheuristic algorithms, it is deemed a promising algorithm because it has proved its effectiveness in dealing with different problems across various fields. The GBO algorithm's merits helped in its success. For example, it has a search mechanism for local optima and a search mechanism for global optima, drawing on the features of population-based and gradient-based methods. On the other hand, the local search mechanism may get stuck in a bad local optimum. GBO also suffers from a weakness in dealing with large-dimensional problems, which raises the computational time. Thus, as mentioned in Sect. 4, researchers introduced new versions of the GBO algorithm to overcome these weaknesses.
Also, although GBO shares advantages with other optimization algorithms, such as simplicity, few parameters, and the ability to hybridize easily with other algorithms, it lacks mathematical analysis. In other words, GBO does not have a theoretical analysis like GA and PSO. Therefore, researchers have compensated for this by exploiting the features of other optimization algorithms through hybridization, as described in Sect. 4.5.
Finally, this paragraph presents suggestions and possible future directions in the hope that they will be useful for interested researchers. First, common selection schemes (tournament selection, proportional selection, and linear rank selection) can be incorporated to enhance the quality of the selected solutions. Second, local search algorithms (hill climbing and simulated annealing) can be leveraged to counter rapid convergence and thereby avoid getting stuck in a local optimum.
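As an example of the first suggestion, tournament selection is straightforward to drop into a population loop. A minimal sketch; the population and fitness values below are illustrative:

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Sample k individuals uniformly and return the fittest (minimization)."""
    contenders = rng.sample(range(len(population)), k)
    winner = min(contenders, key=lambda i: fitness[i])
    return population[winner]

# toy population with precomputed fitness values (lower is better)
pop = [[0.1, 0.2], [1.5, -0.3], [0.9, 0.9], [2.0, 2.0]]
fit = [0.05, 2.34, 1.62, 8.00]
parent = tournament_select(pop, fit, k=4)  # k == len(pop) always picks the best
```

The tournament size k controls selection pressure: small k keeps weaker solutions in play (more exploration), while larger k concentrates on the current elites.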
Data availability
Data are available from the authors upon reasonable request.
References
Gharehchopogh FS (2022) Advances in tree seed algorithm: a comprehensive survey. Arch Comput Methods Eng 1–24
Ezugwu AE, Agushaka JO, Abualigah L, Mirjalili S, Gandomi AH (2022) Prairie dog optimization algorithm. Neural Comput Appl 34(22):20017–20065
Kirkpatrick S, Gelatt CD, Vecchi MP (1983) Optimization by simulated annealing. Science 220(4598):671–680
Koziel S, Yang X-S (2011) Computational optimization, methods and algorithms, vol 356. Springer, New York
Abdelmadjid C, Mohamed S-A, Boussad B (2013) CFD analysis of the volute geometry effect on the turbulent air flow through the turbocharger compressor. Energy Procedia 36:746–755
Shambour MKY, Khan EA (2022) A late acceptance hyper-heuristic approach for the optimization problem of distributing pilgrims over mina tents. J Univers Comput Sci 28(4):396–413. https://doi.org/10.3897/jucs.72900
Koza JR (1994) Genetic programming II: automatic discovery of reusable programs. MIT Press, Cambridge
Al-Madi NA, Hnaif AA (2022) Optimizing traffic signals in smart cities based on genetic algorithm. Comput Syst Sci Eng 40(1):65–74
Storn R, Price K (1997) Differential evolution—a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11(4):341–359
Glover F (1977) Heuristics for integer programming using surrogate constraints. Decis Sci 8(1):156–166
Rao RV, Savsani VJ, Vakharia D (2012) Teaching-learning-based optimization: an optimization method for continuous non-linear large scale problems. Inf Sci 183(1):1–15
Kumar M, Kulkarni AJ, Satapathy SC (2018) Socio evolution & learning optimization algorithm: a socio-inspired optimization methodology. Future Gener Comput Syst 81:252–272
Formato RA (2007) Central force optimization. Prog Electromagn Res 77:425–491
Rashedi E, Nezamabadi-Pour H, Saryazdi S (2009) GSA: a gravitational search algorithm. Inf Sci 179(13):2232–2248
Erol OK, Eksin I (2006) A new optimization method: big bang-big crunch. Adv Eng Softw 37(2):106–111
Shehab M, Khader AT, Al-Betar MA (2017) A survey on applications and variants of the cuckoo search algorithm. Appl Soft Comput 61:1041–1059
Mirjalili S (2015) Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm. Knowl Based Syst 89:228–249
Ahmadianfar I, Bozorg-Haddad O, Chu X (2020) Gradient-based optimizer: a new metaheuristic optimization algorithm. Inf Sci 540:131–159
Al-qaness MA, Ewees AA, Fan H, AlRassas AM, Abd Elaziz M (2022) Modified aquila optimizer for forecasting oil production. Geo-Spat Inf Sci 1–17
Al-qaness MA, Ewees AA, Abualigah L, AlRassas AM, Thanh HV, Abd Elaziz M (2022) Evaluating the applications of dendritic neuron model with metaheuristic optimization algorithms for crude-oil-production forecasting. Entropy 24(11):1674
Amini S, Homayouni S, Safari A, Darvishsefat AA (2018) Object-based classification of hyperspectral data using random forest algorithm. Geo-Spat Inf Sci 21(2):127–138
Shehab M, Mashal I, Momani Z, Shambour MKY, AL-Badareen A, Al-Dabet S, Abualigah L (2022) Harris hawks optimization algorithm: variants and applications. Arch Comput Methods Eng 29(7):5579–5603
Ahmad M, Khaja IA, Baz A, Alhakami H, Alhakami W (2020) Particle swarm optimization based highly nonlinear substitution-boxes generation for security applications. IEEE Access 8:116132–116147
Wunnava A, Naik MK, Panda R, Jena B, Abraham A (2020) A differential evolutionary adaptive Harris hawks optimization for two dimensional practical Masi entropy-based multilevel image thresholding. J King Saud Univ - Comput Inf Sci
Gharehchopogh FS, Abdollahzadeh B (2022) An efficient Harris hawk optimization algorithm for solving the travelling salesman problem. Clust Comput 25(3):1981–2005
Aleem SHA, Zobaa AF, Balci ME, Ismael SM (2019) Harmonic overloading minimization of frequency-dependent components in harmonics polluted distribution systems using Harris hawks optimization algorithm. IEEE Access 7:100824–100837
Huang X, Lin Y, Zhang Z, Guo X, Su S (2022) A gradient-based optimization approach for task scheduling problem in cloud computing. Clust Comput 1–17
Gharehchopogh FS (2022) An improved tunicate swarm algorithm with best-random mutation strategy for global optimization problems. J Bionic Eng 1–26
Rammurthy D, Mahesh PK (2020) Whale Harris hawks optimization based deep learning classifier for brain tumor detection using MRI images. J King Saud Univ - Comput Inf Sci
Darwish A (2018) Bio-inspired computing: algorithms review, deep analysis, and the scope of applications. Future Comput Inf J 3(2):231–246
Chai R, Savvaris A, Tsourdos A, Chai S, Xia Y (2017) Improved gradient-based algorithm for solving aeroassisted vehicle trajectory optimization problems. J Guidance Control Dyn 40(8):2093–2101
Chau M, Fu MC, Qu H, Ryzhov IO (2014) Simulation optimization: a tutorial overview and recent developments in gradient-based methods. In: Proceedings of the winter simulation conference 2014. IEEE, pp 21–35
Shahidi N, Esmaeilzadeh H, Abdollahi M, Ebrahimi E, Lucas C (2004) Self-adaptive memetic algorithm: an adaptive conjugate gradient approach. In: IEEE conference on cybernetics and intelligent systems, 2004, vol 1. IEEE, pp 6–11
Salajegheh F, Salajegheh E (2019) PSOG: enhanced particle swarm optimization by a unit vector of first and second order gradient directions. Swarm Evolut Comput 46:28–51
Shehab M, Khader AT, Alia MA (2019) Enhancing cuckoo search algorithm by using reinforcement learning for constrained engineering optimization problems. In: 2019 IEEE Jordan international joint conference on electrical engineering and information technology (JEEIT). IEEE, pp 812–816
Jiang Y, Luo Q, Wei Y, Abualigah L, Zhou Y (2021) An efficient binary gradient-based optimizer for feature selection. Math Biosci Eng 18:3813–3854
Dwight R, Brezillon J (2006) Effect of various approximations of the discrete adjoint on gradient-based optimisation. In: 44th AIAA aerospace meeting, pp 9–12
Premkumar M, Jangir P, Sowmya R (2021) MOGBO: a new multiobjective gradient-based optimizer for real-world structural optimization problems. Knowl Based Syst 218:106856
Premkumar M, Jangir P, Sowmya R, Elavarasan RM (2021) Many-objective gradient-based optimizer to solve optimal power flow problems: analysis and validations. Eng Appl Artif Intell 106:104479
Kesavan D, Periyathambi E, Chokkalingam A (2022) A proportional fair scheduling strategy using multiobjective gradient-based African buffalo optimization algorithm for effective resource allocation and interference minimization. Int J Commun Syst 35(1):e5003
Alotaibi Y (2022) A new meta-heuristics data clustering algorithm based on tabu search and adaptive search memory. Symmetry 14(3):623
Ouadfel S, Abd Elaziz M (2021) A multi-objective gradient optimizer approach-based weighted multi-view clustering. Eng Appl Artif Intell 106:104480
Shambour MK, Khan EA (2022) A novel scheduling approach for pilgrim flights optimization problem. Malays J Comput Sci 35(4):281–306
Hassan MH, Kamel S, El-Dabah M, Rezk H (2021) A novel solution methodology based on a modified gradient-based optimizer for parameter estimation of photovoltaic models. Electronics 10(4):472
Abd Elminaam DS, Ibrahim SA, Houssein EH, Elsayed SM (2022) An efficient chaotic gradient-based optimizer for feature selection. IEEE Access 10:9271–9286
Shehadeh HA, Idna Idris MY, Ahmedy I (2017) Multi-objective optimization algorithm based on sperm fertilization procedure (MOSFP). Symmetry 9(10):241
Alkhatib AA, Abu Maria K, Alzu'bi S, Abu Maria E (2022) Novel system for road traffic optimisation in large cities. IET Smart Cities
Jiang Y, Luo Q, Zhou Y (2022) Improved gradient-based optimizer for parameters extraction of photovoltaic models. IET Renew Power Gener 16(8):1602–1622
Qiao Z, Shan W, Jiang N, Heidari AA, Chen H, Teng Y, Turabieh H, Mafarja M (2022) Gaussian bare-bones gradient-based optimization: towards mitigating the performance concerns. Int J Intell Syst 37(6):3193–3254
Montoya OD, Grisales-Noreña LF, Giral-Ramírez DA (2022) Optimal placement and sizing of PV sources in distribution grids using a modified gradient-based metaheuristic optimizer. Sustainability 14(6):3318
Ahmadianfar I, Gong W, Heidari AA, Golilarz NA, Samadi-Koucheksaraee A, Chen H (2021) Gradient-based optimization with ranking mechanisms for parameter identification of photovoltaic systems. Energy Rep 7:3979–3997
Manasrah A, Masoud M, Jaradat Y, Bevilacqua P (2022) Investigation of a real-time dynamic model for a PV cooling system. Energies 15(5):1836
Premkumar M, Jangir P, Elavarasan RM, Sowmya R (2021) Opposition decided gradient-based optimizer with balance analysis and diversity maintenance for parameter identification of solar photovoltaic models. J Ambient Intell Humaniz Comput 1–23
Zhou W, Wang P, Heidari AA, Zhao X, Turabieh H, Chen H (2021) Random learning gradient based optimization for efficient design of photovoltaic models. Energy Convers Manag 230:113751
Jin L, Wei L, Li S (2022) Gradient-based differential neural-solution to time-dependent nonlinear optimization. IEEE Trans Automat Contr
Elbes M, Alrawashdeh T, Almaita E, AlZu’bi S, Jararweh Y (2022) A platform for power management based on indoor localization in smart buildings using long short-term neural networks. Trans Emerg Telecommun Technol 33(3):e3867
Shehadeh HA (2021) A hybrid sperm swarm optimization and gravitational search algorithm (HSSOGSA) for global optimization. Neural Comput Appl 33(18):11739–11752
Elsheikh AH, Abd Elaziz M, Vendan A (2022) Modeling ultrasonic welding of polymers using an optimized artificial intelligence model using a gradient-based optimizer. Weld World 66(1):27–44
Yu H, Zhang Y, Cai P, Yi J, Li S, Wang S (2021) Stochastic multiple chaotic local search-incorporated gradient-based optimizer. Discrete Dyn Nat Soc
Ahmadianfar I, Shirvani-Hosseini S, Samadi-Koucheksaraee A, Yaseen ZM (2022) Surface water sodium (Na+) concentration prediction using hybrid weighted exponential regression model with gradient-based optimization. Environ Sci Pollut Res 1–26
Duan Y, Liu C, Li S, Guo X, Yang C (2022) Gradient-based elephant herding optimization for cluster analysis. Appl Intell 1–32
Malibari AA, Alotaibi SS, Alshahrani R, Dhahbi S, Alabdan R, Al-wesabi FN, Hilal AM (2022) A novel metaheuristics with deep learning enabled intrusion detection system for secured smart environment. Sustain Energy Technol Assess 52:102312
Helmi AM, Al-Qaness MA, Dahou A, Damaševičius R, Krilavičius T, Elaziz MA (2021) A novel hybrid gradient-based optimizer and grey wolf optimizer feature selection method for human activity recognition using smartphone sensors. Entropy 23(8):1065
Mostafa AA, Alhossary AA, Salem SA, Mohamed AE (2022) GBO-kNN a new framework for enhancing the performance of ligand-based virtual screening for drug discovery. Expert Syst Appl 197:116723
Yu S, Chen Z, Heidari AA, Zhou W, Chen H, Xiao L (2022) Parameter identification of photovoltaic models using a sine cosine differential gradient based optimizer. IET Renew Power Gener
Kadkhodazadeh M, Farzin S (2021) A novel LSSVM model integrated with GBO algorithm to assessment of water quality parameters. Water Resour Manag 35(12):3939–3968
Shambour MK (2022) Analyzing perceptions of a global event using CNN-LSTM deep learning approach: the case of Hajj 1442 (2021). PeerJ Comput Sci 8:e1087
Mohamed AA, Kamel S, Hassan MH, Mosaad MI, Aljohani M (2022) Optimal power flow analysis based on hybrid gradient-based optimizer with moth-flame optimization algorithm considering optimal placement and sizing of facts/wind power. Mathematics 10(3):361
Rizk-Allah RM, El-Fergany AA (2021) Effective coordination settings for directional overcurrent relay using hybrid gradient-based optimizer. Appl Soft Comput 112:107748
Shehadeh HA, Shagari NM (2022) A hybrid grey wolf optimizer and sperm swarm optimization for global optimization. Handbook of Intelligent Computing and Optimization for Sustainable Development, pp 487–507
Ramadan A, Kamel S, Hassan MH, Tostado-Véliz M, Eltamaly AM (2021) Parameter estimation of static/dynamic photovoltaic models using a developed version of eagle strategy gradient-based optimizer. Sustainability 13(23):13053
Abualigah L, Almotairi KH, Abd Elaziz M, Shehab M, Altalhi M (2022) Enhanced flow direction arithmetic optimization algorithm for mathematical optimization problems with applications of data clustering. Eng Anal Bound Elem 138:13–29
Shehab M, Abualigah L (2022) Opposition-based learning multi-verse optimizer with disruption operator for optimization problems. Soft Comput 26(21):11669–11693
Wang D, Li H, Zhang Y, Zhang B (2021) Gradient-based optimizer for scheduling deadline-constrained workflows in the cloud
Ashraf Z, Shahid M, Ahmad F (2021) Gradient based optimization approach to solve reliability allocation system. In: 2021 international conference on computing, communication, and intelligent systems (ICCCIS). IEEE, pp 337–342
Shahid M, Ashraf Z, Shamim M, Ansari MS (2022) A novel portfolio selection strategy using gradient-based optimizer. In: Proceedings of international conference on data science and applications. Springer, pp 287–297
Al-Wesabi FN, Obayya M, Hamza MA, Alzahrani JS, Gupta D, Kumar S (2022) Energy aware resource optimization using unified metaheuristic optimization algorithm allocation for cloud computing environment. Sustain Comput Inform Syst 35:100686
Ismaeel AA, Houssein EH, Oliva D, Said M (2021) Gradient-based optimizer for parameter extraction in photovoltaic models. IEEE Access 9:13403–13416
Altbawi SMA, Mokhtar ASB, Jumani TA, Khan I, Hamadneh NN, Khan A (2021) Optimal design of Fractional order PID controller based Automatic voltage regulator system using gradient-based optimization algorithm. J King Saud Univ Eng Sci
Fang Y, Ahmadianfar I, Samadi-Koucheksaraee A, Azarsa R, Scholz M, Yaseen ZM (2021) An accelerated gradient-based optimization development for multi-reservoir hydropower systems optimization. Energy Rep 7:7854–7877
Said M, Houssein EH, Deb S, Alhussan AA, Ghoniem RM (2022) A novel gradient based optimizer for solving unit commitment problem. IEEE Access 10:18081–18092
Mohanty A, Samantaray S, Patra SS, Mahmoud A, Barik RK (2021) An efficient resource management scheme for smart grid using GBO algorithm. In: 2021 international conference on emerging smart computing and informatics (ESCI). IEEE, pp 593–598
Premkumar M, Jangir P, Ramakrishnan C, Nalinipriya G, Alhelou HH, Kumar BS (2021) Identification of solar photovoltaic model parameters using an improved gradient-based optimization algorithm with chaotic drifts. IEEE Access 9:62347–62379
Khelifa MA, Lekouaghet B, Boukabou A (2022) Symmetric chaotic gradient-based optimizer algorithm for efficient estimation of PV parameters. Optik 259:168873
Yu S, Heidari AA, Liang G, Chen C, Chen H, Shao Q (2022) Solar photovoltaic model parameter estimation based on orthogonally-adapted gradient-based optimization. Optik 252:168513
Houssein EH, Deb S, Oliva D, Rezk H, Alhumade H, Said M (2021) Performance of gradient-based optimizer on charging station placement problem. Mathematics 9(21):2821
Priyadarshani S, Satapathy J (2021) Novel application of gradient-based optimizer for tuning a fuzzy-PIDD2 controller for load frequency stabilization. In: 2021 IEEE international conference on electronics, computing and communication technologies (CONECCT). IEEE, pp 1–6
Ismaeel AA, Houssein EH, Hassan AY, Said M (2022) Performance of gradient-based optimizer for optimum wind cube design. Comput Mater Contin 71:339–353
Deb S, Abdelminaam DS, Said M, Houssein EH (2021) Recent methodology-based gradient-based optimizer for economic load dispatch problem. IEEE Access 9:44322–44338
Rezk H, Ferahtia S, Djeroui A, Chouder A, Houari A, Machmoum M, Abdelkareem MA (2022) Optimal parameter estimation strategy of PEM fuel cell using gradient-based optimizer. Energy 239:122096
Shehadeh HA, Ahmedy I, Idris MYI (2018) Sperm swarm optimization algorithm for optimizing wireless sensor network challenges. In: Proceedings of the 6th international conference on communications and broadband networking. pp 53–59
Mehta P, Yıldız BS, Sait SM, Yıldız AR (2022) Gradient-based optimizer for economic optimization of engineering problems. Mater Test 64(5):690–696
Chen Z (2021) The design optimization problem of welded beam design studies. Int J Sci 8(3)
Seibert P, Raßloff A, Ambati M, Kästner M (2022) Descriptor-based reconstruction of three-dimensional microstructures through gradient-based optimization. Acta Mater 227:117667
Jaradat Y, Masoud M, Jannoud I, Zeidan D (2022) Genetic algorithm energy optimization in 3D WSNS with different node distributions. Intell Automation and Soft Comput 33(2)
Keramat A, Duan H-F, Pan B, Hou Q (2022) Gradient-based optimization for spectral-based multiple-leak identification. Mech Syst Signal Process 171:108840
Wechsung F, Giuliani A, Landreman M, Cerfon A, Stadler G (2022) Single-stage gradient-based stellarator coil design: stochastic optimization. Nucl Fusion 62(7):076034
Zheng Y, Wu J, Wang B (2021) CLGBO: an algorithm for constructing highly robust coding sets for DNA storage. Front Genet 12:673
Sakthivel NK, Subasree S, Malik S, Tyagi AK (2022) A Wrapper-based feature extraction framework based on AlexNet deep convolutional neural network parameters optimized using gradient‐based optimizer for mammogram images. Concurr Comput Pract Exp
Li G, Chen K, Yang H (2022) A new hybrid prediction model of cumulative COVID-19 confirmed data. Process Saf Environ Prot 157:1–19
Kiziloluk S, Sert E (2022) COVID-CCD-Net: COVID-19 and colon cancer diagnosis system with optimized CNN hyperparameters using gradient-based optimizer. Med Biol Eng Comput 60(6):1595–1612
Lin JT, Girerd C, Yan J, Hwang JT, Morimoto TK (2022) A generalized framework for concentric tube robot design using gradient-based optimization. IEEE Trans Robot 38(6):3774–3791
AlZu'bi S, Aqel D, Lafi M (2022) An intelligent system for blood donation process optimization-smart techniques for minimizing blood wastages. Clust Comput 1–11
Zhang J, Zhou Y, Luo Q (2018) An improved sine cosine water wave optimization algorithm for global optimization. J Intell Fuzzy Syst 34(4):2129–2141
Shehab M, Khader AT, Laouchedi M, Alomari OA (2019) Hybridizing cuckoo search algorithm with bat algorithm for global numerical optimization. J Supercomput 75(5):2395–2422
Yang X-S, Deb S (2009) Cuckoo search via Lévy flights. In: 2009 World Congress on nature & biologically inspired computing (NaBIC). IEEE, pp 210–214
Holland J (1975) Adaptation in natural and artificial systems: an introductory analysis with application to biology. Control Artif Intell 3:1–15
Kennedy J (2010) Particle swarm optimization. Encycl Mach Learn 12:760–766
Geem ZW, Kim JH, Loganathan GV (2001) A new heuristic optimization algorithm: harmony search. Simulation 76(2):60–68
Yu J, Kim CH, Rhee SB (2020) The comparison of lately proposed Harris hawks optimization and jaya optimization in solving directional overcurrent relays coordination problem. Complexity
Shehadeh HA, Jebril IH, Wang X, Chu SC, Idris MYI (2022) Optimal topology planning of electromagnetic waves communication network for underwater sensors using multi-objective optimization algorithms (MOOAs). Automatika 1–12
Poli R, Kennedy J, Blackwell T (2007) Particle swarm optimization. Swarm Intell 1(1):33–57
Shehab M, Khader AT, Al-Betar MA, Abualigah LM (2017) Hybridizing cuckoo search algorithm with hill climbing for numerical optimization problems. In: 2017 8th international conference on information technology (ICIT). IEEE, pp 36–43
Zhang H, Sun G (2002) Feature selection using tabu search method. Pattern Recogn 35(3):701–711
Heidari AA, Mirjalili S, Faris H, Aljarah I, Mafarja M, Chen H (2019) Harris hawks optimization: algorithm and applications. Future Gener Comput Syst 97:849–872
Salgotra R, Singh U, Saha S (2018) New cuckoo search algorithms with enhanced exploration and exploitation properties. Expert Syst Appl 95:384–420
Abualigah L, Elaziz MA, Sumari P, Khasawneh AM, Alshinwan M, Mirjalili S, Gandomi AH (2022) Black hole algorithm: a comprehensive survey. Appl Intell 1–24
Shehab M, Alshawabkah H, Abualigah L, AL-Madi N (2021) Enhanced a hybrid moth-flame optimization algorithm using new selection schemes. Eng Comput 37(4):2931–2956
Almodfer R, Mudhsh M, Chelloug S, Shehab M, Abualigah L, Abd Elaziz M (2022) Quantum mutation reptile search algorithm for global optimization and data clustering. Hum-Centr Comput Inf Sci 12
Alsalibi AI, Shambour MKY, Abu-Hashem MA, Shehab M, Shambour Q, Muqat R (2022) Nonvolatile memory-based Internet of Things: a survey. In: Artificial intelligence-based Internet of Things systems. Springer, pp 285–304
Fan Q, Chen Z, Xia Z (2020) A novel quasi-reflected Harris hawks optimization algorithm for global optimization problems. Soft Comput 24(19):14825–14843
Shehab M, Khader A, Laouchedi M (2018) A hybrid method based on cuckoo search algorithm for global optimization problems. J Inf Commun Technol 17(3):469–491
Wright AH (1991) Genetic algorithms for real parameter optimization. In: Foundations of genetic algorithms, vol 1. Elsevier, pp 205–218
Liu Y, Wang G, Chen H, Dong H, Zhu X, Wang S (2011) An improved particle swarm optimization for feature selection. J Bionic Eng 8(2):191–200
Guo L, Wang G-G, Wang H, Wang D (2013) An effective hybrid firefly algorithm with harmony search for global numerical optimization. Sci World J 13:30–44
Shehab M, Khader AT (2020) Modified cuckoo search algorithm using a new selection scheme for unconstrained optimization problems. Curr Med Imaging 16(4):307–315
Shehab M, Abualigah L, Shambour Q, Abu-Hashem MA, Shambour MKY, Alsalibi AI, Gandomi AH (2022) Machine learning in medical applications: a review of state-of-the-art methods. Comput Biol Med 145:105458
Ouaarab A, Ahiod B, Yang X-S (2014) Discrete cuckoo search algorithm for the travelling salesman problem. Neural Comput Appli 24(7–8):1659–1669
Bajpai P, Kumar M (2010) Genetic algorithm—an approach to solve global optimization problems. Indian J Comput Sci Eng 1(3):199–206
Bai Q (2010) Analysis of particle swarm optimization algorithm. Comput Inf Sci 3(1):180
Milad A (2013) Harmony search algorithm: strengths and weaknesses. J Comput Eng Inf Technol 2(1):1–7
Almomani SN, Shehab M, Al Ebbini M, Shami AA (2021) The efficiency and effectiveness of the cyber security in maintaining the cloud accounting information. Acad Strateg Manag J 20:1–11
Zhang Y, Zhou X, Shih PC (2020) Modified Harris hawks optimization algorithm for global optimization problems. Arab J Sci Eng 45(12):10949–10974
Shehab M, Abualigah L, Al Hamad H, Alabool H, Alshinwan M, Khasawneh AM (2020) Moth–flame optimization algorithm: variants and applications. Neural Comput Appl 32(14):9859–9884
Abualigah L, Shehab M, Alshinwan M, Alabool H, Abuaddous HY, Khasawneh AM, Al Diabat M (2020) TS-GWO: IoT tasks scheduling in cloud computing using grey wolf optimizer. In: Swarm intelligence for cloud computing. Chapman and Hall/CRC, pp 127–152
Kulturel-Konak S, Smith AE, Coit DW (2003) Efficiently solving the redundancy allocation problem using tabu search. IIE Trans 35(6):515–526
Adam SP, Alexandropoulos SAN, Pardalos PM, Vrahatis MN (2019) No free lunch theorem: a review. Approx Optim 57–82
Shehab M, Abu-Hashem MA, Shambour MKY, Alsalibi AI, Alomari OA, Gupta JN, Abualigah L (2022) A comprehensive review of bat inspired algorithm: variants, applications, and hybridization. Arch Comput Methods Eng 1–33
Acknowledgements
The authors would like to thank the Deanship of Scientific Research at Umm Al-Qura University for supporting this work by Grant Code: (22UQU4361183DSR06).
Funding
This research received no external funding.
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.
Ethical approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Cite this article
Daoud, M.S., Shehab, M., Al-Mimi, H.M. et al. Gradient-Based Optimizer (GBO): A Review, Theory, Variants, and Applications. Arch Computat Methods Eng 30, 2431–2449 (2023). https://doi.org/10.1007/s11831-022-09872-y