1 Introduction

Recently, several challenging non-linear optimization problems have been addressed by employing non-conventional meta-heuristic algorithms inspired by natural phenomena, since conventional algorithms often fail to yield the desired results in certain situations [1]. Meta-heuristic Algorithms (MAs) mimic the social behavior of animals or natural systems to find the optimal solution for specific problems by generating a random population through stochastic processes and iterations [2]. The majority of MAs may be categorized into two main groups: those influenced by biological processes inherent in nature, and those based entirely on natural phenomena. Several meta-heuristics proposed over the past 20 years have attracted many scholars. Some of the MAs influenced by biological processes are: Evolutionary Programming (EP) [3], Evolutionary Strategy (ES) [4], Genetic Algorithm (GA) [5], Genetic Programming (GP) [6], Differential Evolution (DE) [7], Particle Swarm Optimization (PSO) [8], Ant Colony Optimization (ACO) [9], Cuckoo Search (CS) [10], Artificial Bee Colony (ABC) [11], Grey Wolf Optimizer (GWO) [12], Whale Optimization Algorithm (WOA) [13], Salp Swarm Algorithm (SSA) [14], Moth-Flame Optimization (MFO) [15], Ant Lion Optimizer (ALO) [16], Aquila Optimizer (AO) [17], White Shark Optimizer (WSO) [18], American Zebra Optimization Algorithm (AZOA) [19], and Modified Whale Optimisation Algorithm (MWOA) [20]. MAs that are based on natural phenomena include: Simulated Annealing (SA) [21], Big Bang-Big Crunch (BBBC) [22], Gravitational Search Algorithm (GSA) [23], Black Hole Algorithm (BHA) [24], Sine Cosine Algorithm (SCA) [25], Multi-verse Optimizer (MVO) [26], Equilibrium Optimizer (EO) [27], Tabu Search (TS) [28], Harmony Search (HS) [29], Teaching Learning-Based Optimization (TLBO) [30], Success History-based Adaptive DE with linear population size reduction (LSHADE) [31], Covariance Matrix Adaptation Evolution Strategy (CMAES) [32], Biogeography-Based Optimization (BBO) [33], Farmland Fertility Algorithm (FFA) [34], Cosine Swarm Algorithm (CSA) [35], and Young Double-Slit Experiment optimizer (YDSE) [36].

Although most of the aforementioned MAs have been extensively deployed to address optimization problems, poor convergence and entrapment in local optima remain prevalent issues. The growth of challenging optimization problems has therefore led to the development of new optimization algorithms that consistently generate better outcomes, as well as to amendments of existing techniques. The "No Free Lunch" (NFL) theorem [37] applies in these circumstances: it demonstrates that no single optimization technique can effectively address every optimization challenge. Motivated by the NFL theorem, many scholars have proposed new nature-inspired meta-heuristic algorithms. In this study, we consider the Mountain Gazelle Optimizer (MGO) [38], proposed by Benyamin Abdollahzadeh et al. in 2022, a compelling MA and an alternative for global optimization. MGO draws its inspiration from the hierarchy and social organization of wild mountain gazelles and is mathematically modeled on this basis, using fundamental factors that include maternity herds, bachelor male herds, territorial solitary males, and migration in search of food. The exploitation (intensification) and exploration (diversification) stages of MGO are performed concurrently through four processes. Despite MGO's strong performance, simulations show that on high-dimensional problems the early iterations may become stuck in less desirable regions of the search space, indicating that the population may lack diversity, although promising regions are discovered after a few cycles [39]. Examining the advantages and disadvantages of the MGO algorithm and improving its effectiveness through new or modified mechanisms have therefore become interesting research challenges [40, 41]. Chaos theory is one such concept; it is widely used in non-linear dynamic systems owing to its ergodicity, unpredictability, and regularity properties. It has been applied effectively in technical optimization procedures, since chaotic initialization increases population diversity and helps algorithms avoid local optima and premature convergence [42]. However, the literature has not yet offered a strong theoretical foundation for improving the effectiveness of meta-heuristic algorithms with chaotic maps; instead, numerous chaotic maps have been applied empirically to enhance variants of meta-heuristic algorithms [43]. Inspired by the benefits of chaotic maps and the shortcomings of MGO explained above, this paper proposes a chaos-based algorithm termed the Chaotic Mountain Gazelle Optimizer (CMGO). Chaotic maps are incorporated into MGO to enhance its diversification and intensification while preventing premature convergence and local-optimum traps. A detailed experimental study is carried out later in the article to demonstrate the robustness of the newly designed CMGO approach. The main contributions of the paper are as follows:

  • In this work, the MGO algorithm is combined with ten distinct chaotic maps to introduce the CMGO technique for the first time.

  • Chaos theory enhances the exploitation ability of the classical MGO and helps it avoid falling into local optima.

  • The effectiveness of the CMGO approach has been assessed on 23 standard benchmarks of CEC2005 and 10 complex functions of CEC2019.

  • To validate that the suggested CMGO is statistically superior, the t-test and Wilcoxon statistical tests have been performed.

  • For evaluating the proposed CMGO algorithm’s problem-solving capability, four real engineering design challenges are addressed.

The remainder of the paper is organized as follows: Sect. 2 describes related works on modified MAs and the hybridization of chaotic maps. Section 3 gives details of the Mountain Gazelle Optimizer and chaotic maps. Section 4 describes the proposed CMGO algorithm. The numerical experiments and result analysis are presented in Sect. 5. Section 6 describes the application of the CMGO algorithm to practical engineering challenges, and Sect. 7 presents the conclusions and suggests a few ideas for further research.

2 Related Work

An efficient meta-heuristic strikes a good balance between exploration and exploitation to preserve population diversity and increase the algorithm's reliability and rate of convergence. Hence, several attempts have been made in the past to increase the efficiency of existing meta-heuristics. The African Vultures Optimization Algorithm (AVOA) has been enhanced with a Quantum Rotation Gate mechanism that increases population diversity and improves escape from local traps [44]. A ResNet50 Convolutional Neural Network (CNN) model hybridized with Particle Swarm Optimization (PSO) has been proposed to identify different subtypes [45]. To overcome the shortcomings of WOA, an alternative based on multi-population evolution (MEWOA) has been proposed recently [46]. Similarly, chaotic functions are frequently used to balance the exploration and exploitation phases of meta-heuristic optimization algorithms; embedding chaotic maps can increase both convergence speed and diversity. Chaos theory characterizes nonlinear systems [47] and is defined as the apparent randomness generated by deterministic systems. Many researchers have added chaos theory to different meta-heuristic optimization algorithms to increase their ability to obtain the optimum solution. In this section, some hybridized, modified, and chaos-based algorithms are discussed. Gupta et al. [48] proposed the OCS-GWO algorithm, which enhances the original GWO by introducing opposition-based learning to move candidate solutions closer to the global optimum and a chaotic local search to exploit the search regions efficiently; in OCS-GWO, the chaotic local search balances the exploration and exploitation operators that underlie any stochastic search algorithm. Kumar et al. [49] proposed a chaotic teaching-learning-based algorithm to address premature convergence and the lack of trade-off between local and global search. Li et al. [50] suggested a chaotic arithmetic optimization algorithm (AOA) to improve the exploration and exploitation capabilities of AOA: first, ten chaotic maps are separately embedded into the two AOA parameters, the Arithmetic Optimization Accelerator (MOA) and the Arithmetic Optimization Probability (MOP); then a combination test is carried out by embedding the ten chaotic maps into MOA and MOP at the same time. Kaur et al. [51] introduced chaos theory into WOA to enhance its global convergence speed. The fast random opposition-based learning Golden Jackal Optimization algorithm (FROBL-GJO), introduced by Mohapatra et al. in 2023 [52], enhances the precision and convergence speed of the GJO algorithm. Botnet detection in the Internet of Things is addressed by an innovative binary multi-objective dynamic Harris Hawks Optimization (HHO) improved with a mutation operator (MODHHO), proposed by Gharehchopogh et al. [53]. A multi-strategy improved Harris Hawks optimization method has been presented by Gharehchopogh for social network community discovery [54]. Chandran and Vanisree [55] incorporated Enhanced Opposition-Based Learning into the Grey Wolf Optimizer to form EOBGWO, a novel technique that boosts the efficiency of the traditional GWO method. An improved version of the Golden Jackal Optimization (GJO) algorithm incorporates the opposition-based learning (OBL) approach with a probability rate, enabling the algorithm to escape from local optima [56].
The chaotic IAS (Interactive Autodidactic School) model proposed by Gharehchopogh et al. [57] utilizes ten chaotic maps and an intra-cluster summation fitness function to enhance the results of the IAS algorithm. An improved chaotic PSO algorithm based on adaptive inertia weight (AIWCPSO) enhances population diversity and particle periodicity by appropriately generating the initial population [58]. The Chaotic Marine Predators Algorithm (CMPA), a novel metaheuristic introduced by Sumit et al. [59], optimizes engineering problems by combining the exploration capabilities of the Marine Predators Algorithm with the exploitation capabilities of chaotic maps. Ten new chaotic maps have been incorporated into SHO to improve its performance by producing chaotic values instead of random ones, aiming to increase convergence speed and avoid local optima [60].

3 Overview of MGO and Chaotic Maps

3.1 Mountain Gazelle Optimizer (MGO)

Mountain Gazelle Optimizer is a novel nature-inspired algorithm proposed in 2022 [38], inspired by the social structure and hierarchy of wild mountain gazelles. A mathematical representation of the MGO algorithm has been developed from the fundamental ideas underlying the social and group behavior of mountain gazelles. The four primary aspects of the mountain gazelle's life (territorial solitary males, maternity herds, bachelor male herds, and migration in search of food) are employed by the MGO algorithm to execute the optimization procedure. Figure 1 illustrates a herd of mountain gazelles, known for their continuous long-distance journeys in search of food. Indigenous to the Arabian Peninsula and its surroundings, these gazelles have a widespread distribution across the region despite their low population density. Their habitat shares a close connection with the habitat of the Robinia tree species.

Fig. 1 Mountain Gazelle herd

3.1.1 Territory Solitary Males

When mountain gazelle males are powerful enough and have reached adulthood, they establish solitary territories, which they protect fiercely, with considerable distances separating the territories. Adult male gazelles engage in conflict over ownership of the territory or the females: the young males attempt to claim either the females or the territory, while mature males work to preserve their habitat. The territory of an adult male is modeled as follows:

$$\begin{aligned} TSM&={male}_{gazelle}-\left| ri_1*BH-ri_2*X(t)*F\right| *Cof_{r} \end{aligned}$$
(1)
$$\begin{aligned} BH&=X_{ra}*\left\lfloor r_1\right\rfloor +M_{pr}*\big \lceil r_2\big \rceil ,\ ra =\left\{ \bigg \lceil \frac{N}{3}\bigg \rceil \ldots N\right\} \end{aligned}$$
(2)
$$\begin{aligned} F&=N_1\left( D\right) *{\rm exp} \left( 2-it*\left( \frac{2}{Maxit}\right) \right) \end{aligned}$$
(3)
$$\begin{aligned} {Cof}_i&=\left\{ \begin{array}{l} (a+1)+r_3, \\ a*N_2(D), \\ r_4(D), \\ N_3(D)*{N_4(D)}^2*{\rm cos}\left( 2r_{4}N_3(D)\right) \end{array}\right. \end{aligned}$$
(4)
$$\begin{aligned} a&=-1+it*\frac{-1}{Maxit}. \end{aligned}$$
(5)

Here, \({male}_{gazelle}\) denotes the position vector of the best adult male. BH is the coefficient vector of the young male herd. \(ri_1\) and \(ri_2\) are random integers taking the value 1 or 2. \(X_{ra}\) is a young male (solution) selected at random within the range ra, while \(M_{pr}\) is the mean of \(\big \lceil \frac{N}{3}\big \rceil\) randomly selected members of the population. \(Cof_i\) is a randomly generated coefficient vector that is updated after every iteration to increase the effectiveness of the search. N is the total number of gazelles, while \(r_1\), \(r_2\), \(r_3\) and \(r_4\) are random values between 0 and 1. cos denotes the cosine function and exp the exponential function. \(N_1\), \(N_2\), \(N_3\) and \(N_4\) are random numbers drawn from the standard normal distribution, and D is the dimension of the problem. Finally, Maxit and it denote the maximum number of iterations and the current iteration, respectively.
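To make the notation above concrete, the following minimal Python sketch (illustrative names; the population is assumed to be stored as an N-by-D NumPy array and `rng` is a `numpy.random.Generator`, e.g. `np.random.default_rng(0)`) evaluates Eqs. (1)-(5) for a single gazelle. It is a sketch under these assumptions, not the authors' reference implementation.

```python
import numpy as np

def coefficient_vector(a, D, rng):
    """Randomly pick one of the four Cof variants of Eq. (4)."""
    r3, r4 = rng.random(), rng.random()
    N2, N3, N4 = (rng.standard_normal(D) for _ in range(3))
    options = [
        (a + 1.0) + r3,                        # first case of Eq. (4)
        a * N2,                                # second case
        rng.random(D),                         # third case: r4(D)
        N3 * N4 ** 2 * np.cos(2.0 * r4 * N3),  # fourth case
    ]
    return options[rng.integers(4)]

def tsm_update(X, x, best, it, max_it, rng):
    """Territorial Solitary Male candidate of Eq. (1) for one gazelle x = X(t)."""
    N, D = X.shape
    young = X[rng.integers(int(np.ceil(N / 3)), N)]                      # X_ra, Eq. (2)
    M_pr = X[rng.choice(N, int(np.ceil(N / 3)), replace=False)].mean(axis=0)
    BH = young * np.floor(rng.random()) + M_pr * np.ceil(rng.random())   # Eq. (2)
    F = rng.standard_normal(D) * np.exp(2.0 - it * (2.0 / max_it))       # Eq. (3)
    a = -1.0 + it * (-1.0 / max_it)                                      # Eq. (5)
    ri1, ri2 = rng.integers(1, 3), rng.integers(1, 3)                    # values 1 or 2
    return best - np.abs(ri1 * BH - ri2 * x * F) * coefficient_vector(a, D, rng)
```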

3.1.2 Maternity Herds

Mountain gazelles depend on maternity herds to produce strong male offspring, which is a crucial part of their life cycle. Adult male gazelles may also be present around the birth of young, while young males attempt to seize the females. This behavior is articulated as:

$$\begin{aligned} MH=(BH+{Cof}_{1,r})+\left( ri_3*{male}_{gazelle}-{ri}_4 *X_{rand}\right) *{Cof}_{1,r} \end{aligned}$$
(6)

Here, \(ri_3\) and \(ri_4\) are random integers taking the value 1 or 2, and \(X_{rand}\) is the position vector of a gazelle chosen arbitrarily from the whole population.

3.1.3 Bachelor Male Herd

Male gazelles tend to establish territories and take control of female gazelles as they mature. Considerable violence may be involved when young male gazelles engage in this conflict with adult males for dominance over the females' territory. The mathematical model of this behavior is expressed below:

$$\begin{aligned} BMH&=\left( X\left( t\right) -D\right) +\left( ri_5*{male}_{gazelle}-{ri}_6 *B H\right) *{Cof}_r \end{aligned}$$
(7)
$$\begin{aligned} D&=\left( \left| X\left( t\right) \right| +\left| {male}_{gazelle}\right| \right) *\left( 2*r_6-1\right) . \end{aligned}$$
(8)

Here, \(ri_5\) and \(ri_6\) are random integers taking the value 1 or 2, and \(r_6\) is a random number between 0 and 1. X(t) denotes the position vector of a gazelle in the current iteration.

3.1.4 Migration in Search of Food

Mountain gazelles are perpetually on the lookout for sources of food and will travel large distances to seek food and migrate; they also have a rapid running gait and strong leaping ability. This behavior is described by the following equation:

$$\begin{aligned} MSF=\left( ub-lb\right) *r_7+lb. \end{aligned}$$
(9)

Here, \(r_7\) is a random number between 0 and 1, and lb and ub denote the lower and upper bounds, respectively. An overview of the optimization method based on the four MGO components is displayed in Fig. 2. By its nature, the MGO algorithm carries out the exploitation and exploration phases simultaneously by employing the four mechanisms in parallel.
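Building on the `coefficient_vector` helper sketched in Sect. 3.1.1, one full MGO generation can be illustrated as follows: each gazelle produces TSM, MH, BMH and MSF candidates (Eqs. 1, 6, 7 and 9), the enlarged pool is evaluated, and the best N solutions survive. This mirrors the structure in Fig. 2 but is only an illustrative Python sketch, not the published code.

```python
import numpy as np

def mgo_generation(X, fit, obj, lb, ub, it, max_it, rng):
    """One illustrative MGO generation; `coefficient_vector` is the Eq. (4)
    helper from the previous sketch and `obj` is the objective function."""
    N, D = X.shape
    best = X[np.argmin(fit)]
    a = -1.0 + it * (-1.0 / max_it)                                         # Eq. (5)
    new = []
    for i in range(N):
        cof = lambda: coefficient_vector(a, D, rng)
        young = X[rng.integers(int(np.ceil(N / 3)), N)]
        M_pr = X[rng.choice(N, int(np.ceil(N / 3)), replace=False)].mean(axis=0)
        BH = young * np.floor(rng.random()) + M_pr * np.ceil(rng.random())  # Eq. (2)
        F = rng.standard_normal(D) * np.exp(2.0 - it * (2.0 / max_it))      # Eq. (3)
        ri = rng.integers(1, 3, size=6)                                     # values 1 or 2
        x_rand = X[rng.integers(N)]
        TSM = best - np.abs(ri[0] * BH - ri[1] * X[i] * F) * cof()          # Eq. (1)
        MH = (BH + cof()) + (ri[2] * best - ri[3] * x_rand) * cof()         # Eq. (6)
        Dst = (np.abs(X[i]) + np.abs(best)) * (2.0 * rng.random() - 1.0)    # Eq. (8)
        BMH = (X[i] - Dst) + (ri[4] * best - ri[5] * BH) * cof()            # Eq. (7)
        MSF = (ub - lb) * rng.random(D) + lb                                # Eq. (9)
        new.extend([TSM, MH, BMH, MSF])
    pool = np.clip(np.vstack([X] + new), lb, ub)
    pool_fit = np.array([obj(p) for p in pool])
    keep = np.argsort(pool_fit)[:N]
    return pool[keep], pool_fit[keep]
```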

Fig. 2 An overview of the optimization procedure based on the four components of MGO

3.2 Chaotic Maps

Chaos is a phenomenon in which a non-linear system exhibits large changes in behavior in response to minor changes in its initial conditions. Mathematically, it may be viewed as apparently random behavior produced by a fundamentally deterministic dynamic system, and it can therefore be regarded as a source of randomness. Chaos possesses the characteristics of non-repetition and ergodicity; as a result, it can search at a faster rate than purely probabilistic random searches. Chaos also exhibits regularity, as it is produced by specific functions called chaotic functions or chaotic maps. These chaotic maps are extensively utilized in numerical analysis, image encryption, and cryptology, as well as in modeling complex problems in medicine, ecology, economics, and engineering [61]. In this study, chaotic sequences are produced using 10 discrete, non-invertible, one-dimensional maps. Table 1 lists the mathematical models of these well-known chaotic maps, and Fig. 3 illustrates the stochastic variations produced by the deterministic chaotic map equations. According to observations, \(Cof_i\) is crucial for maintaining an appropriate balance in MGO between the intensification and diversification phases; MGO's exploitation and convergence speed can therefore be significantly improved by incorporating chaotic maps.
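For illustration, three of the one-dimensional maps commonly used in such studies can be generated as below; the exact parameterizations of all ten maps are those listed in Table 1, so the constants here (e.g. the 0.7 breakpoint of the tent map) should be read as typical values rather than the paper's definitive choices.

```python
import numpy as np

def logistic(x):   # x_{k+1} = 4 x_k (1 - x_k)
    return 4.0 * x * (1.0 - x)

def tent(x):       # piecewise-linear tent map with breakpoint 0.7
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def sine(x):       # x_{k+1} = (a/4) sin(pi x_k), with a = 4
    return np.sin(np.pi * x)

def chaotic_sequence(step, x0=0.7, length=500):
    """Iterate a chaotic map from the initial point x0 and collect the values."""
    seq, x = [], x0
    for _ in range(length):
        x = step(x)
        seq.append(x)
    return np.array(seq)

print(chaotic_sequence(logistic, x0=0.7)[:5])   # chaotic values in (0, 1)
```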

Fig. 3 Stochastic variations of ten different chaotic maps

Table 1 Chaotic maps and their mathematical expression

4 Proposed Chaotic MGO (CMGO)

Chaotic maps have a significant impact on the quality of the solutions generated by optimization algorithms. Despite having a good convergence rate, MGO is still unable to outperform other methods in locating global optima, which affects the algorithm's convergence. Therefore, the CMGO algorithm was established by incorporating chaos into the MGO algorithm to improve the efficiency of the search for the global optimum and to prevent entrapment in local optima. In the MGO algorithm, four strategies are applied to simultaneously carry out the exploration (diversification) and exploitation (intensification) stages. According to observations on MGO, the coefficient vector (\(Cof_i\)) is crucial for maintaining an appropriate balance between the intensification and diversification phases, and MGO's exploitation and convergence speed can be significantly improved by incorporating chaotic maps. The mathematical expressions of three of these strategies are given by Eqs. (1), (6), and (7). To enhance the exploration and exploitation phases of the MGO algorithm, the proposed CMGO approach generates the values of these coefficient variables using 10 distinct chaotic maps, as displayed in Fig. 4. Chaos variables generated by the chaotic maps are used in place of the coefficient factors: the values obtained from chaotic maps follow a deterministic (ordered) sequence, whereas the original coefficient factors are purely random, which is why chaotic maps are incorporated into optimization techniques to improve their efficacy [49]. Mathematically, Eqs. (1), (6) and (7) of MGO are transformed into Eqs. (10), (11) and (12), resulting in the proposed CMGO.

$$\begin{aligned} TSM&={male}_{gazelle}\nonumber \\&\quad -\left| ri_1*BH-ri_2*X(t)*F\right| *Chaos(t) \end{aligned}$$
(10)
$$\begin{aligned} MH&=(BH+Chaos(t))\nonumber \\&\quad +\left( ri_3*{male}_{gazelle}-{ri}_4 *X_{rand}\right) *{Chaos(t)} \end{aligned}$$
(11)
$$\begin{aligned} BMH&=\left( X\left( t\right) -D\right) \nonumber \\&\quad +\left( ri_5*{male}_{gazelle}-{ri}_6 *B H\right) *{Chaos(t)} \end{aligned}$$
(12)

Here, Chaos(t) is utilized as the chaotic variable instead of the coefficient vector. The ten different chaotic maps are used simultaneously in the formation of CMGO. Algorithm 1 presents the pseudo-code of CMGO, and Fig. 5 displays the flowchart of the proposed CMGO algorithm.
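Conceptually, the change is local: wherever MGO draws the coefficient vector \(Cof_r\), CMGO substitutes the current value of the chosen chaotic sequence. A minimal sketch of this substitution (reusing the logistic map as an example; the helper names are illustrative):

```python
import numpy as np

def logistic(x):
    return 4.0 * x * (1.0 - x)

def chaos_schedule(x0=0.7, length=500):
    """Pre-compute Chaos(t), one value per iteration, to replace Cof_r."""
    seq, x = [], x0
    for _ in range(length):
        x = logistic(x)
        seq.append(x)
    return np.array(seq)

chaos = chaos_schedule(length=500)

# Inside the generation loop, the MGO updates become (cf. Eqs. 10-12):
#   TSM = best - np.abs(ri1 * BH - ri2 * X[i] * F) * chaos[it]
#   MH  = (BH + chaos[it]) + (ri3 * best - ri4 * x_rand) * chaos[it]
#   BMH = (X[i] - Dst) + (ri5 * best - ri6 * BH) * chaos[it]
```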

Fig. 4 Implementation of 10 different chaotic maps in MGO

Algorithm 1 The proposed CMGO algorithm

Fig. 5 The flowchart of the proposed CMGO algorithm

5 Numerical Experiments and Result Analysis

This section presents the simulation results and a comprehensive analysis of the newly introduced CMGO algorithm for addressing global optimization problems with less time and greater accuracy. The proposed CMGO was assessed on 33 distinct benchmark functions (23 from CEC2005 [62] and 10 from CEC2019 [63]) and applied to four real-world engineering design problems to measure its performance. The chaotic maps were first embedded in MGO individually to generate CMGO1 to CMGO10, corresponding to the Chebyshev, circle, Gauss/mouse, iterative, logistic, piecewise, sine, singer, sinusoidal, and tent maps, respectively, and each variant was compared with MGO. Furthermore, the validation of the CMGO algorithm involved both the t-test [64] and the Wilcoxon rank-sum test [65]; these statistical assessments were performed to verify the robustness and effectiveness of the proposed CMGO algorithm. The results obtained by CMGO on the 33 comparison functions were contrasted with those of the original MGO and with widely recognized meta-heuristic optimization algorithms from the literature, namely PSO [8], GWO [12], DE [7], LSHADE [31], CMAES [32], FFA [34], and WSO [18]. For each function, 30 independent runs were conducted, employing 30 search agents and 500 iterations. The CMGO algorithm was implemented in MATLAB R2021b and executed on an 11th Gen Intel Core CPU (2.42 GHz) with 16 GB of RAM running Windows 11.
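The reporting protocol can be summarized by the short sketch below, where `optimize` stands for any of the compared algorithms (its signature is hypothetical) and each function/algorithm pair is reduced to the mean and standard deviation of the best values over the 30 runs.

```python
import numpy as np

def benchmark(optimize, obj, dim, runs=30, agents=30, iters=500):
    """Run an optimizer `runs` times and report the mean and standard
    deviation of the best objective values, as in Tables 6-9."""
    best_values = np.array([optimize(obj, dim, iters, agents, seed=r)
                            for r in range(runs)])
    return best_values.mean(), best_values.std()
```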

5.1 Test Functions and Parameter Settings

A test function is an artificial problem that may be employed to estimate how well an algorithm works under various challenging conditions. For the purpose of assessing CMGO performance, the set of 23 benchmark functions from CEC2005 [62] is separated into three groups. The first group (F1–F7) consists of unimodal functions with a single global optimum and is used to evaluate the exploitation (intensification) ability of the algorithm. The second group (F8–F13) consists of multi-modal functions with multiple local extrema and is employed to evaluate an algorithm's capacity to avoid stagnation in sub-optimal regions. Finally, the third group (F14–F23) consists of fixed-dimension multi-modal functions with several optima and fewer local minima, used to assess the algorithm's exploration (diversification) ability. The definitions of the benchmark functions and their global optimum values are presented in Tables 2, 3 and 4.
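For reference, two representative members of this suite are written below under the conventional definitions, in which F1 is the Sphere function and F9 is Rastrigin's function:

```python
import numpy as np

def f1_sphere(x):
    """Unimodal F1 (Sphere): a single global minimum f(0) = 0."""
    return np.sum(x ** 2)

def f9_rastrigin(x):
    """Multi-modal F9 (Rastrigin): many local minima, global minimum f(0) = 0."""
    return np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0)
```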

Table 2 The details of unimodal functions
Table 3 The details of multi-modal functions
Table 4 The details of fixed-dimension multi-modal functions

The complexity of the 10 functions in CEC2019 [63] is significantly higher than that of the standard functions, and evaluating algorithms on them further emphasizes the need to generate accurate outcomes. The mean and standard deviation (std) are the measurements most frequently reported for these benchmark functions; the standard deviation verifies that the performance of the algorithms remains consistent across the thirty different runs. Setting the algorithm parameters to their default values is a prudent and appropriate choice, as illustrated in Table 5.

Table 5 Parameter settings of comparison algorithms

5.2 Performance of Different Chaotic Maps with MGO

This subsection describes the approach of employing the 10 alternative chaotic maps separately to enhance and refine the performance of the MGO algorithm. The results of the different chaotic MGO variants and the original MGO on the 23 functions of CEC2005 and the 10 functions of CEC2019, with the average (Mean) and standard deviation (Std) for each test function, are presented in Tables 6 and 7, respectively, where the best mean values are highlighted in boldface. As observed in Table 6, the optimization efficiency of CMGO on the unimodal functions F1–F7 is considerably increased, indicating that CMGO has excellent exploitation capability and can discover the best global solution (F1–F5 and F7). The best outcomes are achieved by CMGO2–CMGO10 for F1, CMGO3–CMGO10 for F2, CMGO1 for F3, CMGO8–CMGO9 for F4, CMGO6 and CMGO9 for F5, and CMGO10 for F7. In the group of multi-modal functions, F8, F9 and F11 show similar performance for MGO and all chaotic MGO variants, while CMGO2–CMGO10 for F10 and CMGO6 for F13 outperform MGO. This demonstrates that the newly proposed CMGO has excellent exploration capability and flexibility when dealing with multi-modal functions. In the group of fixed-dimension multi-modal functions, F14 and F16 achieve the same performance for MGO and all chaotic MGO variants; for the remaining functions, except F15 and F18, CMGO1 yields better results than MGO. The three enhanced mechanisms (Eqs. 10-12) increase the algorithm's optimization performance and convergence speed. Moreover, the optimization accuracy of the CMGO method on the CEC2019 functions is improved to varying degrees compared with the original MGO algorithm. Based on the results in Table 7, CMGO1 for cec02, cec03 and cec06, along with CMGO2 for cec10, achieved the best outcomes compared with MGO. Therefore, from the aforementioned experiments, the proposed chaotic MGO (CMGO) performs better on most functions and is appropriate for evaluating exploration (diversification) and exploitation (intensification) capability.

Table 6 Results on unimodal, multi-modal and fixed-dimension multi-modal functions with different chaotic maps
Table 7 Results on CEC2019 benchmark functions with different chaotic maps

5.3 Performance of CMGO with Other State-of-the-Art Algorithms

This study compares the performance and overall efficiency of the proposed CMGO technique against several top-performing algorithms, including PSO, GWO, DE, LSHADE, CMAES, FFA, WSO, and MGO. The outcomes presented in Table 8 demonstrate that the newly designed CMGO algorithm outperformed the comparative algorithms for the majority of the unimodal functions in the first group (F1–F5 and F7); the CMGO approach is therefore well suited to assessing exploitation potential. On the second group of multi-modal functions, the suggested CMGO technique produces competitive results for F9, F11, and F13, whereas WSO and MGO offer superior results for F8 and F12, respectively. This reveals that the proposed CMGO has outstanding exploration and adaptability when addressing multi-modal functions. On the group of fixed-dimension multi-modal functions, the designed CMGO algorithm achieved superior results for F14, F21, F22, and F23, while achieving comparable results for F16 and F17. This experimental investigation illustrates how the CMGO algorithm appropriately blends exploitation and exploration. However, Table 9 reveals that the proposed CMGO algorithm performs best on only a few CEC2019 functions, such as cec02, cec03, and cec10, whereas LSHADE performs better than the CMGO algorithm on the remaining ones. In general, the suggested CMGO delivers complementary benefits that improve global search capacity, enabling it to identify accurate solutions.

Table 8 Results on unimodal, multi-modal and fixed-dimension multi-modal functions with other state-of-art algorithms
Table 9 Results on CEC2019 benchmark functions with recent and topmost algorithms

5.4 Sensitivity Analysis

The sensitivity of the CMGO algorithm is examined to assess the quality of the proposed method. Here, the impact of varying the parameters of CMGO on its behavior is investigated. Generally speaking, the initial point of the chaotic map is the only major parameter in CMGO; hence, several scenarios are constructed based on the value of this factor, which is assessed at 0.1, 0.3, and 0.7. The test functions selected for this experiment are F1, F4, and F7. The graphical and statistical results for each scenario are presented in Fig. 6 and Table 10, respectively. The outcomes demonstrate that the CMGO approach yields excellent results when the initial point (IP) is set to 0.7.

Fig. 6 Sensitivity analysis of CMGO algorithm

Table 10 Sensitivity analysis of CMGO with different values of parameter

5.5 Statistical Analysis

The purpose of the statistical tests is to identify significant differences between the suggested CMGO approach and other well-established methods. In this work, the benchmark results of all algorithms are statistically analyzed using the commonly used t-test [64] and the non-parametric Wilcoxon rank-sum test [65]. The t-values for each function are determined by considering the two approaches being compared and are presented in Tables 8 and 9; the last row of these tables gives the win/tie/loss totals for CMGO, which show that it produces more efficient outcomes than the other meta-heuristics. Additionally, the outcomes of the two-tailed Wilcoxon rank-sum test are presented in Tables 11 and 12. The Wilcoxon rank-sum test is a non-parametric statistical test that assesses whether two sets of results differ significantly; it is performed here at a 5% significance level. Throughout the experiment, when the p-value \(< 0.05\), the W-value is recorded as '\(+\)', indicating a statistically significant difference between the two groups of data; otherwise the W-value is recorded as '−', indicating no significant difference. NA stands for "not applicable" and is denoted by '\(=\)', indicating that the proposed CMGO algorithm performs equivalently to the compared meta-heuristic. From Table 11, it is observed that the proposed CMGO algorithm shows a significant difference for functions F1–F8, F17, F19 and F22 compared with all other algorithms. However, compared with MGO, CMGO shows no significant difference for functions F10 and F20, and compared with LSHADE, FFA, WSO and MGO, CMGO shows equivalent outcomes for functions F9 and F14. Therefore, when compared with prominent algorithms, the suggested CMGO demonstrates outstanding results for the majority of the functions. As observed in Table 12, the proposed CMGO shows a significant difference for functions cec01, cec05 and cec06 when compared with the other meta-heuristics; compared with GWO, CMAES, and FFA, CMGO shows significant results on most functions except cec07 and cec08. In general, based on the aforementioned statistical analysis, CMGO proved to be the best optimizer, outperforming the other recent and top-performing algorithms on the majority of the problems considered.
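The decision rule described above can be sketched as follows (using SciPy's standard implementations; the tie handling via `np.allclose` is an illustrative simplification of the 'NA' case):

```python
import numpy as np
from scipy import stats

def compare(cmgo_runs, other_runs, alpha=0.05):
    """Compare two samples of best values (e.g. 30 runs each) with a t-test
    and the two-tailed Wilcoxon rank-sum test at the 5% significance level."""
    t_stat, _ = stats.ttest_ind(cmgo_runs, other_runs)
    _, w_p = stats.ranksums(cmgo_runs, other_runs)
    if np.allclose(cmgo_runs, other_runs):
        verdict = '='        # NA: equivalent performance
    elif w_p < alpha:
        verdict = '+'        # statistically significant difference
    else:
        verdict = '-'        # no significant difference
    return t_stat, w_p, verdict
```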

Table 11 Statistical results obtained for the Wilcoxon rank-sum test on CEC2005 test functions
Table 12 Statistical results obtained for the Wilcoxon rank-sum test on CEC2019 test functions

5.6 Convergence Analysis

This subsection provides a graphical investigation for a more comprehensive evaluation of the performance of all the algorithms. Line graphs showing the convergence of the CMGO method and the other algorithms on several benchmark functions are displayed in Figs. 7 and 8, which make it simple to analyze the rate of convergence of each algorithm over the course of the iterations. These figures exhibit the convergence curves of CMGO for several unimodal, multi-modal, and fixed-dimension multi-modal functions, as well as the CEC2019 benchmark functions. As can be observed from the convergence curves in Fig. 7, CMGO tends to accelerate its convergence as the iterations progress; in other words, the search individuals identify desirable areas in the early iterations and then converge quickly. Functions F1–F5 and F7 exhibit this behavior. When solving the multi-modal functions, CMGO likewise shows an outstanding convergence rate and sustains a consistent rate of convergence in the final stage of the iterations, because the search agents cover the whole search space; functions F9, F11 and F13 exhibit this behavior clearly. However, CMGO achieves convergence rates comparable to the compared algorithms on most of the remaining functions, such as F17 and F21–F23. From the convergence curves in Fig. 8, it is clear that the CMGO algorithm has better convergence accuracy than the competing algorithms on certain functions, such as cec02, cec03, cec06, and cec10. These evaluations demonstrate that the CMGO gazelles first explore the search region to evaluate multiple potential destinations and then exploit the search area to seek the global optimal solution, which compels them to change locally rather than globally. As a result, the enhancements recommended in this study lead to a better balance between the exploration and exploitation abilities of the classical MGO algorithm, and because of these enhancements the convergence and search rate of CMGO outperform the other compared algorithms.

Fig. 7 Convergence curve of CEC2005 benchmark functions

Fig. 8 Convergence curve of CEC2019 benchmark functions

6 CMGO for Real-Life Engineering Problems

Engineering design is the technique of addressing the demands necessary to assemble a product. This decision-making process involves an intricate objective function and an extensive range of decision factors, including weight, strength, and wear. Real design challenges may have a substantial number of design variables and a complex, nonlinear impact on the objective function to be optimized. Therefore, four classical engineering design challenges are addressed in this section with the projected CMGO algorithm, and its outcomes are evaluated against other competing algorithms.

6.1 Tension/Compression Design Problem

The primary objective of this engineering design problem [66] is to minimize the weight of the spring, which is governed by three decision variables: the wire diameter (d), the mean coil diameter (D), and the number of active coils (N), as displayed in Fig. 9. The problem may be described numerically as follows. Suppose \(\vec {y}=[y_1, y_2,y_3]=[d,D,N]\)

Minimize \(f_1(\vec {y})=(y_3+2)y_2y_{1}^2\) Subject to

$$\begin{aligned}&c_1\left( \vec {y}\right) =1-\frac{{y_2^3y}_3}{71785y_1^4}\le 0 \end{aligned}$$
(13)
$$\begin{aligned}&c_2\left( \vec {y}\right) =\frac{{{4y}_2^2-y}_1y_2}{12566 ({y_2y_1^3-y}_1^4)}+\frac{1}{5108y_1^2}-1\le 0 \end{aligned}$$
(14)
$$\begin{aligned}&c_3\left( \vec {y}\right) =1-\frac{140.45y_1}{y_2^2y_3}\le 0 \end{aligned}$$
(15)
$$\begin{aligned}c_4\left( \vec {y}\right) =\frac{y_1+y_2}{1.5}-1\le 0 \end{aligned}$$
(16)
$$\begin{aligned}0.05\le y_{1}\le 2.00\nonumber \\&0.25 \le y_2\le 1.30\nonumber \\&2.00\le y_3\le 15.0 \end{aligned}$$
(17)
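A minimal Python sketch of how this formulation can be evaluated inside an optimizer, using a simple static penalty (the penalty weight rho is an illustrative choice, not a value from the paper):

```python
import numpy as np

def spring_objective(y):
    """y = [d, D, N]: wire diameter, mean coil diameter, number of active coils."""
    d, D, N = y
    return (N + 2.0) * D * d ** 2

def spring_constraints(y):
    """Constraint values c1..c4 of Eqs. (13)-(16); feasible when all are <= 0."""
    d, D, N = y
    return np.array([
        1.0 - (D ** 3 * N) / (71785.0 * d ** 4),
        (4.0 * D ** 2 - d * D) / (12566.0 * (D * d ** 3 - d ** 4))
        + 1.0 / (5108.0 * d ** 2) - 1.0,
        1.0 - 140.45 * d / (D ** 2 * N),
        (d + D) / 1.5 - 1.0,
    ])

def penalized_spring(y, rho=1e6):
    """Fitness handed to the optimizer: objective plus penalty for violations."""
    violations = np.maximum(spring_constraints(y), 0.0)
    return spring_objective(y) + rho * np.sum(violations ** 2)
```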
Fig. 9 The schematic design of Tension/compression spring problem

The results of the analysis are summarized in Table 13, demonstrating that CMGO can address this problem effectively by providing a better design than the compared algorithms, such as PSO, GWO, WOA, MFO, ABC, and MGO.

Table 13 The results obtained for solving tension/compression design problem

6.2 Gear Train Design Problem

The design of a gear train is an excellent illustration of a mixed-variable challenge, requiring the determination of design elements that are continuous, integer, and discrete [30]. The challenge is basically as follows: given a fixed input drive and a number of fixed output drive spindles, how can the output spindles be driven from the input using the least amount of connecting gearing in the train? The purpose of this problem is to minimize the cost of the gear ratio in the aforementioned mechanical system, depicted in Fig. 10. The problem contains four integer variables, \(T_A\), \(T_B\), \(T_C\), and \(T_D\), which represent the numbers of teeth on four gearwheels. The formulation of the challenge is given below.

Consider

$$\vec {y}=\left[ y_1\ y_2\ y_3\ y_4\right] =\left[ T_A\ T_B\ T_C\ T_D\right]$$

Minimize

$$f\left( \vec {y}\right) =\left( \frac{1}{6.931} -\frac{y_3y_2}{y_1y_4}\right) ^2$$

Variable range 12 \(\le y_1, y_2, y_3, y_4 \le 60\)
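Because all four variables are integers in [12, 60], the objective is cheap to evaluate; the sketch below defines it and runs a simple random search as an illustrative baseline (not the CMGO procedure itself):

```python
import numpy as np

def gear_ratio_error(teeth):
    """teeth = [T_A, T_B, T_C, T_D], integers in [12, 60]."""
    t_a, t_b, t_c, t_d = teeth
    return (1.0 / 6.931 - (t_c * t_b) / (t_a * t_d)) ** 2

rng = np.random.default_rng(0)
candidates = rng.integers(12, 61, size=(100000, 4))          # random integer gear sets
errors = np.array([gear_ratio_error(c) for c in candidates])
best = candidates[np.argmin(errors)]
print(best, gear_ratio_error(best))
```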

Fig. 10 The schematic design of Gear train design problem

In Table 14, the performance of CMGO is contrasted with that of PSO, GSA, GWO, MVO, WOA, and MGO. These results illustrate how CMGO outperforms the competing algorithms and generates outcomes comparable to MGO, showing that CMGO is adept at addressing various problems.

Table 14 The results obtained in solving Gear train design problem

6.3 Speed Reducer Design Problem

The objective of the speed reducer design challenge, which is schematically portrayed in Fig. 11, is to design a speed reducer with the least possible weight [67]. This problem is more challenging because it involves seven design variables: the face width (\(y_1\)), teeth module (\(y_2\)), number of pinion teeth (\(y_3\)), length of shaft 1 between bearings (\(y_4\)), length of shaft 2 between bearings (\(y_5\)), diameter of shaft 1 (\(y_6\)), and diameter of shaft 2 (\(y_7\)). The mathematical formulation of this design problem is outlined as follows.

Consider

$$\vec {y}=\left[ y_1\ y_2\ y_3\ y_4\ y_5\ y_6\ y_7\right] =[b\ m\ p\ l_1\ l_2\ D_1\ d_1]$$

Minimize

$$f\left( \vec {y}\right) =0.7854y_1y_2^2\left( 3.3333y_3^2 +14.9334y_3-43.0934\right) -1.508y_1\left( y_6^2+y_7^2\right) +7.4777\left( y_6^3+y_7^3\right) +0.7854\left( y_4y_6^2+y_5y_7^2\right)$$

Subject to

$$\begin{aligned}&c_1\left( \vec {y}\right) =\ \frac{27}{y_1y_2^2y_3}-1\le 0 \nonumber \\&c_2\left( \vec {y}\right) =\ \frac{397.5}{y_1y_2^2y_3^2}-1\le 0 \nonumber \\&c_3\left( \vec {y}\right) =\ \frac{1.93y_4^3}{y_2y_6^4y_3}-1\le 0 \nonumber \\&c_4\left( \vec {y}\right) =\ \frac{1.93y_5^3}{y_2y_7^4y_3}-1\le 0 \nonumber \\&c_5\left( \vec {y}\right) = \frac{\sqrt{\left( \frac{745y_4}{y_2y_3}\right) ^2+16.9 \ \times \ {10}^6}}{110y_6^3}-1\le 0 \nonumber \\&c_6\left( \vec {y}\right) =\ \frac{\sqrt{\left( \frac{745y_5}{y_2y_3}\right) ^2+157.5\ \times \ {10}^6}}{85y_7^3}-1\le 0 \nonumber \\&c_7\left( \vec {y}\right) =\frac{y_2y_3}{40}-1\le 0 \nonumber \\&c_8\left( \vec {y}\right) =\frac{{5y}_2}{y_1}-1\le 0 \nonumber \\&c_9\left( \vec {y}\right) =\frac{y_1}{12y_2}-1\le 0\nonumber \\&c_{10}\left( \vec {y}\right) =\frac{1.5y_6+1.9}{y_4}-1\le 0\nonumber \\&c_{11}\left( \vec {y}\right) =\frac{1.1y_7+1.9}{y_5}-1\le 0 \end{aligned}$$
(17)

where the ranges of the design variables b, m, p, \(l_1\), \(l_2\), \(D_1\), and \(d_1\) are given as

$$\begin{aligned}&2.6\le y_1\le 3.6\nonumber \\&0.7\le y_2\le 0.8\nonumber \\&y_3\in \left\{ 17,\ 18,\ 19,\ \ldots ,\ 28\right\} \nonumber \\&7.3\le y_4\le 8.3\nonumber \\&7.3\le y_5\le 8.3\nonumber \\&2.9\le y_6\le 3.9\nonumber \\&5\le y_7\le 5.5 \end{aligned}$$
(18)
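The objective and the eleven constraints translate directly into code; the sketch below reuses the same static-penalty scheme as in the spring example (penalty weight again illustrative):

```python
import numpy as np

def speed_reducer_objective(y):
    y1, y2, y3, y4, y5, y6, y7 = y
    return (0.7854 * y1 * y2 ** 2 * (3.3333 * y3 ** 2 + 14.9334 * y3 - 43.0934)
            - 1.508 * y1 * (y6 ** 2 + y7 ** 2)
            + 7.4777 * (y6 ** 3 + y7 ** 3)
            + 0.7854 * (y4 * y6 ** 2 + y5 * y7 ** 2))

def speed_reducer_constraints(y):
    """Constraint values c1..c11; feasible when all are <= 0."""
    y1, y2, y3, y4, y5, y6, y7 = y
    return np.array([
        27.0 / (y1 * y2 ** 2 * y3) - 1.0,
        397.5 / (y1 * y2 ** 2 * y3 ** 2) - 1.0,
        1.93 * y4 ** 3 / (y2 * y6 ** 4 * y3) - 1.0,
        1.93 * y5 ** 3 / (y2 * y7 ** 4 * y3) - 1.0,
        np.sqrt((745.0 * y4 / (y2 * y3)) ** 2 + 16.9e6) / (110.0 * y6 ** 3) - 1.0,
        np.sqrt((745.0 * y5 / (y2 * y3)) ** 2 + 157.5e6) / (85.0 * y7 ** 3) - 1.0,
        y2 * y3 / 40.0 - 1.0,
        5.0 * y2 / y1 - 1.0,
        y1 / (12.0 * y2) - 1.0,
        (1.5 * y6 + 1.9) / y4 - 1.0,
        (1.1 * y7 + 1.9) / y5 - 1.0,
    ])

def penalized_speed_reducer(y, rho=1e6):
    violations = np.maximum(speed_reducer_constraints(y), 0.0)
    return speed_reducer_objective(y) + rho * np.sum(violations ** 2)
```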
Fig. 11 The schematic design of Speed reducer design problem

Table 15 compares the best solutions obtained by CMGO with those of other optimizers such as PSO, GWO, WOA, MVO, SCA, and MGO. The statistical results summarized in Table 15 reveal that CMGO generated the best optimal solution among the competing algorithms.

Table 15 The results obtained in solving speed reducer design problem

6.4 Three-Bar Truss Design Problem

The three-bar truss design is a standard optimization problem in civil engineering [68]. Its primary objective is to find the best values of two parameters (the cross-sectional areas \(A_1\) and \(A_2\); a third area \(A_3=A_1\) follows by symmetry) so as to minimize the weight of a truss similar to the one in Fig. 12. The following equations express the truss design problem quantitatively:

Consider

$$\vec {y}=\left[ y_1\ y_2\ \right] =\left[ A_1\ A_2\right]$$

Minimize

$$f(\vec {y})=(2\sqrt{2}y_1+y_2)*l$$

Subject to

$$\begin{aligned}&c_1\left( \vec {y}\right) =\frac{\sqrt{2}y_1+y_2}{\sqrt{2}y_1^2+2y_1y_2}P -\sigma \le 0, \nonumber \\&c_2\left( \vec {y}\right) =\frac{y_2}{\sqrt{2}y_1^2+2y_1y_2}P-\sigma \le 0,\nonumber \\&c_3\left( \vec {y}\right) =\frac{1}{\sqrt{2}y_2+y_1}P-\sigma \le 0, \end{aligned}$$
(19)

Variable range \(0\le y_1\), \(y_2\le 1\)

Here, \(l=100\,cm\), \(\sigma ={2\,kN}/{cm}^2\), \(P={2\,kN}/{cm}^2\)
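The same pattern applies to the truss problem; the constants below encode l, P and sigma as given above, and the penalty handling is again an illustrative choice:

```python
import numpy as np

L_BAR, P_LOAD, SIGMA = 100.0, 2.0, 2.0   # l = 100 cm, P = sigma = 2 kN/cm^2

def truss_objective(y):
    a1, a2 = y
    return (2.0 * np.sqrt(2.0) * a1 + a2) * L_BAR

def truss_constraints(y):
    """Constraint values c1..c3 of Eq. (19); feasible when all are <= 0."""
    a1, a2 = y
    denom = np.sqrt(2.0) * a1 ** 2 + 2.0 * a1 * a2
    return np.array([
        (np.sqrt(2.0) * a1 + a2) / denom * P_LOAD - SIGMA,
        a2 / denom * P_LOAD - SIGMA,
        1.0 / (np.sqrt(2.0) * a2 + a1) * P_LOAD - SIGMA,
    ])

def penalized_truss(y, rho=1e6):
    violations = np.maximum(truss_constraints(y), 0.0)
    return truss_objective(y) + rho * np.sum(violations ** 2)
```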

Fig. 12 The schematic design of Three-bar truss design problem

The outcomes of the projected CMGO algorithm are compared with those of other optimizers such as PSO, GSA, GWO, MVO, WOA, and MGO to assess how effectively it solves this problem. The approximate optimal costs achieved by the proposed approach and the other algorithms are summarized in Table 16. As this table demonstrates, the outcomes of CMGO are comparable to those of all the other algorithms, and the optimal cost obtained by the suggested algorithm indicates its outstanding performance in addressing this challenge.

Table 16 The results obtained in solving the three-bar truss problem

7 Conclusions and Future Works

This study improves the recently proposed swarm-based Mountain Gazelle Optimizer (MGO) by incorporating 10 distinct chaotic maps to form a new algorithm, the Chaotic Mountain Gazelle Optimizer (CMGO), which accelerates convergence, prevents entrapment in local optima, and enhances MGO's capacity to balance the exploration and exploitation stages. First, the 10 chaotic maps were embedded in MGO individually, and the resulting variants were compared with one another and with MGO. Afterwards, the ten chaotic maps were incorporated into MGO simultaneously, and the result was compared with top-performing algorithms. The outcomes of the investigation indicate that, for the majority of the tested functions, the accuracy of the solutions generated by CMGO and its convergence rate were superior to those of the other optimization techniques. The engineering design evaluations reveal that the newly established chaos-based CMGO outperforms its competitors. Additionally, the Wilcoxon rank-sum test showed that CMGO has a p-value of less than 0.05 on the CEC2005 and CEC2019 benchmark functions, indicating statistically significant superiority over the competitor algorithms. Despite its strong performance, the main limitation of the proposed CMGO method is that it yields insignificant improvements on complex multi-modal functions and on some CEC2019 functions compared with its peers, which can also affect the computational time; when optimizing problems with varying degrees of uncertainty, it also lacks sufficiently diverse search strategies during the exploration and exploitation stages. In future work, the CMGO method may be applied to image processing, data mining, and design problems across a broad range of engineering disciplines. Furthermore, CMGO could be used to optimize neural network parameters to improve network performance.