Introduction

In the past few decades, many meta-heuristic search algorithms have been developed and have performed well on a variety of engineering optimization problems; as a result, they have attracted significant attention in recent years.

Many traditional numerical optimization algorithms rely on differential calculus and gradient descent. When solving complex optimization problems, such as discrete structure optimization1, water supply network design2 and the vehicle routing problem3, such algorithms therefore cannot obtain reasonable, accurate solutions within a limited time. With the continuous development of science and technology, practical engineering optimization problems are becoming more complex, larger in computational scale, and more heavily constrained, so the shortcomings of traditional optimization algorithms are increasingly apparent, and effective yet simple meta-heuristic search algorithms are increasingly important. Because a meta-heuristic search algorithm iterates by simulating the randomness and regularity of phenomena in nature and in life, it places few requirements on the mathematical form of the problem and has good global search ability, which makes it well suited to these optimization problems. For example, the genetic algorithm (GA) simulates biological evolution and natural selection, establishing selection, crossover and mutation operations according to biological evolution and reproducing the evolutionary process through iterative calculation4. Inspired by the foraging behavior of birds, the particle swarm optimization (PSO) algorithm treats the optimal solution of the problem as food and updates each bird's position according to the position of the bird closest to the food and to the bird's own best position, simulating a flock's continuing search for food through iterative calculation5. Mirjalili and Lewis proposed the whale optimization algorithm (WOA) by simulating whale predation on fish, using random or optimal individuals to model the hunting behavior of humpback whales6. Alsattar et al. simulated the hunting strategy and intelligent social behavior of bald eagles searching for prey and proposed the bald eagle search (BES) optimization algorithm7. On a variety of optimization problems, these algorithms outperform traditional numerical methods.

Harmony search (HS) is a population-based meta-heuristic search algorithm inspired by the way musicians repeatedly adjust the pitches of their instruments to reach a pleasing harmonic state. It has few adjustable parameters and is easy to implement. Owing to these advantages, HS has been successfully applied to a large number of optimization problems8,9, such as job-shop scheduling10,11,12, function optimization13,14,15,16,17,18,19,20,21,22,23,24,25, neural network training26, network design27 and the knapsack problem28. However, the HS algorithm suffers from slow convergence29, premature convergence and low optimization accuracy. Work on improving the performance of HS falls mainly into four areas: parameter adjustment, strategy design, hybridization with other algorithms, and application-oriented adaptation.

The main approaches to parameter adjustment include empirical, dynamic and adaptive parameter adjustment. Mahdavi et al. proposed an improved harmony search algorithm (IHS) that dynamically adjusts the PAR and BW parameters, effectively enhancing the search capability of the algorithm30. Pan et al. proposed a self-adaptive global best harmony search (SGHS) algorithm, which stores information in the global best solution and uses it to adjust the harmony memory consideration rate (HMCR) and pitch adjustment rate (PAR) according to a learning mechanism20. Khalili et al. eliminated the various parameters that the original algorithm requires to be defined before optimization, making HMCR and PAR dynamic: both increase gradually in the first half of the run and decrease gradually in the second half31. Zhu et al. proposed an improved difference-based harmony search algorithm with linear dynamic domain (ID-HS-LDD), in which an improved difference-based method adjusts the value of BW, and the variation of HMCR and PAR is split between dynamic adjustment and fixed values32. To minimize the thrust force in vibration-assisted drilling (VAD) of bovine cortical bone, Li et al. proposed an information feedback adaptive harmony search (IFSHS) algorithm, which evaluates the optimization parameters and feeds the evaluation information back into their dynamic adjustment33. Li et al. proposed an improved harmony search algorithm, called the adaptive global optimal harmony search algorithm (AGOHS), which overcomes the parameter-setting problem of traditional HS34. Mahmoudi et al. adjusted the bandwidth (BW), pitch adjustment rate (PAR) and other parameters of the harmony search algorithm through designed membership functions and fuzzy rules to improve the performance of the algorithm35. Loor et al. proposed an adaptive improved harmony search (AIHS) optimization algorithm with two adjustment stages, which designs a new dynamic BW and dynamic HMCR and uses the maximum and minimum vectors in the HM to adjust BW dynamically36. Cui et al. dynamically adjusted BW according to the square root of the mean of the global harmony vector after each update, obtaining better fine-tuning performance and convergence speed, and proposed the opposition-based learning parameter-adjusting harmony search (OLPDHS) algorithm37.

Improvements based on strategy design include the following works. To solve the team orienteering problem (TOP), Tsakirakis et al. added a new strategy called the "similarity process" to the standard HS procedure, which uses a preset similarity parameter SP and similarity matrix SM to prune the otherwise excessively large solution space, and proposed the similarity hybrid harmony search (SHHS) algorithm38. Li et al. proposed a novel enhanced harmony search (NEHS) algorithm, which divides the calculation into three stages with fine-tuning at three different BW levels to strengthen both the global and the local search capabilities, and which uses the global best and global worst harmony vectors to update the HM in the later generations of the HMC operation39. To make the HS algorithm better suited to the traveling salesman problem, Boryczka and Szwarc designed a mechanism that resets HM elements to improve neighboring solutions in the harmony memory, activated once the number of iterations reaches a preset value R40. To improve the initial harmony memory, Doush et al. made use of the nearest neighbor (NN) heuristic and a modified constructive NEH heuristic and proposed a modified harmony search algorithm with neighborhood heuristics (MHSNH); a further neighborhood heuristic is also applied in this algorithm to improve the resulting harmony vector41. Liu et al. proposed a modified harmony search (MHS) algorithm with a control strategy, in which effective mutation and crossover operators are used to avoid falling into local optima and an adaptive relation improves the flexibility of the crossover42. The basic HS algorithm is not suitable for large networks, complex constraints and high dimensions; Wang et al. therefore proposed the conductor harmony search (CHS) algorithm, which converts the time-series constraints of the day-ahead optimal scheduling problem into fractional constraints and a state matrix, using the state matrix to record the values of the time-series constraints43. In the improved harmony search algorithm with hybrid convergence mechanism (AHS-HCM) proposed by Zhu and Tang (2021), a convergence coefficient is introduced into the harmony to tune the optimization performance and a quadratic nonlinear convergence region is proposed for global search, overcoming the low accuracy of basic HS and its tendency to fall into local optima44. In 2021, Pan et al. proposed an adaptive surrogate-based harmony search algorithm (ASBHSA) to accelerate the design optimization of VSC structures; it integrates the GHS algorithm into the optimization framework and designs an adaptive sampling method that combines a distance-based infill criterion with radial basis functions (RBF)45. Li et al. proposed a new harmony search algorithm for image segmentation, in which a central harmony and a central crowding distance are introduced in the initial harmony memory stage to reduce local clustering of the initial solutions, and a new harmony generation strategy based on explicit harmony learning experience is constructed46.

In fact, many new meta-heuristic algorithms are built by combining existing ones47. Many scholars exploit the characteristics of existing heuristic algorithms and combine them with the harmony search algorithm to address the problems of traditional HS. For example, Fesanghary et al. proposed the hybrid harmony search (HHS) algorithm, which combines HS with sequential quadratic programming (SQP) to improve the local search efficiency and the precision of the algorithm48. Wang and Li proposed the co-evolutionary differential evolution with harmony search algorithm (CDEHS): in the first half of the run, two populations respectively apply the optimization mechanisms of differential evolution (DE) and of HS, and in the second half the HS population stops evolving and provides its best solution to the DE population49. To improve the convergence of the basic harmony search algorithm, Kayabekir et al. proposed a new hybrid harmony search (HHS) algorithm that combines the local search phase of HS with the global search phase of the flower pollination algorithm (FPA); compared with other heuristic algorithms, HHS needs fewer iterations and reaches smaller total potential energy values50. Shaikh et al. found that hybridizing HS with simulated annealing (SA) expands the search scope and prevents premature convergence of HS, and proposed the HS-SA algorithm, which searches widely in the early stage, far from nearby optima, to avoid falling into local optima51. Zhang et al. proposed a clustering-based improved harmony search (IC-HS) algorithm that applies a clustering algorithm when initializing the harmony memory, making the initial values more uniformly distributed, avoiding premature convergence and achieving faster convergence and higher efficiency52. To avoid falling into local optima or becoming unstable, Radman et al. combined bidirectional evolutionary structural optimization (BESO) with HS, using the stochastic characteristics of HS to reduce that risk53. Gong et al. combined harmony search and Tabu search so that their advantages and disadvantages complement one another: the global search strength of HS in the early and middle stages is combined with the neighborhood search of Tabu search to enhance the local search capability54. To improve the performance of HS, Amini and Ghaderi introduced the pheromone and heuristic values of the ant colony algorithm into HS, using them to regenerate a probabilistic mass function (PMF) that fills the HM, so that HS can avoid falling into local minima55. Similarly, to avoid local minima, Gheisarnejad et al. improved the spawning and migration operations of the cuckoo search algorithm (CSA) and added them to HS to create a new update method56.

The HS algorithm has also been innovatively applied to many practical problems. Mahmoudi et al. used an improved harmony search algorithm to optimize the loading pattern and rationally design pressurized water reactor core fuel management, increasing the fitness function value by 15.35% and improving the economy and safety of nuclear reactors35. Li et al. applied an improved harmony search algorithm to automated guided vehicle (AGV) scheduling, improving a mathematical model that includes the total travel distance of the AGVs and the standard deviation of the waiting time of the NC material buffers57. Gong et al. improved the layout planning of automated assembly lines and flexible manufacturing facilities through a new harmony Tabu search algorithm, and used it to explore the optimal construction conditions for the mechanical properties of composite particleboard54. Szwarc et al. proposed a new harmony search algorithm design method for the orienteering problem; experiments showed that its efficiency is notably higher than other algorithms, with an average error below 0.01%58. To optimize the layout of single-camera, multi-lens devices, Wang et al. optimized a 3D multiphase flow imaging device through an improved harmony search algorithm, which can measure both transparent and opaque objects at the center of the 3D observation area59. To obtain higher accuracy in fake news detection, Huang and Chen proposed an adaptive improved harmony search model that outperforms existing models, with a maximum accuracy of 99.4%60. Yong transformed the general LCP into a nonlinear equation by means of the NCP function and solved it with a new global harmony search algorithm (NGHS); numerical results show that this algorithm converges faster than other algorithms and overcomes the shortcomings of the interior point method61. Jeddi et al. proposed a new improved HS algorithm that solves robust dynamic distributed energy resource (DER) planning problems, balances exploration and exploitation, and improves the performance of dynamic DER planning in distribution networks62. Determining the optimal electricity variables under the most economical conditions is the key to meeting electricity demand; Maleki et al. therefore used the IHS algorithm to determine the optimal size of a hybrid energy system consisting of photovoltaic panels and batteries63. For gene selection and classification of high-dimensional medical data, Dash et al. proposed a gene selection method based on an adaptive harmony search algorithm to overcome the difficulty of gene selection in the large search space of microarray data64. Botella Langa et al. applied the harmony search algorithm to the optimal design of urban water distribution networks; experiments showed that, compared with other heuristic algorithms, harmony search obtains suitable solutions at a more reasonable computational cost65. Because the type of solar panel strongly affects the optimal size of a hybrid photovoltaic-battery solution, Liu et al. proposed a new global dynamic harmony search algorithm to determine the optimal size of hybrid photovoltaic-battery schemes66.
Due to the outbreak of COVID-19 in recent years, there is an international shortage of nurses, and producing convenient rosters for them is difficult. The nurse rostering problem is a well-known NP-hard combinatorial optimization problem; Hadwan therefore proposed an annealing harmony search algorithm (AHSA) to solve it67. In medicine and genetics, Tuo et al. developed and applied several refined modified harmony search algorithms to detect high-order SNP epistatic interactions; experimental comparisons show that these algorithms can effectively perform multiple detection tasks for high-order epistatic interactions and improve the recognition of different epistatic models68,69,70.

In the HS algorithm, the HM is updated by checking whether the new harmony produced in each generation is better than the worst harmony in the HM. The HM can therefore be regarded as an archive of the historically best harmonies, storing the historical information of the best harmonies found so far. To date, much research has addressed the parameter adjustment of the algorithm and the design of new harmony generation strategies, but little has addressed how to use all the historical information in the HM to assist the generation of new harmonies. How to exploit the historical information in the HM therefore remains an important and significant research question. Likewise, there is little research on how to use the relationships between the values of different dimensions during the calculation. This research aims to contribute to both aspects.

In this paper, we propose an improved version of the harmony search (HS) algorithm, which also improves upon the intelligent global harmony search (IGHS) algorithm proposed by Valian et al. in 201471. We adopt a trust region and a simple coupling strategy, and name our algorithm the novel intelligent global harmony search (NIGHS) algorithm. The NIGHS algorithm has the following features:

  1. Construct a stable trust region based on all the historical information in the HM. For each variable, the trust region takes the mean value of that variable in the HM as the boundary and the global best harmony as the center.

  2. Apply dynamic Gaussian fine-tuning in the trust region to enhance the local search ability. Gaussian fine-tuning, with a dynamically adjusted step size, of the solution sampled randomly in the trust region improves the convergence speed and accuracy of the algorithm.

  3. Improve the coupling operation between different dimensions. The relationship between the other dimensions and the current dimension is used to adjust the value of the current dimension, which provides large but reasonable fluctuations early in the run and enhances the ability to escape local optima.

The proposed NIGHS is compared with the basic version of harmony search (HS)72, the improved harmony search algorithm (IHS)30, the geometric harmony search algorithm (GHS)73, the novel global harmony search algorithm (NGHS)74, the self-adaptive global best harmony search algorithm (SGHS)20, the intelligent global harmony search algorithm (IGHS) proposed by Valian et al. in 201471, and the intersect mutation global harmony search algorithm (IMGHS)75. These methods are tested and evaluated on the CEC 2017 benchmark functions. Our experimental results reveal that the proposed NIGHS algorithm is superior to the compared HS variants on many benchmark functions and converges more robustly when optimizing objective functions, in terms of both solution accuracy and efficiency.

The remainder of this paper is organized as follows: "Preliminaries" section provides background on harmony search (HS), novel global harmony search (NGHS), and the intelligent global harmony search algorithm (IGHS). "The proposed novel intelligent global harmony search algorithm (NIGHS)" section describes the proposed algorithm. The experimental results and analysis are presented in "Experimental results and analysis" section. Concluding remarks are given in "Conclusions" section.

Preliminaries

Harmony search (HS) algorithm

Harmony search (HS) is a population-based algorithm first introduced by Geem et al. (2001). The HS algorithm is inspired by the improvisation of music players, and its main steps are as follows.

Step 1: Initialization of parameters In this step, we define the values of each HS parameter: the dimension of the variables (N), the lower and upper bounds of the search domain (LB, UB), the harmony memory size (HMS), the harmony memory consideration rate (HMCR), the pitch adjustment rate (PAR), the bandwidth (BW), and the maximum number of iterations (NI).

Step 2: Initialization of the harmony memory The harmony memory (HM) is an HMS × N matrix whose initial rows are HMS vectors randomly generated between the lower and upper bounds (LB, UB) of the problem search domain. The formula is as follows:

$$x_{i,j} = LB_{j} + \left( UB_{j} - LB_{j} \right) \times rand\left[ 0,1 \right].$$
(1)
$$HM = \left[ x_{1} ,x_{2} ,x_{3} , \ldots ,x_{HMS} \right]^{T}$$
(2)

where i = 1, 2, …, HMS and j = 1, 2, …, N.
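As an illustration of Step 2 (our experiments use MATLAB; the sketch below is Python, with scalar bounds lb and ub assumed for brevity, and function names that are ours, not part of HS itself):

```python
import numpy as np

def initialize_hm(lb, ub, hms, n, rng):
    """Step 2 (Eqs. 1-2): sample an HMS x N harmony memory uniformly in [lb, ub]."""
    return lb + (ub - lb) * rng.random((hms, n))

hm = initialize_hm(-100.0, 100.0, hms=5, n=10, rng=np.random.default_rng(0))
```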

Step 3: Improvisation of a new harmony In this step, a new harmony (\(x^{new}\)) is generated based on the harmony memory consideration rate (HMCR), the pitch adjusting rate (PAR) and random choice, as described below:

Step 3.1: Harmony memory consideration rate (HMCR) First, a random number \(r1 \in \left( 0,1 \right)\) is generated and compared with the harmony memory consideration rate (HMCR). If \(r1 < HMCR\), each component of the new harmony is chosen randomly from the corresponding components in the harmony memory (HM) according to

$$x_{j}^{new} = x_{j}^{a}$$
(3)

where \(j = 1,2, \ldots ,N\), and \(x^{a}\) is a randomly chosen harmony vector from the harmony memory (HM).

Step 3.2: Pitch adjusting rate (PAR) A new random number \(r2 \in \left( 0,1 \right)\) is generated, and if \(r2 < PAR\), the component selected in the harmony memory consideration step is fine-tuned according to the bandwidth BW, as follows:

$$x_{j}^{new} = \begin{cases} x_{j}^{new} \pm rand \times BW & if\;r2 < PAR \\ x_{j}^{new} & otherwise \end{cases}$$
(4)

where \({\text{j}} = 1,2, \ldots ,{\text{N}}\).

Step 3.3: Random generation Each component not selected during harmony memory consideration (HMCR) is replaced with a value randomly generated in the search domain, namely:

$$x_{j}^{new} = LB_{j} + \left( UB_{j} - LB_{j} \right) \times rand\left[ 0,1 \right].$$
(5)

where \({\text{j}} = 1,2, \ldots ,{\text{N}}\).

Step 4: Harmony memory (HM) update Once a new harmony has been generated, if the fitness value of the new harmony (\(x^{new}\)) is better than that of the worst harmony (\(x^{worst}\)) in HM, the worst harmony in HM is replaced by the new harmony.

Step 5: Check the termination criterion If the number of the current iteration (t) is less than the maximum number of iterations (NI), then Steps 3 and 4 are repeated. Otherwise, the optimization process stops.

The HS algorithm is shown in Algorithm 1.
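Since Algorithm 1 is reproduced only as an image in the source, the following minimal Python sketch of the HS loop is included for reference. It assumes scalar bounds and a minimization objective f; it is an illustration of the steps above, not our MATLAB implementation.

```python
import numpy as np

def hs(f, lb, ub, n, hms=5, hmcr=0.9, par=0.3, bw=0.01, ni=10000, seed=0):
    """Minimal sketch of basic HS (Algorithm 1) for a minimization objective f."""
    rng = np.random.default_rng(seed)
    hm = lb + (ub - lb) * rng.random((hms, n))     # Step 2: initialize HM
    fit = np.apply_along_axis(f, 1, hm)
    for _ in range(ni):                            # Steps 3-5
        new = np.empty(n)
        for j in range(n):
            if rng.random() < hmcr:                # Step 3.1: memory consideration, Eq. (3)
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:             # Step 3.2: pitch adjustment, Eq. (4)
                    new[j] += (2 * rng.random() - 1) * bw
            else:                                  # Step 3.3: random generation, Eq. (5)
                new[j] = lb + (ub - lb) * rng.random()
        new = np.clip(new, lb, ub)
        worst = np.argmax(fit)                     # Step 4: replace worst if improved
        f_new = f(new)
        if f_new < fit[worst]:
            hm[worst], fit[worst] = new, f_new
    return hm[np.argmin(fit)], fit.min()

best_x, best_f = hs(lambda x: np.sum(x**2), -100.0, 100.0, n=10)
```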


Novel global harmony search (NGHS)

The novel global harmony search (NGHS) algorithm74 is a global variant of harmony search (HS) in which a genetic mutation operator is applied after position updating, in order to escape from local minima. It modifies the improvisation step of HS, replacing HMCR and PAR with a genetic mutation probability \(P_{m}\). The main steps of the NGHS algorithm are shown in Algorithm 2.
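Algorithm 2 is likewise only an image in the source; a rough Python sketch of the NGHS improvisation step, based on the description above and on Zou et al.'s formulation74 (scalar bounds assumed), is:

```python
import numpy as np

def nghs_improvise(hm, fit, lb, ub, pm, rng):
    """Sketch of one NGHS improvisation: position updating toward the best
    harmony, plus genetic mutation with probability pm (replacing HMCR/PAR)."""
    best, worst = hm[np.argmin(fit)], hm[np.argmax(fit)]
    new = np.empty(hm.shape[1])
    for j in range(hm.shape[1]):
        x_r = np.clip(2.0 * best[j] - worst[j], lb, ub)      # adaptive step region
        new[j] = worst[j] + rng.random() * (x_r - worst[j])  # position updating
        if rng.random() < pm:                                # genetic mutation
            new[j] = lb + (ub - lb) * rng.random()
    return new
```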


Intelligent global harmony search (IGHS)

The intelligent global harmony search (IGHS) algorithm71 is a variant of novel global harmony search (NGHS). It modifies the improvisation step of the NGHS in such a way that the new harmony imitates one dimension of the best harmony in the HM. The main steps of IGHS algorithm are shown in Algorithm 3.


The proposed novel intelligent global harmony search algorithm (NIGHS)

In this section, we introduce an improved version of the HS algorithm, namely the novel intelligent global harmony search algorithm (NIGHS). The main improvements in NIGHS are the stable trust region and the improved coupling operation.

Stable trust region

According to the HS algorithm, each variable of the new harmony in each generation is generated either by fine-tuning, with a certain probability, the corresponding variable of some harmony in the HM, or by random generation over the whole search domain. The multiple harmonies in the HM provide diversity for the generation of new harmonies. In the NGHS algorithm, the original selection operation is replaced by position updating around the global best harmony. The IGHS algorithm adds a coupling operation between variables of different dimensions (Algorithm 3, Line 7) on top of the global adjustment strategy of NGHS, which increases population diversity. However, the adjustment range around the global best harmony is determined only by the best and worst harmonies in the HM, so the trust region is unstable and convergence is poor. In the IGHS algorithm, the width of the trust region easily collapses to zero, which stalls the iterative search, as shown in Fig. 1.

Figure 1. The width of the trust region in IGHS.

Therefore, how to build a stable trust region is the key to improving the optimization ability of the IGHS algorithm. As the HM is iteratively updated, its harmonies gradually approach the global best harmony; the iterative change of the mean fitness of the HM is shown in Fig. 2a. It is therefore natural to construct a stable trust region from the mean and the best harmony of the HM. The trust region is constructed as follows:

$$x_{j}^{mean} = \frac{\sum_{i = 1}^{HMS} x_{i,j} }{HMS}$$
(6)
$$x_{R} = 2 \times x_{j}^{best} - x_{j}^{mean}$$
(7)
Figure 2. (a) The value of the mean of the HM in HS; (b) the width of the trust region in NIGHS.

Here, HMS represents the size of the HM, and \(x_{j}^{mean}\) represents the average value of variable j in the HM. If \(x_{R}\) exceeds the range [LB, UB], it is clipped to the boundary. The schematic diagram of the trust region is shown in Fig. 3: the trust region is centered on the best harmony and its width is twice the distance between \(x_{j}^{mean}\) and \(x_{j}^{best}\), provided it does not exceed the search range. In the early stage of the algorithm the trust region is wide, so the algorithm can search globally and avoid premature convergence. As the iterations progress, the trust region gradually shrinks, the convergence speed increases, and the solution accuracy becomes higher and higher. The width of the trust region over the iterations is shown in Fig. 2b.
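A minimal sketch of this trust-region construction (Eqs. 6 and 7 with the boundary treatment; illustrative Python, scalar bounds assumed):

```python
import numpy as np

def stable_trust_region(hm, fit, lb, ub):
    """Per-dimension stable trust region: bounded by the column mean (Eq. 6)
    and centred on the global best harmony (Eq. 7), clipped to [lb, ub]."""
    best = hm[np.argmin(fit)]                  # global best harmony in HM
    mean = hm.mean(axis=0)                     # Eq. (6): per-dimension mean
    x_r = np.clip(2.0 * best - mean, lb, ub)   # Eq. (7) with boundary treatment
    return mean, x_r                           # region spans [mean, x_r] per dimension
```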

Figure 3. The schematic diagram of position updating.

To enhance the local search ability, we apply Gaussian fine-tuning to the value obtained by a random search in the trust region. The principle of the process is shown in Fig. 4, and the dynamic change of BW is shown in Fig. 5. The specific steps are as follows:

$$x_{j}^{new} = x_{j}^{mean} + \left( x_{R} - x_{j}^{mean} \right) \times rand\left[ 0,1 \right]$$
(8)
$$BW = BW_{max} \times \exp\left( c \times \frac{t}{NI} \right)$$
(9)
$$c = \ln\left( \frac{BW_{min} }{BW_{max} } \right)$$
(10)
$$x_{j}^{new} = x_{j}^{new} + Gaussian\left( 0,1 \right) \times BW$$
(11)

where \(Gaussian\left( 0,1 \right)\) denotes a Gaussian random number with mean 0 and standard deviation 1.
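The random trust-region sample and Gaussian refinement of Eqs. (8)–(11) can be sketched per dimension as follows (illustrative Python; the function name is ours):

```python
import numpy as np

def trust_region_sample(mean_j, xr_j, t, ni, bw_min, bw_max, rng):
    """Sample in the trust region (Eq. 8), then apply Gaussian fine-tuning
    with an exponentially shrinking bandwidth (Eqs. 9-11)."""
    x = mean_j + (xr_j - mean_j) * rng.random()   # Eq. (8): random point in region
    c = np.log(bw_min / bw_max)                   # Eq. (10): c < 0
    bw = bw_max * np.exp(c * t / ni)              # Eq. (9): BW decays from BWmax to BWmin
    return x + rng.standard_normal() * bw         # Eq. (11): Gaussian fine-tuning
```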

Figure 4. The schematic diagram of the secondary search.

Figure 5. The dynamic change of BW.

Improved coupling operation

In the IGHS algorithm, a simple coupling operation, \(x_{j}^{new} = x_{k}^{best}\), is adopted: the value of a randomly selected dimension k replaces the value of dimension j in the best harmony. This strategy works well when all dimensions of the problem's optimal solution take the same value, but in practical engineering problems the optimal values of the dimensions differ. We therefore improved the strategy:

$$x_{j}^{new} = \left( {0.6*\frac{{x_{j}^{best} }}{{x_{k}^{best} }} + 0.4*\frac{{x_{j}^{worst} }}{{x_{k}^{worst} }}} \right)*x_{k}^{mean}$$
(12)

We combine, with fixed weights, the linear proportional relationships between any two dimensions of the best harmony and of the worst harmony to form a new linear proportional relationship, and generate the new value from the mean harmony according to this relationship. Rather than the simple direct replacement used in IGHS, this strategy generates a reasonable new value from the linear relationship between dimensions, which remedies the shortcoming of the IGHS coupling operation noted above. So that the algorithm applies this operation at the right time during the run, we dynamically adjust HMCR and PAR:

$$HMCR = 0.85 + 0.3 \times \sqrt{\frac{t - 1}{NI - 1} \times \left( 1 - \frac{t - 1}{NI - 1} \right)}$$
(13)
$$PAR = PAR_{max} - \left( PAR_{max} - PAR_{min} \right) \times \frac{t}{NI}$$
(14)

The dynamic tuning of HMCR and PAR allows the algorithm to select the appropriate operation with high probability at the appropriate time. The probabilities of each operation over time are plotted in Fig. 6, where strategy 1 is the improved coupling operation, strategy 2 is the stable trust region search, and strategy 3 is the global random search.
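For illustration, Eqs. (12)–(14) translate to the following Python sketch (note that Eq. (12) presumes nonzero values in the reference dimension k; function names are ours):

```python
import numpy as np

def coupled_value(best, worst, mean, j, k):
    """Improved coupling (Eq. 12): generate dimension j from dimension k via a
    weighted linear ratio taken from the best and worst harmonies.
    Assumes best[k] and worst[k] are nonzero."""
    return (0.6 * best[j] / best[k] + 0.4 * worst[j] / worst[k]) * mean[k]

def dynamic_hmcr_par(t, ni, par_min=0.1, par_max=0.9):
    """Dynamic parameter schedules (Eqs. 13-14) for iteration t of ni."""
    u = (t - 1) / (ni - 1)
    hmcr = 0.85 + 0.3 * np.sqrt(u * (1.0 - u))    # Eq. (13): peaks at 1.0 mid-run
    par = par_max - (par_max - par_min) * t / ni  # Eq. (14): decreases linearly
    return hmcr, par
```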

Figure 6. The probability change for each strategy.

In the early stage of the run, strategy 1 has the highest probability: a wide-ranging search is carried out through the coupling operation to avoid premature convergence, and this probability then gradually decreases. The probability of strategy 2 gradually increases as the algorithm runs, accelerating convergence and improving accuracy through the stable trust region search and Gaussian fine-tuning. Meanwhile, the global random search of strategy 3 retains a certain probability in the late stage of the algorithm, providing opportunities to escape from local optima and find better solutions.

The steps of the NIGHS algorithm

The complete steps of NIGHS are as follows:

Step 1: Initialization of NIGHS parameters In this step, the NIGHS algorithm parameters are defined, such as the number of decision variables (N), the lower and upper bounds (LB, UB), the harmony memory size (HMS), the harmony memory consideration rate (HMCR), the pitch adjusting rate (PAR), and the maximum number of iterations (NI).

Step 2: Initialization of the harmony memory (HM) The harmony memory is randomly initialized between the lower and upper bounds (LB, UB) of the optimization problem according to Eqs. (1) and (2).

Step 3: Update HMCR, PAR and BW In this step, we update HMCR, PAR and BW through Eqs. (9), (10), (13) and (14).

Step 4: Improvisation of a new harmony In this step, the new harmony (\(x^{new}\)) is created by means of Algorithm 4 (Lines 6–21). We replace \(x^{worst}\) with \(x^{mean}\), generate the trust region between \(x_{R}\) and \(x^{mean}\) (Lines 6–14), and add the Gaussian fine-tuning strategy (Line 15).
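Since Algorithm 4 is also reproduced only as an image, the improvisation step is sketched below in Python under our reading of Fig. 6: the coupling operation is taken with probability PAR inside the memory-consideration branch, the trust-region search otherwise, and the global random search with probability 1 − HMCR. Scalar bounds are assumed, and the branch assignment is an assumption consistent with the strategy probabilities described above.

```python
import numpy as np

def nighs_improvise(hm, fit, lb, ub, t, ni, bw_min, bw_max, rng):
    """Sketch of the NIGHS improvisation step (Algorithm 4, Lines 6-21)."""
    n = hm.shape[1]
    best = hm[np.argmin(fit)]                 # global best harmony
    worst = hm[np.argmax(fit)]                # global worst harmony
    mean = hm.mean(axis=0)                    # per-dimension mean, Eq. (6)
    u = (t - 1) / (ni - 1)
    hmcr = 0.85 + 0.3 * np.sqrt(u * (1 - u))  # Eq. (13)
    par = 0.9 - (0.9 - 0.1) * t / ni          # Eq. (14) with PARmax=0.9, PARmin=0.1
    bw = bw_max * np.exp(np.log(bw_min / bw_max) * t / ni)  # Eqs. (9)-(10)
    new = np.empty(n)
    for j in range(n):
        if rng.random() < hmcr:
            if rng.random() < par:            # strategy 1: improved coupling, Eq. (12)
                k = rng.integers(n)           # random reference dimension (may equal j)
                new[j] = (0.6 * best[j] / best[k]
                          + 0.4 * worst[j] / worst[k]) * mean[k]
            else:                             # strategy 2: stable trust-region search
                xr = np.clip(2 * best[j] - mean[j], lb, ub)       # Eq. (7)
                new[j] = mean[j] + (xr - mean[j]) * rng.random()  # Eq. (8)
                new[j] += rng.standard_normal() * bw              # Eq. (11)
        else:                                 # strategy 3: global random search, Eq. (5)
            new[j] = lb + (ub - lb) * rng.random()
    return np.clip(new, lb, ub)
```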

Step 5: Harmony memory (HM) update If the fitness value of the new harmony (\(x^{new}\)) is better than that of the worst harmony (\(x^{worst}\)) in HM, the worst harmony in HM is replaced by the new harmony.

Step 6: Check the termination criterion If the number of the current iteration (t) is less than the maximum number of iterations (NI), then Steps 3 and 4 are repeated. Otherwise, the optimization process stops.

The NIGHS algorithm is shown in Algorithm 4, and the flow chart of the NIGHS algorithm is given in Fig. 7.

Figure 7. The overall flow chart of the NIGHS algorithm.

Experimental results and analysis

Computing environment and parameter settings

In this research work, we use MATLAB to implement the program that searches for trial solutions. The experiments were run on an Intel Core(TM) i5-10300H (2.50 GHz) CPU with 16 GB of memory under Windows 10 Home Edition (64-bit). The settings of the compared algorithms are taken from Geem et al.72, Mahdavi et al.30, Degertekin et al.73, Zou et al.74, Pan et al.20, Valian et al.71 and Gholam75, and are as follows:

HS: HMS = 5, HMCR = 0.9, PAR = 0.3, BW = 0.01;

IHS: HMS = 5, HMCR = 0.9, \({\text{PAR}}_{{{\text{min}}}} = 0.01,{\text{PAR}}_{{{\text{max}}}} = 0.99\),\({\text{BW}}_{{{\text{min}}}} = 0.0001\),\({\text{BW}}_{{{\text{max}}}} = \frac{{{\text{x}}_{{\text{u}}} - {\text{x}}_{{\text{l}}} }}{20}\);

GHS: HMS = 5, HMCR = 0.9, \({\text{PAR}}_{{{\text{min}}}} = 0.01,{\text{ PAR}}_{{{\text{max}}}} = 0.99\);

SGHS: HMS = 5, HMCRm = 0.98, PARm = 0.9, \({\text{BW}}_{{{\text{min}}}} = 0.0005\), \({\text{BW}}_{{{\text{max}}}} = \frac{{{\text{x}}_{{\text{u}}} - {\text{x}}_{{\text{l}}} }}{10}\),LP = 100;

NGHS: HMS = 5, \({\text{P}}_{{\text{m}}} = 0.005;\)

IGHS: HMS = 5, HMCR = 0.9950, PAR = 0.4;

IMGHS: HMS = 5, HMCR = 0.9, PAR = 0.3, BW = 0.01,\({\text{P}}_{{\text{m}}} = 0.005\), \({\upmu }1 = 0.7\), \({\upmu }2 = 0.3\);

NIGHS: HMS = 5, \({\text{PAR}}_{{{\text{min}}}} = 0.1,{\text{PAR}}_{{{\text{max}}}} = 0.9\), \({\text{BW}}_{{{\text{min}}}} = 0.0001\), \({\text{BW}}_{{{\text{max}}}} = \frac{{{\text{x}}_{{\text{u}}} - {\text{x}}_{{\text{l}}} }}{20}\).

Benchmark optimization problems

The CEC2017 test set contains 30 numerical minimization test functions, divided into four groups: the first group contains three unimodal functions (f1–f3), the second seven simple multimodal functions (f4–f10), the third ten hybrid functions (f11–f20), and the fourth ten composition functions (f21–f30). The global optimum of each function is randomly shifted within \(\left[ { - 80,80} \right]^{{\text{D}}}\) and given a different rotation matrix. Considering that in real-world problems it is rare for all variables to be correlated, CEC2017 randomly divides the variables into subcomponents. The rotation matrix of each subcomponent is generated from standard normally distributed entries by Gram-Schmidt orthonormalization with a condition number c equal to 1 or 2. Table 1 summarizes these functions and their features; for details, see76.

Table 1 Benchmark functions of CEC2017.

We conducted 51 independent runs of each algorithm on the CEC 2017 benchmark functions, with the dimension set to 10 (10D), 30 (30D) and 50 (50D), where the search range of all test functions is [− 100, 100]. According to the benchmark rules, the maximum number of function evaluations is set to 10,000 × D, and an error value is recorded as zero if it is less than 10−8. Finally, we calculate the best value, worst value, mean value and standard deviation (SD) of the obtained data and tabulate them.
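The reported statistics can be reproduced from the 51 per-run error values with a small helper (illustrative Python; the 10−8 zeroing follows the benchmark rule above):

```python
import numpy as np

def summarize(errors):
    """Best, worst, mean and SD of run errors; errors below 1e-8 count as zero."""
    e = np.asarray(errors, dtype=float)
    e[e < 1e-8] = 0.0
    return e.min(), e.max(), e.mean(), e.std()
```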


Experimental results and analysis

According to the well-known No Free Lunch (NFL) theorem77, it is impossible to design an algorithm that outperforms all other algorithms on all problems. Hence, we do not expect the proposed NIGHS algorithm to be better than all other algorithms on all problems. In this paper, the CEC 2017 benchmark functions are selected to test whether the NIGHS algorithm performs better than the other algorithms on most of the functions.

In this paper, the experiments are conducted with dimension sizes of 10, 30 and 50. In the tables below, "+" means that an algorithm performs better than NIGHS, "−" means that it performs worse than NIGHS, and "≈" means that it performs similarly to NIGHS, that is, the difference between the two is within 0.1 at the same order of magnitude. The best results are highlighted in bold. Each algorithm is run 51 times for each dimension size, with the maximum number of iterations set to 100,000, 300,000 and 500,000, respectively. To facilitate uniform comparison, we subtract the corresponding optimal value from each result of each function, so that the optimum is normalized to zero. The best value, worst value, mean value and standard deviation (SD) of the 51 runs are reported in Tables 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 and 13. Tables 2, 3, 4 and 5 present the comparison results for the best value, worst value, mean value and standard deviation (SD) with dimension size 10, respectively; Tables 6, 7, 8 and 9 present the results with dimension size 30; and Tables 10, 11, 12 and 13 present the results with dimension size 50.

Table 2 The comparison results of the algorithms with D = 10 on the best value.
Table 3 The comparison results of the algorithms with D = 10 on the worst value.
Table 4 The comparison results of the algorithms with D = 10 on the mean value.
Table 5 The comparison results of the algorithms with D = 10 on the standard deviation (SD).
Table 6 The comparison results of the algorithms with D = 30 on the best value.
Table 7 The comparison results of the algorithms with D = 30 on the worst value.
Table 8 The comparison results of the algorithms with D = 30 on the mean.
Table 9 The comparison results of the algorithms with D = 30 on the standard deviation(SD).
Table 10 The comparison results of the algorithms with D = 50 on the best value.
Table 11 The comparison results of the algorithms with D = 50 on the worst value.
Table 12 The comparison results of the algorithms with D = 50 on the mean value.
Table 13 The comparison results of the algorithms with D = 50 on the standard deviation (SD).

In this experiment, NIGHS was compared with HS, IHS, GHS, SGHS, NGHS and IMGHS, as well as with IGHS. Since the NIGHS algorithm aims to improve the trust region of the IGHS algorithm to achieve a better balance between the exploration and exploitation performance of the algorithm, the comparison with IGHS is particularly relevant. We therefore first compare the experimental results of NIGHS with those of HS, IHS, GHS, SGHS, NGHS and IMGHS, and then compare NIGHS with IGHS, so that the improvement of NIGHS can be highlighted as meaningful and progressive.

As for the HS, IHS, GHS, SGHS, NGHS and IMGHS algorithms, Tables 2, 3, 4 and 5 show that the NIGHS algorithm is significantly better than them on the 30 benchmark functions with dimension size 10. Since the algorithm is based on a probabilistic model, the average of the 51 runs is the most convincing statistic. For the mean values at D = 10, Table 4 reveals the following results:

  1. Compared with the HS algorithm, NIGHS obtains better results than HS on 20 of the 30 test functions; the remaining 10 results are slightly worse than HS.

  2. Compared with the IHS algorithm, NIGHS is better than IHS on only 15 of the 30 functions, but of the remaining 15 functions its results are similar to those of IHS on 7, so NIGHS is slightly better than IHS overall.

  3. Compared with the GHS algorithm, NIGHS is superior to GHS on 22 functions and significantly worse than GHS on F18.

  4. Compared with the SGHS algorithm, NIGHS is better than SGHS on 27 functions, slightly worse than SGHS on F21 and F24, and worse than SGHS on F14.

  5. NIGHS is superior to both the NGHS and IMGHS algorithms on 28 functions and slightly inferior to them on F18, F25 and F28.

On the whole, the experimental results of NIGHS at D = 10 are superior to those of the six algorithms. As the dimension increases to D = 30, the NIGHS algorithm outperforms HS, IHS, GHS, SGHS, NGHS and IMGHS on even more functions: 27, 25, 30, 30, 29 and 26 functions, respectively. At D = 50, NIGHS outperforms them on 19, 20, 30, 30, 26 and 19 functions, respectively. Although these counts decrease at D = 50, NIGHS is still better than the six algorithms overall. This is because, early in the computation, NIGHS applies the improved coupling operation with high probability, searching over a wide range to avoid premature convergence, and later applies the stable trust region search and Gaussian fine-tuning with high probability, accelerating convergence and improving accuracy. Both IHS and NIGHS use the same dynamic pitch bandwidth, so when the dimension is low and the problem less complex, fine-tuning alone suffices to avoid local optima, and NIGHS is only slightly better than IHS at D = 10. As the dimension increases, NIGHS becomes significantly better than IHS at D = 30 and 50, because the improved coupling operation of NIGHS is more random in higher dimensions, helping the algorithm search over a large range in the early stage and avoid falling into local optima.

The robustness of an algorithm is often demonstrated by comparing the standard deviation (SD) of experimental results. From Tables 5, 9 and 13 we can see that when D = 10, NIGHS is better than HS, IHS, GHS, SGHS, NGHS and IMGHS on 20, 15, 19, 23, 28 and 29 functions, respectively. When D = 30, NIGHS has smaller standard deviation on 20, 20, 27, 27, 27 and 24 functions; when D = 50, on 18, 20, 26, 27, 24 and 20 functions. NIGHS performs poorly mainly on F10, F21 and F28 when D = 10; on F10, F26 and F28 when D = 30; and on F10 and F22 when D = 50. Overall, NIGHS is significantly more robust than GHS, SGHS, NGHS and IMGHS, and more robust than HS. NIGHS and IHS differ little in robustness at low dimensions, but as the dimension increases NIGHS still outperforms IHS.

Comparing the best and worst values over the 51 runs of each algorithm helps to analyze the upper and lower limits of an algorithm's optimization ability. From Tables 2, 3, 6, 7, 10 and 11 we can see that NIGHS is significantly superior to GHS, SGHS, NGHS and IMGHS, superior to HS, and slightly superior to IHS when D = 10. NIGHS is significantly superior to all six algorithms when D = 30. At D = 50 the advantage of NIGHS declines, but it is still superior to the six algorithms. Searching in the stable trust region together with dynamic Gaussian fine-tuning raises the upper limit of the algorithm's ability, while the improved coupling operation improves its ability to escape local optima and raises the lower limit of its ability.

Now let us consider the comparison between IGHS and NIGHS. First, Tables 4, 8 and 12 show that NIGHS obtains better mean values than IGHS on 28, 30 and 30 functions when D = 10, 30 and 50, respectively; NIGHS outperforms IGHS on almost the entire test set. For robustness, Tables 5, 9 and 13 show that NIGHS has smaller standard deviation (SD) than IGHS on 26, 26 and 24 functions for D = 10, 30 and 50, respectively, so NIGHS is more robust than IGHS in all dimensions. As for the upper and lower limits of optimization ability, we compared the best and worst values of the 51 separate runs: Tables 2, 6 and 10 show that NIGHS is superior to IGHS on 27, 30 and 30 functions for the best value, and Tables 3, 7 and 11 show that it is superior on 28, 29 and 29 functions for the worst value, at D = 10, 30 and 50, respectively. From the results on the CEC2017 test set we can conclude that the optimization ability and robustness of NIGHS are much better than those of IGHS. This is because NIGHS improves the coupling operation between different dimensions within the IGHS framework, so that an operation that was limited and lacked rationality now also applies to the complex shifted and rotated functions, and because the original trust region search based on the worst and best harmonies is replaced by a stable trust region built from the mean harmony, with a dynamic Gaussian fine-tuning strategy added to improve the convergence speed and accuracy of the algorithm.

In sum, according to Tables 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 and 13, NIGHS has obvious advantages over the comparison algorithms. It is superior to HS in exploration ability and robustness, and has better optimization performance than IGHS in both low and high dimensions.

Further analysis

To analyze the search performance and convergence of the algorithms more intuitively, we selected one unimodal test function (F2) and three simple multimodal test functions (F4, F6 and F7) and plotted the iteration curves of the eight algorithms on them. The three-dimensional surfaces of these four functions in dimension 2 are shown in Fig. 8. The iteration curves for variable dimensions 10, 30 and 50 are shown in Fig. 9, where the abscissa is the number of iterations and the ordinate is the fitness value of the best solution found.

Figure 8. The graphs of F2, F4, F6 and F7 with dimension size of 2.

Figure 9. The iteration graphs of the comparison algorithms.

Figure 9 contains 12 graphs: the iteration curves of the eight algorithms on F2, F4, F6 and F7 with dimension size 10, 30 and 50, respectively. The convergence of an algorithm can be examined through the iteration curve of its historical best solution. In Fig. 9 we can see that, from the first generation on, the fitness value of the historical best solution converges toward the corresponding global optimum as the iterations progress. On F2, F4, F6 and F7, the historical best solution of NIGHS continually approaches the global optimum of 0. This convergence behavior has been used in much of the literature to demonstrate that an algorithm has robust convergence78,79.

From Fig. 8 we can see that the unimodal function F2 has no local optima, so the behavior of an algorithm on it reflects its convergence ability. According to the iteration curves of F2 at D = 10, 30 and 50 in Fig. 9, the final accuracy of NIGHS is the best among the eight algorithms. IHS is slightly less accurate than NIGHS, but both are better than the other six; this is because both IHS and NIGHS employ a fine-tuning strategy with a convergent BW. NIGHS is slightly better than IHS because it uses Gaussian fine-tuning, which is more concentrated than the uniform random fine-tuning in (0, 1] used by IHS and therefore gives NIGHS higher convergence accuracy.

For the simple multimodal functions F4, F6 and F7, Fig. 9 shows that the final results of NIGHS on these three functions are superior to those of the other seven algorithms for D = 10, 30 and 50. At the same time, we can observe that, facing the local optima of multimodal problems, HS, IHS, GHS, SGHS, NGHS, IGHS and IMGHS easily converge to a local optimum early in the calculation and then stall at that value, after which most runs cannot find a better solution. Observing the performance of NIGHS on F7 at D = 10, 30 and 50, NIGHS maintains a comparatively slow search early in the computation rather than converging prematurely, and converges only from the middle of the calculation onward. This is because NIGHS adjusts HMCR and PAR on the fly, exploiting the characteristics of the three strategies and adopting the appropriate one at different times: early in the calculation, NIGHS mainly adopts the improved coupling operation, so that the algorithm searches globally within the convergence domain, and then gradually shifts to local exploitation and local convergence.

The conclusions of experimental results

The purpose of this study is to improve the trust region and coupling operation of the IGHS algorithm: to construct a new trust region with good stability and convergence, and a more reasonable coupling operation. By adaptively adjusting HMCR and PAR, the algorithm can apply each strategy at the appropriate stage, better balancing its exploration and exploitation performance. The iteration curves of the four selected functions show that, when solving basic multimodal problems, NIGHS maintains a strong global search ability early in the computation compared with the other algorithms, restraining exploitation and avoiding premature convergence to a local optimum. This indicates that NIGHS can adaptively adjust its exploration and exploitation abilities during the search to avoid premature convergence. At the same time, on the selected CEC2017 test functions, comparing the proposed NIGHS algorithm with the NGHS algorithm, which also constructs a trust region, shows that the convergence speed and calculation accuracy of NIGHS are better than those of NGHS on most problems. This indicates that dynamic Gaussian fine-tuning within a stable trust region accelerates convergence and improves optimization accuracy. Although the NIGHS algorithm proposed in this paper may not be the best choice for every problem, it basically meets our requirements; overall, for single-objective unconstrained optimization problems, NIGHS is a relatively good choice.

Conclusions

In this paper, we proposed a novel intelligent global harmony search algorithm, which replaces the trust region of the previous algorithm, built from the best and worst harmonies, with a more stable trust region with better convergence, and which modifies the simple coupling operation of IGHS based on the linear proportional relationship between different dimensions. The proposed NIGHS algorithm was tested and evaluated on the CEC2017 test function set and compared with several other algorithms, including the basic version of harmony search, the improved harmony search algorithm, the geometric harmony search algorithm, the novel global harmony search algorithm, the self-adaptive global best harmony search algorithm, the intelligent global harmony search algorithm, and the intersect mutation global harmony search algorithm.

Our experimental results demonstrate that the proposed algorithm outperforms the HS and IGHS algorithms in terms of solution accuracy and efficiency, and converges robustly. This improved efficiency is attributed to the stable trust region and improved coupling operation, which strike a balance between the exploration and exploitation performance of the algorithm; in addition, the dynamic Gaussian fine-tuning improves the solution accuracy. We believe that our research has significance and rationality for the application of the coupling operation, and that normalizing the variables of each dimension, unifying their values, or reducing the differences between them can help in establishing the mathematical model of practical problems.

Even though different solutions may have different physical meanings, we posit that effective solutions can still be obtained by treating the search of the data space as a spatial process, and that the boundaries of different dimensions have common parts. If the differences between dimensions remain large after normalization, this can be addressed by reducing PAR. Furthermore, the proposed NIGHS algorithm is easy to implement and provides a good choice for solving complex global optimization problems.

In the future research, we will focus on further practical engineering applications of our proposed algorithm in fields such as power systems, image processing, network optimization, and high-order SNP interaction detection. This can shed more light on the algorithm and its potential for solving real-world problems.