Abstract
Backbreak (BB) is one of the serious adverse blasting consequences in open-pit mines, because it frequently reduces economic benefits and seriously affects the safety of mines. Therefore, rapid and accurate prediction of BB is of great significance to mine blasting design and other production activities. For this purpose, six different swarm intelligence optimization (SIO) algorithms were proposed to optimize the extreme learning machine (ELM) model for BB prediction, i.e., ELM-based particle swarm optimization (ELM–PSO), ELM-based fruit fly optimization (ELM–FOA), ELM-based whale optimization algorithm (ELM–WOA), ELM-based lion swarm optimization (ELM–LSO), ELM-based seagull optimization algorithm (ELM–SOA) and ELM-based sparrow search algorithm (ELM–SSA). In total, 234 data records from blasting operations in the Sungun mine in Iran were used in this study, including six input parameters (special drilling, spacing, burden, hole length, stemming, powder factor) and one output parameter (i.e., BB). To evaluate the predictive performance of the different optimization models and initial models, six performance indicators, including the root mean square error (RMSE), Pearson correlation coefficient (R), determination coefficient (R2), variance accounted for (VAF), mean absolute error (MAE) and sum of square error (SSE), were used to evaluate the models in the training and testing phases. The results show that ELM–LSO was the best model to predict BB, with RMSE of 0.1129 (R: 0.9991, R2: 0.9981, VAF: 99.8135%, MAE: 0.0706 and SSE: 2.0917) in the training phase and 0.2441 in the testing phase (R: 0.9949, R2: 0.9891, VAF: 98.9806%, MAE: 0.1669 and SSE: 4.1710). Hence, ELM techniques combined with SIO algorithms are an effective method to predict BB.
Introduction
Blasting has been regarded as the main effective technique for rock excavation in open pit and underground mines (Bhandari 1997; Armaghani et al. 2016; Wang et al., 2018a, b; Huo et al., 2020; Du et al., 2022). Nevertheless, only a small amount (20–30%) of the explosive energy is used to break rock with current blasting techniques, and most (70–80%) of the explosive energy is lost with varying degrees of adverse consequences (Fig. 1) such as flyrock, backbreak and air blast (Agrawal and Mishra 2018; Uyar and Aksoy 2019; Fang et al., 2020; Fattahi and Hasanipanah 2021; Ramesh et al., 2021; Ye et al., 2021; Zhou et al., 2021a, b, c; Dai et al., 2022). Among these unwanted consequences, backbreak (BB) is one of the continuous and focused concerns of blasting engineers and scholars (Khandelwal and Monjezi 2013; Shirani et al., 2016). BB is defined as the damaged rocks beyond the limits of the last row of holes (Konya and Walter 1991; Jimeno et al., 1995). Different studies have investigated various parameters associated with BB, including controllable and uncontrollable blasting parameters (Konya and Walter 1991; Bhandari 1997; Konya 2003; Monjezi and Dehghani 2008; Monjezi et al., 2010a, b). Controllable parameters include blast design parameters and explosive properties such as burden (B), spacing (S), stemming (ST), subdrilling (SU), blasthole length (BL), blasthole diameter (BD), stiffness ratio (SR), explosive type, explosive density, explosive strength, powder factor (PF) and coupling ratio (Sari et al., 2014; Ghasemi 2017). Konya (2003) found that BB is positively correlated with ST and B. Monjezi and Dehghani (2008) considered that ST/B, PF, charge per delay and other parameters have the greatest influence on BB. Monjezi et al. (2010a) reported the different influences of ST, B, S and depth of the hole (DH) on BB. Sari et al. (2014) showed that reducing explosive strength and PF can effectively reduce BB.
Several researchers reported the effects of different explosive materials and coupling ratios on BB (Wilson and Moxon 1988; Firouzadj et al., 2006; Iverson et al., 2009; Enayatollahi and AghajaniBazzazi 2010). Uncontrollable blasting parameters refer to the physical and mechanical properties of rock masses, such as rock density, rock porosity, rock strength, discontinuities orientation, and discontinuities strength (Ghasemi 2017). Bhandari and Badal (1990) considered the relationship between the orientation of discontinuities and BB. Bhandari (1997) showed the effects of rock mass characteristics on BB. Jia et al. (1998) concluded from numerical simulation results that joints with a dip angle were one of the main causes of BB.
Because it is too difficult to evaluate and predict BB quickly and correctly based on the various influencing parameters, several scholars proposed empirical or regression models to predict BB by considering different input variables from controllable or uncontrollable blasting parameters (Lundborg 1974; Roth 1979; Monjezi et al., 2010a, b; Esmaeili et al., 2014). Nevertheless, only some of the valid parameters were considered in the empirical formulas, which lack updates to new data (Saghatforoush et al., 2016; Kumar et al., 2021). In recent years, artificial intelligence (AI) technologies have been used widely in civil and mining engineering to solve forecasting problems (Zhou et al., 2012, 2016, 2019, 2022a, b; Khandelwal and Singh, 2011; Khandelwal et al., 2017, 2018; Nguyen et al., 2020; Armaghani et al., 2020, 2021; Wang et al., 2021; Li et al., 2021a, b). Several researchers have adopted different AI technologies to predict BB, including, among others, artificial neural network (ANN) (Monjezi et al., 2013, 2014; Esmaeili et al., 2014), back propagation neural network (BPNN) (Sayadi et al., 2013), support vector machine (SVM) (Khandelwal and Monjezi 2013; Mohammadnejad et al., 2013; Yu et al., 2021), adaptive neuro-fuzzy inference system (ANFIS) (Esmaeili et al., 2014; Ghasemi et al., 2016), and random forest (RF) (Sharma et al., 2021; Zhou et al., 2021c; Dai et al., 2022). Nonetheless, most single AI algorithms are prone to falling into local minima and have low learning rates, particularly ANN, SVM, and ANFIS (Wang et al., 2004; Moayedi and Jahed Armaghani 2018; Ghaleini et al., 2019). The extreme learning machine (ELM) proposed by Huang et al. (2006) has been proved superior to ANN and SVM in solving prediction problems (Shariati et al., 2020).
Meanwhile, swarm intelligence optimization (SIO) algorithms based on the biological behavior of natural populations have been used widely to optimize single AI algorithms to improve the performance of the model for BB prediction (Ebrahimi et al., 2016; Saghatforoush et al., 2016; Ghasemi 2017; Hasanipanah et al., 2017; Eskandar et al., 2018; Zhou et al., 2021c; Bhatawdekar et al., 2021; Dai et al., 2022).
Therefore, this study aimed to develop novel hybrid ELM models by using six SIO algorithms to predict BB in an open pit, i.e., ELM-based particle swarm optimization (ELM–PSO), ELM-based fruit fly optimization (ELM–FOA), ELM-based whale optimization algorithm (ELM–WOA), ELM-based lion swarm optimization (ELM–LSO), ELM-based seagull optimization algorithm (ELM–SOA) and ELM-based sparrow search algorithm (ELM–SSA). The rest of this study is organized as follows. The section "Methodology" introduces the ELM model and the six SIO algorithms. The section "Dataset and Preparation" describes data sources and detailed data analysis. The section "Performance Indicators" introduces six indicators to evaluate the performance of different models. The section "Results and Discussion" describes the development of all models and displays the results of the models for BB prediction. The section "Conclusion and Summary" gives the main concluding remarks of this study and some perspectives.
Methodology
Extreme Learning Machine
Huang et al. (2006) proposed a special neural network model, called the ELM, as one of the single-layer feed-forward neural network (SLFN) architectures. This model has one hidden layer, which can easily handle optimization problems by simply adjusting the number of neurons in the hidden layer (Zhang et al., 2021a, b). Assuming a training set D that contains K-dimensional input vectors \(x_{i} = \left[ {x_{i1} ,x_{i2} , \ldots ,x_{iK} } \right]^{T}\) and L-dimensional output vectors \(t_{i} = \left[ {t_{i1} ,t_{i2} , \ldots ,t_{iL} } \right]^{T}\), an effective ELM model is built to simulate the internal connection between input and output vectors according to the following three steps.
-
Step 1: Building an SLFN. The purpose of this step is to establish preliminarily the input weights Wq and bias Bq between the input layer and the hidden layer, and the output weights \(\beta_{i}\) between the hidden layer and the output layer. Therefore, an SLFN with M neurons in a hidden layer can be written as:
where g(x) represents the activation function, Wq belongs to the set W: \(\left[ {W_{q1} ,W_{q2} , \ldots ,W_{qn} } \right]^{T}\), and Bq belongs to B: \(\left[ {B_{q1} ,B_{q2} , \ldots ,B_{qn} } \right]^{T}\).
-
Step 2: Selecting weights and biases. These have an important effect on the output for a certain number of neurons in the hidden layer. To minimize the output error \(\sum\nolimits_{i = 1}^{D} {\left\| {t_{i} - u_{i} } \right\|} = 0\), the SLFN in step 1 can be transformed to:
$$\sum\limits_{q = 1}^{M} {\beta_{q} g\left( {W_{q} \cdot x_{i} + B_{q} } \right) = u_{i} } \, i = 1, \, 2, \, 3,...,D$$(2)where ui = \(\left[ {u_{i1} ,u_{i2} , \ldots ,u_{iL} } \right]^{T}\) represents the target vector. Then, the output of hidden layer neurons H and weights β can be expressed as:
$$H(W_{1} , \, ...W_{M} , \, B_{1} , \, ..B_{M} , \, x_{1} , \, ...x_{D} ) = \left[ {\begin{array}{*{20}c} {g(W_{1} \cdot x_{1} + B_{1} )} & \cdots & {g(W_{M} \cdot x_{1} + B_{M} )} \\ \vdots & \cdots & \vdots \\ {g(W_{1} \cdot x_{D} + B_{1} )} & \cdots & {g(W_{M} \cdot x_{D} + B_{M} )} \\ \end{array} } \right]_{D \times M} \beta = \left[ {\begin{array}{*{20}c} {\beta_{1}^{T} } \\ \vdots \\ {\beta_{M}^{T} } \\ \end{array} } \right]_{M \times L}$$(3)
-
Step 3: Estimating the weights between the hidden and output layers. The optimal output weight can be solved by an inverse hidden layer output matrix (Shariati et al., 2019). It means that the target vector Tv is closest to the real vector. Therefore, the target vector Tv and the corresponding output weights vector \(\hat{\beta }\) can be expressed as:
$$T_{v} = H \cdot \beta = \left[ {\begin{array}{*{20}c} {u_{1}^{T} } \\ \vdots \\ {u_{D}^{T} } \\ \end{array} } \right]_{D \times L} \, \hat{\beta } = H^{\dag } T_{v}$$(4)where \(H^{\dag }\) is the Moore–Penrose generalized inverse of H.
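The three steps above amount to a random feature map plus one least-squares solve, which can be sketched in a few lines of NumPy (the sigmoid activation and the uniform weight initialization are illustrative assumptions, since the text does not fix them):

```python
import numpy as np

def elm_train(X, T, n_hidden=60, seed=0):
    """Steps 1-3: fix random input weights W and biases B, form the hidden-layer
    output matrix H, and solve the output weights by the Moore-Penrose
    pseudo-inverse (Eq. 4)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, (X.shape[1], n_hidden))  # input weights (not trained)
    B = rng.uniform(-1.0, 1.0, n_hidden)                # hidden biases (not trained)
    H = 1.0 / (1.0 + np.exp(-(X @ W + B)))              # sigmoid activation g
    beta = np.linalg.pinv(H) @ T                        # beta = H^+ T
    return W, B, beta

def elm_predict(X, W, B, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + B)))
    return H @ beta
```

Because beta is obtained in closed form, training needs no iterative back-propagation, which is what makes ELM cheap enough to be wrapped inside a population-based optimizer later in the paper.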
Swarm Intelligence Optimization
Particle Swarm Optimization
Kennedy and Eberhart (1995) proposed a particle swarm optimization (PSO) algorithm to solve the optimization problem inspired by the predation behavior of birds. The core of PSO comprises massless particles with velocity and position. Velocity indicates how fast birds search for food, and position affects the direction of birds. Each bird (particle) is independent but shares the position of the food at the same time. Throughout the search space, the individual extremum is a position of food for each bird. Birds aim to move toward the best food location by comparing shared food positions. The velocity and position of each bird in the n + 1th iteration can be expressed by two mathematical formulas, thus:
$$v_{i}^{n + 1} = uv_{i}^{n} + c_{1} r_{1} \left( {P_{individual,i}^{n} - p_{i}^{n} } \right) + c_{2} r_{2} \left( {P_{group,i}^{n} - p_{i}^{n} } \right) \, i = 1, \, 2,...,N$$(5)$$p_{i}^{n + 1} = p_{i}^{n} + v_{i}^{n + 1}$$(6)
where N is the number of particles, u is a non-negative inertia factor, c1 and c2 are the individual and social learning factors (c1 = c2 = 2 in this study), r1 and r2 are random numbers between 0 and 1, and \(P_{individual,i}^{n}\) and \(P_{group,i}^{n}\) are the best positions found by the individual and the group, respectively.
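A minimal PSO sketch for function minimization, following the velocity and position updates described above (the linearly decreasing inertia factor and the search bounds are illustrative assumptions):

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=300, c1=2.0, c2=2.0,
                 bounds=(-5.0, 5.0), seed=0):
    """Velocity/position updates with individual (p_best) and group (g_best) terms."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    p_best = pos.copy()                                # individual best positions
    p_best_val = np.array([f(p) for p in pos])
    g_best = p_best[p_best_val.argmin()].copy()        # group (global) best
    for n in range(iters):
        u = 0.9 - 0.5 * n / iters                      # inertia factor, decreasing
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = u * vel + c1 * r1 * (p_best - pos) + c2 * r2 * (g_best - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        improved = vals < p_best_val
        p_best[improved], p_best_val[improved] = pos[improved], vals[improved]
        g_best = p_best[p_best_val.argmin()].copy()
    return g_best, float(p_best_val.min())
```

Calling `pso_minimize` on a convex test function such as the sphere function drives the group best close to the optimum within a few hundred iterations.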
Fruit Fly Optimization Algorithm
Pan (2012) proposed a new algorithm based on the foraging behavior of fruit flies to solve the global optimization problem, named the fruit fly optimization algorithm (FOA). The fruit fly is considered one of the best hunters in nature because of its excellent senses of smell and vision. The body and the foraging process of the fruit fly are depicted in Figure 2. Within a certain range of the search space, the fruit fly first activates its olfactory function to search for food. After approaching, it uses its keen vision to search for food precisely and finally determines the position. Therefore, there are two main steps in the FOA.
-
Step 1: Osphresis search. Assume the position of one fruit fly is (xi, yi), which searches randomly for food in a certain space based on olfactory feedback. However, the position of the food is not known in advance. The smell concentration (Smelli) is assumed to be inversely proportional to the distance (Disti) of the ith fruit fly from the starting point (0, 0). Then, osphresis foraging can be expressed as:
$$\left\{ \begin{gathered} x_{i} = x\_{\text{axis}} + {\text{RandomValue}} \hfill \\ y_{i} = y\_{\text{axis}} + {\text{RandomValue}} \hfill \\ \end{gathered} \right.$$(7)$${\text{Dist}}_{i} = \sqrt {x_{i}^{2} + y_{i}^{2} }$$(8)$${\text{Smell}}_{i} = 1/{\text{Dist}}_{i}$$(9)
-
Step 2: Vision search. The vision search determines the fly with the best smell concentration (Smellbest), and the swarm then moves toward its position (x_axis, y_axis), which can be expressed as:
$${\text{Smellbest}} = \max {\text{Smell}}_{i} \, \left\{ \begin{gathered} x\_{\text{axis}} = x({\text{Smellbest}}) \hfill \\ y\_{\text{axis}} = y({\text{Smellbest}}) \hfill \\ \end{gathered} \right.$$(10)
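The two steps can be sketched as follows (the unit-sized random step and passing the smell concentration directly to the fitness function are illustrative assumptions):

```python
import numpy as np

def foa_maximize(fitness, n_flies=20, iters=200, seed=0):
    """FOA sketch: osphresis (random) search around the swarm location, then
    vision moves the swarm to the best-smelling fly (Eq. 10)."""
    rng = np.random.default_rng(seed)
    x_axis, y_axis = rng.uniform(-1.0, 1.0, 2)      # initial swarm location
    smell_best = -np.inf
    for _ in range(iters):
        # Osphresis search: random steps around (x_axis, y_axis)
        x = x_axis + rng.uniform(-1.0, 1.0, n_flies)
        y = y_axis + rng.uniform(-1.0, 1.0, n_flies)
        dist = np.sqrt(x ** 2 + y ** 2) + 1e-12     # distance to the origin
        s = 1.0 / dist                              # smell ~ inverse distance
        smell = np.array([fitness(v) for v in s])
        i = int(smell.argmax())
        # Vision search: the swarm flies to the best position found so far
        if smell[i] > smell_best:
            smell_best = float(smell[i])
            x_axis, y_axis = x[i], y[i]
    return smell_best
```

Note that the decision variable is the concentration value 1/Dist, so FOA in this basic form optimizes a one-dimensional quantity per fly; multi-dimensional variants extend the position vector accordingly.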
Whale Optimization Algorithm
Mirjalili and Lewis (2016) developed the whale optimization algorithm (WOA) by mimicking the predatory behavior of humpback whales in the ocean. Whales are relatively intelligent creatures, having more than twice as many spindle cells as humans, and humpback whales have even developed their own language (Hof et al., 2007). The most interesting aspect of humpback whales is their foraging behavior, called bubble-net hunting, as shown in Figure 3a. In this foraging, humpback whales dive 12–15 m to the bottom of the shoal and then attack by creating bubbles along a circular or '9'-shaped path (Mirjalili and Lewis 2016; Fan et al., 2020; Zhou et al. 2022c). Before hunting, humpback whales are very good at locating and encircling prey, and this behavior can be expressed mathematically as:
where C1 and C2 are coefficient vectors, \(X_{w}^{*}\) and X represent the best and current positions of whales, respectively, in the nth iteration, E is the absolute value of the distance between whales and prey, e decreases from 2 to 0 in the course of the iterations, and u varies randomly in [0, 1].
After encircling their prey, humpback whales shrink the encircling circle and constantly reposition themselves to complete the bubble-net feeding behavior. As shown in Figure 3b, the shrinking encircling behavior is achieved by decreasing b in Eq. 11. Meanwhile, the positions of humpback whales (X, Y) also change in a spiral, as shown in Figure 3c. The new position of the whale is expressed as:
where x and l are changed randomly in [0, 1], and s is a constant to define the spiral shape.
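A hedged WOA sketch of the two moves above (the equal probability of the two moves, the coefficient forms, the clipping bounds, and drawing l from [-1, 1] follow the original WOA formulation and are assumptions relative to the notation in the text):

```python
import numpy as np

def woa_minimize(f, dim, n_whales=30, iters=200, s=1.0, seed=0):
    """WOA sketch: each whale either encircles the best position (shrinking
    circle as e decreases) or performs the logarithmic spiral update."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, (n_whales, dim))
    vals = np.array([f(x) for x in X])
    best = X[vals.argmin()].copy()
    best_val = float(vals.min())
    for n in range(iters):
        e = 2.0 - 2.0 * n / iters                  # decreases linearly from 2 to 0
        for i in range(n_whales):
            if rng.random() < 0.5:                 # encircling / shrinking move
                C1 = 2.0 * e * rng.random(dim) - e
                C2 = 2.0 * rng.random(dim)
                E = np.abs(C2 * best - X[i])       # distance to the prey
                X[i] = best - C1 * E
            else:                                   # spiral bubble-net move
                l = rng.uniform(-1.0, 1.0)
                E = np.abs(best - X[i])
                X[i] = E * np.exp(s * l) * np.cos(2.0 * np.pi * l) + best
        X = np.clip(X, -5.0, 5.0)
        vals = np.array([f(x) for x in X])
        if float(vals.min()) < best_val:
            best_val = float(vals.min())
            best = X[vals.argmin()].copy()
    return best, best_val
```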
Lion Swarm Optimization
Liu et al. (2018) proposed the lion swarm optimization (LSO) based on the hunting behavior of a lion swarm in nature. There is a strict social hierarchy within a lion swarm. The first echelon is the king lion, called the leader. The leader is responsible for assigning tasks to the other lions, distributing food and accepting status challenges. The second echelon is the lioness, called a predator. The predator is responsible for hunting, including searching for, tracking, trapping, and attacking prey. In addition, the predator is a direct communication link between the leader and the other lions and is responsible for giving instructions and feedback. The lion cubs, called followers, are at the bottom of the swarm, and their main job is to learn how to hunt from a predator. Once in adulthood, followers are driven out of the group and trained to challenge the leader. Assume a lion swarm has N lions, where the ratio of leaders is less than 0.5; the positions of the lions in the different echelons are then expressed as follows.
-
(a)
The position of the leader is near the food:
-
(b)
The predator often needs the help of another lioness to move:
$$p_{i}^{k + 1} = \frac{{l_{i}^{k} + l_{c}^{k} }}{2}(1 + \alpha_{f} \gamma )$$(15)
-
(c)
The positions of the cubs are determined by the leader and the predator:
$$p_{i}^{k + 1} = \left\{ {\begin{array}{*{20}c} {\frac{{l_{i}^{k} + g^{k} }}{2}(1 + \alpha_{c} \gamma ), \, q \le 1/3} \\ {\frac{{l_{m}^{k} + l_{i}^{k} }}{2}(1 + \alpha_{c} \gamma ),{ 1/3} \le q < 2/3} \\ {\frac{{l_{i}^{k} + \overline{g}^{k} }}{2}(1 + \alpha_{c} \gamma ),{ 2/3} \le q \le 1} \\ \end{array} } \right.$$(16)where \(p_{i}^{k + 1}\) represents the position of the ith leader at the k + 1th iteration, gk and \(l_{i}^{k}\) represent the optimal and historical position of the leader in the kth iteration, respectively, \(\gamma\) and q are changed randomly in [0, 1], \(l_{c}^{k}\) and \(l_{m}^{k}\) represent the historical position of the ith predator and follower at the kth iteration, respectively, and \(\overline{g}^{k}\) represents the current position of followers. The disturbance factors are defined as \(\alpha_{f}\) and \(\alpha_{c}\) in the movement range of predator and follower.
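A hedged LSO sketch following the echelon structure above (the leader's local move around the best position, the Gaussian disturbance, the reflected position used for the third cub case, and the greedy selection are illustrative assumptions, since Eq. 14 is not reproduced here):

```python
import numpy as np

def lso_minimize(f, dim, n_lions=20, iters=200, beta=0.2, seed=0):
    """LSO sketch: leaders search near the best position, predators pair up
    as in Eq. 15, cubs follow the three cases of Eq. 16."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    X = rng.uniform(lo, hi, (n_lions, dim))
    vals = np.array([f(x) for x in X])
    n_lead = max(1, int(beta * n_lions))          # leader ratio < 0.5
    for k in range(iters):
        order = vals.argsort()
        g = X[order[0]].copy()                    # best position g^k
        g_bar = lo + hi - g                       # reflected best (followers' side)
        alpha = 1.0 - k / iters                   # shrinking disturbance factor
        newX = X.copy()
        for rank, i in enumerate(order):
            gamma = rng.standard_normal(dim)
            if rank < n_lead:                     # king lion: local move around g
                newX[i] = g * (1.0 + alpha * gamma)
            elif rank < n_lions // 2:             # lioness: pair with another predator
                c = order[rng.integers(n_lead, n_lions // 2)]
                newX[i] = (X[i] + X[c]) / 2.0 * (1.0 + alpha * gamma)
            else:                                  # cub: Eq. 16, three cases by q
                q = rng.random()
                if q < 1.0 / 3.0:
                    newX[i] = (X[i] + g) / 2.0 * (1.0 + alpha * gamma)
                elif q < 2.0 / 3.0:
                    m = order[rng.integers(n_lead, n_lions // 2)]
                    newX[i] = (X[m] + X[i]) / 2.0 * (1.0 + alpha * gamma)
                else:
                    newX[i] = (X[i] + g_bar) / 2.0 * (1.0 + alpha * gamma)
        newX = np.clip(newX, lo, hi)
        new_vals = np.array([f(x) for x in newX])
        keep = new_vals < vals                    # greedy selection keeps improvements
        X[keep], vals[keep] = newX[keep], new_vals[keep]
    return X[vals.argmin()], float(vals.min())
```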
Seagull Optimization Algorithm
Dhiman and Kumar (2019) proposed a new algorithm, called the seagull optimization algorithm (SOA), based on the migration and attacking behaviors of seagulls (Fig. 4). Seagulls rely on unique intelligence to catch prey, such as imitating the sound of rain to lure fish to the surface (Dhiman and Kumar 2019). As seasonal migrants, seagulls need to obtain food to replenish their energy during migration in order to reach their destination (Avise 2017). However, the seagull population is very large during migration, so it is important for them to avoid colliding with each other. Assuming the migration behavior of seagulls is denoted U, collision avoidance can be expressed as:
where \(\vec{x}_{s}\) and \(\vec{N}_{s}\) represent the updated and current positions of the seagulls, respectively, t represents the iteration number, m_iter indicates the maximum number of iterations, and C represents a control factor of U, which decreases linearly from 2 to 0. To obtain sufficient food, seagulls constantly adjust their positions to keep moving toward the best food source. This behavior can be described as:
where \(\vec{N}_{bs} \left( t \right)\) represents the current best position of a seagull, \(\vec{D}_{s}\) indicates the direction in which the current seagull moves toward the best seagull, \(\vec{A}_{s}\) represents the distance between the updated seagull and the best seagull, and F is a balance factor that can be estimated as:
where h varies randomly in [0, 1]. During an attack, seagulls maintain a spiral motion, changing angle and speed by means of their wings and weight. This attack pattern can be written in the x, y, and z planes, thus:
where r indicates the radius of the spiral in each turn, K represents a random number in the range 0–\(2\pi\), c1 and c2 represent constants to describe the shape of a spiral, and e is the base of the natural logarithm. Therefore, the best position of seagulls is calculated as:
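A hedged SOA sketch combining the migration and spiral-attack equations above (the clipping to fixed bounds, setting the two spiral-shape constants to 1, and F = 2C²·rand are illustrative assumptions):

```python
import numpy as np

def soa_minimize(f, dim, n_seagulls=30, iters=200, fc=2.0, c1=1.0, c2=1.0, seed=0):
    """SOA sketch: migration (collision avoidance, motion toward the best,
    distance D_s) followed by the x/y/z spiral attack around the best position."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(-5.0, 5.0, (n_seagulls, dim))
    vals = np.array([f(p) for p in P])
    best = P[vals.argmin()].copy()
    best_val = float(vals.min())
    for t in range(iters):
        C = fc - fc * t / iters                   # control factor, 2 -> 0
        for i in range(n_seagulls):
            F = 2.0 * C ** 2 * rng.random()       # balance factor
            Cs = C * P[i]                         # position avoiding collision
            Ms = F * (best - P[i])                # motion toward the best seagull
            Ds = np.abs(Cs + Ms)                  # distance to the best seagull
            k = rng.uniform(0.0, 2.0 * np.pi)     # spiral angle
            r = c1 * np.exp(k * c2)               # spiral radius
            x, y, z = r * np.cos(k), r * np.sin(k), r * k
            P[i] = np.clip(Ds * x * y * z + best, -5.0, 5.0)
        vals = np.array([f(p) for p in P])
        if float(vals.min()) < best_val:
            best_val = float(vals.min())
            best = P[vals.argmin()].copy()
    return best, best_val
```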
Sparrow Search Algorithm
Xue et al. (2020) developed a new SIO algorithm that was inspired by the foraging behaviors of sparrows. Sparrows are common small social birds that do not migrate seasonally. Meanwhile, sparrows have powerful memories that help them find food more effectively. Note that the sparrows are divided mainly into two types in this so-called sparrow search algorithm (SSA): the producers are responsible for searching for high-energy foods, and the food of the scroungers comes from the producers. Interestingly, the identities of producers and scroungers are flexibly interchangeable, but the ratio of producers to scroungers is fixed in the sparrow swarm (Barta et al., 2004; Xue et al., 2020). This strategy is useful for both the producers and the scroungers in finding higher-energy foods (Liker and Barta 2002). The natural curiosity of sparrows helps the producers and the scroungers evade attackers: when one or more individuals spot attackers and sing, the entire swarm flies away (Pulliam 1973).
Assume that there are n sparrows distributed in a J-dimensional space, with a warning (safety) threshold \(W_{s}\). In the SSA, the producers are not only responsible for finding the food but also for feeding the scroungers. Therefore, the producers can search a wider area for energy-dense foods, and the positions of the producers can be written as:
where \(S_{i,j}^{k + 1}\) represents the position of the ith producer in the jth dimension at the k + 1th iteration, Q represents a random number that follows a normal distribution, L represents a matrix (\(1 \times d\)) in which each element is 1, and the maximum number of columns (d) is the maximum dimension of J; \(\alpha\) and R2 are random numbers that vary in \((\mathrm{0,1})\), and itermax indicates the maximum number of iterations. As shown in Eq. 25, when R2 > Ws, individuals have detected attackers, and the producers and the scroungers quickly fly to safe places; otherwise, the producers continue to search for food. The positions of the scroungers are related to those of the producers. The scroungers compete with the producers for higher-energy food and update their positions according to:
where \(S_{b}^{k + 1}\) and \(S_{worst}^{k}\) represent the current best position of the producers and the global worst position, respectively, A represents a matrix (\(1 \times d\)) in which each element is 1 or -1, with d again the maximum dimension of J, and \(A^{ + } = A^{T} (AA^{T} )^{ - 1}\). However, if i > n/2, the ith scrounger is most likely a starving sparrow. As soon as one or more individuals spot attackers, the sparrows move to safety. The mathematical expression for this behavior is:
where \(S_{best}^{k}\) represents the current global best position at iteration k, \(\beta\) is a step-size control factor drawn from a normal distribution with a mean of 0 and a variance of 1, and \(\kappa\) represents a random number in [-1, 1]. Here, fi, fb and fw are the fitness values of the present sparrow and of the current global best and worst sparrows, respectively, and \(\varepsilon\) is a small constant that avoids division by zero when fi = fw. Put simply, fi > fb indicates that a sparrow at this position is highly vulnerable to attack, while fi = fb indicates that sparrows in the center of the group are also aware of the attacker and begin to approach the others to reduce the risk.
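A hedged SSA sketch of the producer, scrounger, and danger-aware updates (the formulas follow the standard SSA, since Eq. 25 and its companions are not reproduced above; the simplified A⁺ term and the fixed bounds are assumptions):

```python
import numpy as np

def ssa_minimize(f, dim, n=30, iters=100, pd=0.2, sd=0.1, ws=0.8, seed=0):
    """SSA sketch: producers search widely, scroungers follow the best producer,
    and a fraction sd of danger-aware sparrows move toward or around the best."""
    rng = np.random.default_rng(seed)
    lo, hi = -5.0, 5.0
    S = rng.uniform(lo, hi, (n, dim))
    vals = np.array([f(s) for s in S])
    n_prod = max(1, int(pd * n))
    best = S[vals.argmin()].copy()
    best_val = float(vals.min())
    for k in range(1, iters + 1):
        order = vals.argsort()
        worst = S[order[-1]].copy()
        f_b, f_w = vals[order[0]], vals[order[-1]]
        r2 = rng.random()                          # alarm value, compared with ws
        for rank, i in enumerate(order):
            if rank < n_prod:                      # producer
                if r2 < ws:                        # no attacker: keep searching
                    alpha = rng.random() + 1e-12
                    S[i] = S[i] * np.exp(-k / (alpha * iters))
                else:                              # attacker spotted: fly elsewhere
                    S[i] = S[i] + rng.standard_normal(dim)
            elif rank > n // 2:                    # starving scrounger
                S[i] = rng.standard_normal(dim) * np.exp((worst - S[i]) / k ** 2)
            else:                                  # scrounger follows the best
                A = rng.choice([-1.0, 1.0], dim)   # simplified A+ term
                S[i] = best + np.abs(S[i] - best) * A / dim
        for i in rng.choice(n, max(1, int(sd * n)), replace=False):
            if vals[i] > f_b:                      # edge of the group
                S[i] = best + rng.standard_normal(dim) * np.abs(S[i] - best)
            else:                                  # center: move closer to others
                kappa = rng.uniform(-1.0, 1.0)
                S[i] = S[i] + kappa * np.abs(S[i] - worst) / (vals[i] - f_w + 1e-12)
        S = np.clip(S, lo, hi)
        vals = np.array([f(s) for s in S])
        if float(vals.min()) < best_val:
            best_val = float(vals.min())
            best = S[vals.argmin()].copy()
    return best, best_val
```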
Dataset and Preparation
The Sungun mine is one of the large open-pit mines in Iran (Fig. 5). Investigations have shown that the maximum BB in this mine was 10 m. In total, 234 blasting operations recorded by Khandelwal and Monjezi (2013) in the Sungun mine were used as the dataset in this study. Before a blasting operation, the controllable blasting parameters can be set and recorded by blasting engineers. Therefore, burden (B), hole length (HL), stemming (ST), spacing (S), powder factor (PF) and special drilling (SD) were used as input parameters to predict BB. Figure 6 shows details of the six input parameters in boxplots, and the correlations between the different parameters and BB are shown in Figure 7. The dataset was divided randomly into two groups: the training set (70%) was responsible for constructing the prediction model with certain precision, and the testing set (30%) was responsible for evaluating the prediction performance of the model.
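The random 70/30 division can be made reproducible with a fixed seed, for example (the seed value here is an illustrative assumption):

```python
import numpy as np

def split_dataset(X, y, train_frac=0.7, seed=42):
    """Shuffle the record indices once, then take the first 70% for training
    and the remaining 30% for testing."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(train_frac * len(X))
    train, test = idx[:n_train], idx[n_train:]
    return X[train], y[train], X[test], y[test]
```

For the 234 records used here, this yields 163 training samples and 71 testing samples.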
Performance Indicators
To assess accurately the prediction performance of the ELM and the six novel ELM–SIO models, the root mean square error (RMSE), Pearson correlation coefficient (R), determination coefficient (R2), mean absolute error (MAE), variance accounted for (VAF) and sum of square error (SSE) were used to evaluate these models in the training and testing phases. This was performed not only to verify the optimization effect of the swarm intelligence algorithms but also to select the best model for application in practical engineering. The six performance indicators were defined as follows (Hasanipanah et al., 2016; Zhou et al., 2020a, b, 2021a, b; Jahed et al., 2021; Li et al., 2021a, b, c; Xie et al., 2021a, b):
where N represents the number of samples, yi and \(\overline{y}\) indicate the observed and mean observed BB, respectively, and ti and \(\overline{t}\) indicate the predicted and mean predicted BB, respectively.
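The six indicators can be computed in a few lines (the population-variance form used for VAF is an assumption; the other definitions are standard):

```python
import numpy as np

def evaluate(y, t):
    """Six performance indicators; y = observed BB, t = predicted BB."""
    y, t = np.asarray(y, float), np.asarray(t, float)
    e = y - t
    rmse = float(np.sqrt(np.mean(e ** 2)))               # root mean square error
    mae = float(np.mean(np.abs(e)))                      # mean absolute error
    sse = float(np.sum(e ** 2))                          # sum of square error
    r = float(np.corrcoef(y, t)[0, 1])                   # Pearson correlation
    r2 = 1.0 - sse / float(np.sum((y - y.mean()) ** 2))  # determination coefficient
    vaf = float((1.0 - np.var(e) / np.var(y)) * 100.0)   # variance accounted for, %
    return {"RMSE": rmse, "R": r, "R2": r2, "VAF": vaf, "MAE": mae, "SSE": sse}
```

A perfect model gives RMSE = MAE = SSE = 0, R = R2 = 1 and VAF = 100%, so lower values of the error indicators and higher values of the agreement indicators mean better performance.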
Results and Discussion
In total, seven models were considered for BB prediction. Figure 8 depicts the entire prediction process, including input data, model development, and performance evaluation. In addition to stability analysis, model development and performance evaluation were emphasized.
Models Development
ELM Model
Six swarm intelligence models were developed based on ELM, so it was necessary first to determine the optimal ELM structure. ELM is a special neural network structure with a single hidden layer, and the number of neurons in the hidden layer determines the prediction performance. In this study, the RMSE index was used to evaluate the performance of ELM in the training and testing phases. The number of neurons started at 10 and was increased in increments of 10 up to 150. Table 1 shows the prediction performance and corresponding RMSE values of the considered model with different numbers of neurons in the 15 experiments. As shown in this table, the lowest RMSE occurred with 60 neurons in the hidden layer in both the training and testing phases. Therefore, an ELM model with 60 neurons in the single-layer feed-forward neural network (SLFN) architecture was developed (Fig. 9).
ELM–SIO Models
Six ELM–SIO models were developed with the same architecture as the ELM (i.e., a hidden layer with 60 neurons in the SLFN). Thus, the optimization problem was to obtain the best input weight and bias values. SIO is a relatively new idea that was proposed by imitating the behavior of insects and animals (Saghatforoush et al., 2016). Different from other methods, SIO only needs to train the most appropriate population to solve an optimization problem. For example, Shariati et al. (2020) considered 75 wolves in the ELM–GWO model to predict the compressive strength of concrete with partial replacements for cement. For a similar purpose, the six ELM–SIO models (ELM–PSO, ELM–FOA, ELM–WOA, ELM–LSO, ELM–SOA, ELM–SSA) were used here to obtain the optimal population size within a certain number of iterations. The RMSE index was again used to evaluate model performance, and the calculation time of each iteration was also recorded. The fitness values for different population sizes are shown in Figure 10. This illustrates that the fitness value was no longer affected by the population size after 400 iterations in the ELM–PSO model; the corresponding numbers were 500 in FOA (Fig. 10b), 300 in WOA (Fig. 10c), 500 in LSO (Fig. 10d), 400 in SOA (Fig. 10e) and 500 in SSA (Fig. 10f). As shown in Figure 11, the lowest RMSE and the corresponding iteration time are shown per ELM–SIO model for each population size. As can be realized, 60 birds were considered in the ELM–PSO model; the corresponding populations were 50 fruit flies in FOA (Fig. 11b), 50 whales in WOA (Fig. 11c), 50 lions in LSO (Fig. 11d), 70 seagulls in SOA (Fig. 11e) and 40 sparrows in SSA (Fig. 11f).
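As an illustration of how these hybrids work, the sketch below lets a PSO search the ELM input weights and biases with the training RMSE as the fitness value (the population size, iteration count, decreasing inertia, and search bounds are illustrative assumptions; the study tuned these per algorithm as described above):

```python
import numpy as np

def elm_rmse(params, X, T, n_hidden):
    """Fitness: decode a flat vector into (W, B), solve beta, return train RMSE."""
    d = X.shape[1]
    W = params[:d * n_hidden].reshape(d, n_hidden)
    B = params[d * n_hidden:]
    H = 1.0 / (1.0 + np.exp(-(X @ W + B)))
    beta = np.linalg.pinv(H) @ T
    return float(np.sqrt(np.mean((H @ beta - T) ** 2)))

def elm_pso(X, T, n_hidden=10, n_particles=20, iters=50, seed=0):
    """ELM-SIO sketch: PSO searches the ELM input weights and biases."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * n_hidden + n_hidden
    pos = rng.uniform(-1.0, 1.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pb = pos.copy()
    pb_val = np.array([elm_rmse(p, X, T, n_hidden) for p in pos])
    gb = pb[pb_val.argmin()].copy()
    for k in range(iters):
        w = 0.9 - 0.5 * k / iters                   # decreasing inertia
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + 2.0 * r1 * (pb - pos) + 2.0 * r2 * (gb - pos)
        pos = np.clip(pos + vel, -1.0, 1.0)
        v = np.array([elm_rmse(p, X, T, n_hidden) for p in pos])
        better = v < pb_val
        pb[better], pb_val[better] = pos[better], v[better]
        gb = pb[pb_val.argmin()].copy()
    return gb, float(pb_val.min())
```

Any of the other five SIO algorithms can be substituted for the PSO loop; only the population-update rule changes, while the fitness function (the ELM training RMSE) stays the same.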
Comparison of Results
As discussed earlier, the numbers of neurons and the population sizes of the six ELM–SIO models were tuned, and these hybrid models were used to predict BB. The prediction performances of the six models were compared by plotting the predicted values against the observed values in regression diagrams, and six evaluation indices (RMSE, R, R2, VAF, MAE, SSE) were calculated. The regression diagrams of the six ELM–SIO models in the training phase are shown in Figure 12. The vertical and horizontal axes represent the predicted and observed values of BB, respectively. The diagonal (solid line) is the perfect fitting line: a point falls above it if the predicted value is greater than the observed value, and below it in the opposite case. Only when the predicted value equals the observed value does a point lie on the line, and the model with more points on the line has higher predictive performance. At the same time, a boundary of 10% off the perfect fitting line was set to cover more points for comparing performance. As can be realized, the points distributed on the perfect fitting line were fewest in the ELM–FOA model, and some predicted values even exceeded the 10% range. The prediction performances of the other five models were obviously better than that of ELM–FOA, but it was difficult to distinguish the optimal model by the naked eye. Table 2 records the performance evaluation indicators per model. As shown in this table, the ELM–FOA model had the worst performance indicators of all SIO models (RMSE: 0.4055, R: 0.9879, R2: 0.9760, VAF: 97.5956%, MAE: 0.3023 and SSE: 26.9645), as is also depicted in the regression diagrams. Among the other five models, the ELM–LSO model had the best performance indicators (RMSE: 0.1129, R: 0.9991, R2: 0.9981, VAF: 99.8135%, MAE: 0.0706 and SSE: 2.0917), even though the differences among them were not significant.
The result of training does not represent the final predictive ability of the model (Shariati et al., 2020); models that perform well in training may perform poorly in the testing phase. To avoid misevaluation, an additional boundary of 30% off the perfect fitting line was added to the original 10%. Figure 13 illustrates the regression diagrams of the six ELM–SIO models in the testing phase. Compared with the training phase, the predictive performance of each model decreased in the testing phase. As shown in this figure, all models except ELM–LSO had prediction points outside the 30% boundary line. Table 3 records the performance evaluation indicators per model in the testing phase. As can be observed in this table, the RMSE (0.2441), MAE (0.1669) and SSE (4.1710) of the ELM–LSO model were higher than those in the training phase but still the lowest among the six models in the testing phase, while the R (0.9949), R2 (0.9891) and VAF (98.9806%) of ELM–LSO were the highest. As mentioned earlier, the ELM–SOA model performed second only to ELM–LSO, while the ELM–FOA model again had the worst predictive performance in the testing phase.
Furthermore, the ranking scores of the six ELM–SIO models were obtained to offer more intuition as to their performance indicators, as shown in Figure 14. In terms of ranking scores, the ELM–LSO model was the best in both the training (36) and the testing (36) phases. To further verify model performance, Figure 15 depicts the predicted curve versus the observed curve per model in the testing phase for BB prediction. Overall, the predicted curve of each model was consistent with the observed curve. However, some local details are very important for evaluating the predictive performance of each model. As shown in this figure, the ELM–PSO model had obvious prediction errors for sample numbers 6, 37 and 63; the ELM–FOA model for sample numbers 33–35, 42, 46–49 and 65–66; the ELM–WOA model for sample numbers 18, 37 and 63; the ELM–SOA model for sample numbers 9 and 49; and the ELM–SSA model for sample numbers 9, 15, 42 and 63. The ELM–LSO model, however, showed no obvious errors in these details. Therefore, the results indicate that ELM–LSO was the best model for BB prediction.
Figure 16 clearly shows the difference between the ELM model and the ELM–LSO model. In this figure, the abscissa is the number of samples, and the ordinate is the ratio of predicted value to observed value. A ratio of 1 indicates that the predicted value is equal to the observed value, and the closer it is to 1, the better the prediction performance is. Figure 16a and c shows the prediction results in the training phase. The ELM–LSO model significantly improved the prediction performance of the ELM model and made the predicted value closer to the observed value. Also, in the testing phase, there were more targets close to 1 in the ELM–LSO model. While this model is not perfect, the outliers were not very far away.
The research results not only show that the performance of the ELM model can be improved by using SIO algorithms for BB prediction, but also that ELM–LSO was the best such model. Table 4 compares the six hybrid models proposed in this study with those of previous studies. As can be seen in this table, the ELM–LSO model performed best among all models, in particular on the same dataset proposed and used by Khandelwal and Monjezi (2013), Zhou et al. (2021c) and Dai et al. (2022).
Relative Importance of Influence Variables
Sensitivity analysis is of great help in judging the effect of the influence variables on BB prediction. Based on the comparison of the different SIO models, variable importance was extracted from the ELM–LSO model in this study. The variable importance test mechanism is reflected in the amount of impurity reduction after randomly changing the values of the variables (Qi et al., 2018). Therefore, a global sensitivity analysis method called PAWN, introduced by Pianosi and Wagener (2015, 2018), was used in this study. Unlike local sensitivity analysis methods based on squared differences, PAWN compares cumulative distribution functions; the importance score (S) of an input variable \(x_{i}\) is calculated as:
$$S_{i} = \mathop{\mathrm{stat}}\limits_{x_{i} \in {\rm I}_{\tilde{u}}} \; \mathop{\max }\limits_{y} \left| {\hat{F}_{y} \left( y \right) - \hat{F}_{{y\left| {x_{i} } \right.}} \left( y \right)} \right|$$
where stat is a statistic (e.g., maximum, mean or median); the maximum was selected in this study. \(\hat{F}_{y} \left( y \right)\) and \(\hat{F}_{{y\left| {x_{i} } \right.}}\) are the unconditional and conditional cumulative distribution functions (CDFs) of the output variable y, respectively. \({\rm I}_{{\tilde{u}}}\) denotes the \(\tilde{u}\) equally spaced subintervals of the input variable \(x_{i}\), where \(\tilde{u}\) is usually set to the default value of 10 (Xue et al., 2021).
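This index can be sketched numerically as follows. The sketch below is an illustrative implementation of the PAWN idea, not the authors' code: the unconditional empirical CDF of y is compared, via the Kolmogorov–Smirnov distance, with the CDF of y conditioned on each of the equally spaced subintervals of x_i, and the distances are aggregated with the chosen statistic (the maximum here):

```python
import numpy as np

def pawn_index(x, y, n_sub=10, stat=max):
    """PAWN sensitivity sketch: KS distance between the unconditional CDF of y
    and the CDFs of y conditioned on n_sub equal-width slices of x."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    grid = np.sort(y)  # evaluate both CDFs at the observed y values
    F_u = np.searchsorted(grid, grid, side="right") / y.size  # unconditional CDF
    edges = np.linspace(x.min(), x.max(), n_sub + 1)
    ks_distances = []
    for k in range(n_sub):
        if k == n_sub - 1:
            mask = (x >= edges[k]) & (x <= edges[k + 1])  # include right edge in last slice
        else:
            mask = (x >= edges[k]) & (x < edges[k + 1])
        y_c = y[mask]
        if y_c.size == 0:
            continue  # skip empty subintervals
        F_c = np.searchsorted(np.sort(y_c), grid, side="right") / y_c.size
        ks_distances.append(float(np.max(np.abs(F_u - F_c))))
    return float(stat(ks_distances))
```

An input on which y depends strongly produces conditional CDFs far from the unconditional one (index near 1), while an input with no influence leaves them nearly identical (index near 0).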
Figure 17 shows the variable importance scores. B and ST were the most sensitive variables for BB, with the highest importance score of 0.8717, while the variable with the lowest score was SD (0.4817). However, no definitive studies show that SD is necessarily the least important variable, especially given the amount of data covered in this study. In contrast, Faradonbeh et al. (2016) reported that PF and B are the most influential variables on BB, and other research suggested that HL and ST have greater effects on BB (Zhou et al., 2021a, b, c; Dai et al., 2022). Based on the data in this study, the order of importance of the influence variables for BB is B and ST → HL and S → PF → SD.
Conclusions and Summary
Predicting BB is an interesting and productive exercise. In this study, 234 cases of BB were used, with six input variables (hole length, spacing, burden, stemming, powder factor and specific drilling) and one output (BB). Exploiting the advantages of the ELM algorithm combined with SIO algorithms, six hybrid models were developed to predict BB. ELM–LSO was the best model for BB prediction compared with the other five ELM–SIO models (ELM–PSO, ELM–FOA, ELM–WOA, ELM–SOA and ELM–SSA). For the ELM–LSO model, the RMSE was 0.1129 (R: 0.9991, R2: 0.9981, VAF: 99.8135%, MAE: 0.0706 and SSE: 2.0917) in the training phase and 0.2441 in the testing phase (R: 0.9949, R2: 0.9891, VAF: 98.9806%, MAE: 0.1669 and SSE: 4.1710). Therefore, SIO algorithms can be a very effective means of improving the performance of the ELM model. It is worth noting that burden and stemming (0.8717) were the most influential input variables, followed by hole length and spacing (0.8205), powder factor (0.5575) and specific drilling (0.4871). This conclusion is supported only by the data in this study; it can serve as a reference but is not immutable. Moreover, additional relevant parameters such as rock quality designation (RQD), geological strength index (GSI) and weathering index (WI) could be taken into account in future BB prediction tasks, even though these parameters are difficult to obtain in actual blasting investigations. Meanwhile, advanced and effective optimization algorithms need to be developed to further improve BB prediction accuracy, which remains the main difficulty in this kind of work.
References
Agrawal, H., & Mishra, A. K. (2018). Evaluation of initiating system by measurement of seismic energy dissipation in surface blasting. Arabian Journal of Geosciences, 11(13), 1–12.
Armaghani, D. J., Tonnizam Mohamad, E., Hajihassani, M., Alavi Nezhad Khalil Abad, S. V., Marto, A., & Moghaddam, M. R. (2016). Evaluation and prediction of flyrock resulting from blasting operations using empirical and computational methods. Engineering with Computers, 32(1), 109–121.
Armaghani, D. J., Kumar, D., Samui, P., Hasanipanah, M., & Roy, B. (2021). A novel approach for forecasting of ground vibrations resulting from blasting: Modified particle swarm optimization coupled extreme learning machine. Engineering with Computers, 37(4), 3221–3235.
Armaghani, D. J., Koopialipoor, M., Bahri, M., Hasanipanah, M., & Tahir, M. M. (2020). A SVR-GWO technique to minimize flyrock distance resulting from blasting. Bulletin of Engineering Geology and the Environment, 79(8), 4369–4385.
Avise, J. C. (2017). The life of a couple of birds. In From aardvarks to zooxanthellae (pp. 41–46). Springer.
Barta, Z., Liker, A., & Mónus, F. (2004). The effects of predation risk on the use of social foraging tactics. Animal Behaviour, 67, 301–308.
Bhandari, S. (1997). Engineering rock blasting operations. Taylor & Francis.
Bhandari, S., & Badal, R. (1990, August). Relationship of joint orientation with hole spacing parameter in multihole blasting. In Proceedings of the 3rd International Symposium on Rock Fragmentation by Blasting, Brisbane, Australia (pp. 225–231).
Bhatawdekar, R. M., Armaghani, D. J., & Azizi, A. (2021). Applications of AI and ML Techniques to Predict Backbreak and Flyrock Distance Resulting from Blasting. In Environmental Issues of Blasting (pp. 41–59). Springer, Singapore.
Dai, Y., Khandelwal, M., Qiu, Y., Zhou, J., Monjezi, M., & Yang, P. (2022). A hybrid metaheuristic approach using random forest and particle swarm optimization to study and evaluate backbreak in open-pit blasting. Neural Computing and Applications, 34, 6273–6288.
Dhiman, G., & Kumar, V. (2019). Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowledge-Based Systems, 165, 169–196.
Du, K., Li, X., Su, R., Tao, M., Lv, S., Luo, J., & Zhou, J. (2022). Shape ratio effects on the mechanical characteristics of rectangular prism rocks and isolated pillars under uniaxial compression. International Journal of Mining Science and Technology, 32(2), 347–362. https://doi.org/10.1016/j.ijmst.2022.01.004
Ebrahimi, E., Monjezi, M., Khalesi, M. R., & Armaghani, D. J. (2016). Prediction and optimization of back-break and rock fragmentation using an artificial neural network and a bee colony algorithm. Bulletin of Engineering Geology and the Environment, 75(1), 27–36.
Enayatollahi, I., & Aghajani-Bazzazi, A. (2010). Evaluation of salt-ANFO mixture in back break reduction by data envelopment analysis. In Proceedings of the 9th International Symposium on Rock Fragmentation by Blasting, Granada, Spain, September 2009 (pp. 127–133).
Eskandar, H., Heydari, E., Hasanipanah, M., Masir, M. J., & Derakhsh, A. M. (2018). Feasibility of particle swarm optimization and multiple regression for the prediction of an environmental issue of mine blasting. Engineering Computations.
Esmaeili, M., Osanloo, M., Rashidinejad, F., Aghajani Bazzazi, A., & Taji, M. (2014). Multiple regression, ANN and ANFIS models for prediction of backbreak in the open pit blasting. Engineering with Computers, 30(4), 549–558.
Fan, J., Wu, L., Ma, X., Zhou, H., & Zhang, F. (2020). Hybrid support vector machines with heuristic algorithms for prediction of daily diffuse solar radiation in air-polluted regions. Renewable Energy, 145, 2034–2045.
Fang, Q., Nguyen, H., Bui, X. N., & Nguyen-Thoi, T. (2020). Prediction of blast-induced ground vibration in open-pit mines using a new technique based on imperialist competitive algorithm and M5Rules. Natural Resources Research, 29(2), 791–806.
Faradonbeh, R. S., Monjezi, M., & Armaghani, D. J. (2016). Genetic programing and non-linear multiple regression techniques to predict backbreak in blasting operation. Engineering with Computers, 32(1), 123–133.
Fattahi, H., & Hasanipanah, M. (2021). Prediction of blast-induced ground vibration in a mine using relevance vector regression optimized by metaheuristic algorithms. Natural Resources Research, 30(2), 1849–1863.
Firouzadj, A., Farsangi, M. A. E., Mansouri, H., & Esfahani, S. K. (2006, May). Application of controlled blasting (pre-splitting) in Sarcheshmeh copper mine. In Proceedings of the 8th international symposium on rock fragmentation by blasting, Santiago, Chile (pp. 383–387).
Ghaleini, E. N., Koopialipoor, M., Momenzadeh, M., Sarafraz, M. E., Mohamad, E. T., & Gordan, B. (2019). A combination of artificial bee colony and neural network for approximating the safety factor of retaining walls. Engineering with Computers, 35(2), 647–658.
Ghasemi, E. (2017). Particle swarm optimization approach for forecasting backbreak induced by bench blasting. Neural Computing and Applications, 28(7), 1855–1862.
Ghasemi, E., Amnieh, H. B., & Bagherpour, R. (2016). Assessment of backbreak due to blasting operation in open pit mines: A case study. Environmental Earth Sciences, 75(7), 1–11.
Hasanipanah, M., Noorian-Bidgoli, M., Jahed Armaghani, D., & Khamesi, H. (2016). Feasibility of PSO-ANN model for predicting surface settlement caused by tunneling. Engineering with Computers, 32(4), 705–715.
Hasanipanah, M., Shahnazar, A., Arab, H., Golzar, S. B., & Amiri, M. (2017). Developing a new hybrid-AI model to predict blast-induced backbreak. Engineering with Computers, 33(3), 349–359.
Hof, P. R., & Van der Gucht, E. (2007). Structure of the cerebral cortex of the humpback whale, Megaptera novaeangliae (Cetacea, Mysticeti, Balaenopteridae). The Anatomical Record: Advances in Integrative Anatomy and Evolutionary Biology, 290(1), 1–31.
Huang, G. B., Zhu, Q. Y., & Siew, C. K. (2006). Extreme learning machine: Theory and applications. Neurocomputing, 70(1–3), 489–501.
Huo, X., Shi, X., Qiu, X., Zhou, J., Gou, Y., Yu, Z., & Ke, W. (2020). Rock damage control for large-diameter-hole lateral blasting excavation based on charge structure optimization. Tunnelling and Underground Space Technology, 106, 103569.
Iverson, S. R., Hustrulid, W. A., Johnson, J. C., Tesarik, D., & Akbarzadeh, Y. (2009, September). The extent of blast damage from a fully coupled explosive charge. In Proceedings of the 9th international symposium on rock fragmentation by blasting, Fragblast (Vol. 9, pp. 459–68).
Jia, Z., Chen, G., & Huang, S. (1998). Computer simulation of open pit bench blasting in jointed rock mass. International Journal of Rock Mechanics and Mining Sciences, 4(35), 476.
Jimeno, C. L., Jimeno, E. L., & Carcedo, F. J. A. (1995). Drilling and blasting of rocks. Balkema.
Kennedy, J., & Eberhart, R. (1995). Particle swarm optimization. In Proceedings of ICNN'95-international conference on neural networks (Vol. 4, pp. 1942-1948). IEEE.
Khandelwal, M., & Monjezi, M. (2013). Prediction of backbreak in open-pit blasting operations using the machine learning method. Rock Mechanics and Rock Engineering, 46(2), 389–396.
Khandelwal, M., Marto, A., Fatemi, S. A., Ghoroqi, M., Armaghani, D. J., Singh, T. N., & Tabrizi, O. (2018). Implementing an ANN model optimized by genetic algorithm for estimating cohesion of limestone samples. Engineering with Computers, 34(2), 307–317.
Khandelwal, M., Mahdiyar, A., Armaghani, D. J., Singh, T. N., Fahimifar, A., Faradonbeh, R. S. (2017). An expert system based on hybrid ICA-ANN technique to estimate macerals contents of Indian coals. Environmental Earth Sciences, 76(11), 399. https://doi.org/10.1007/s12665-017-6726-2
Khandelwal, M., Singh, T.N. (2011) Predicting elastic properties of schistose rocks from unconfined strength using intelligent approach. Arabian Journal of Geosciences, 4(3–4), 435–442. https://doi.org/10.1007/s12517-009-0093-6
Konya, C. J. (2003). Rock blasting and overbreak control. National Highway Institute, FHWA-HI-92–001.
Konya, C. J., & Walter, E. J. (1991). Rock blasting and overbreak control (No. FHWA-HI-92–001; NHI-13211). United States. Federal Highway Administration.
Kumar, S., Mishra, A. K., & Choudhary, B. S. (2021). Prediction of back break in blasting using random decision trees. Engineering with Computers, 1–7.
Li, C., Zhou, J., Armaghani, D. J., & Li, X. (2021a). Stability analysis of underground mine hard rock pillars via combination of finite difference methods, neural networks, and Monte Carlo simulation techniques. Underground Space, 6(4), 379–395.
Li, C., Zhou, J., Armaghani, D. J., Cao, W., & Yagiz, S. (2021b). Stochastic assessment of hard rock pillar stability based on the geological strength index system. Geomechanics and Geophysics for Geo-Energy and Geo-Resources, 7(2), 1–24.
Li, E., Yang, F., Ren, M., Zhang, X., Zhou, J., & Khandelwal, M. (2021c). Prediction of blasting mean fragment size using support vector regression combined with five optimization algorithms. Journal of Rock Mechanics and Geotechnical Engineering, 13(6), 1380–1397.
Liker, A., & Barta, Z. (2002). The effects of dominance on social foraging tactic use in house sparrows. Behaviour, 139, 1061–1076.
Liu, S. J., Yang, Y., & Zhou, Y. Q. (2018). A kind of Swarm Intelligence Algorithm—Lion Group Algorithm. Pattern Recognit AI, 31(05), 431–441.
Lundborg, N. (1974). The hazards of fly rock in rock blasting. Report DS1974, Swedish Detonic Research Foundation (SveDeFo), Stockholm, p. 12.
Mirjalili, S., & Lewis, A. (2016). The whale optimization algorithm. Advances in Engineering Software, 95, 51–67.
Moayedi, H., & Jahed Armaghani, D. (2018). Optimizing an ANN model with ICA for estimating bearing capacity of driven pile in cohesionless soil. Engineering with Computers, 34(2), 347–356.
Mohammadnejad, M., Gholami, R., Sereshki, F., & Jamshidi, A. (2013). A new methodology to predict backbreak in blasting operation. International Journal of Rock Mechanics and Mining Sciences, 60, 75–81.
Monjezi, M., & Dehghani, H. (2008). Evaluation of effect of blasting pattern parameters on back break using neural networks. International Journal of Rock Mechanics and Mining Sciences, 45(8), 1446–1453.
Monjezi, M., Ahmadi, Z., Varjani, A. Y., & Khandelwal, M. (2013). Backbreak prediction in the Chadormalu iron mine using artificial neural network. Neural Computing and Applications, 23(3), 1101–1107.
Monjezi, M., Bahrami, A., & Varjani, A. Y. (2010a). Simultaneous prediction of fragmentation and flyrock in blasting operation using artificial neural networks. International Journal of Rock Mechanics and Mining Sciences, 3(47), 476–480.
Monjezi, M., Hashemi Rizi, S. M., Majd, V. J., & Khandelwal, M. (2014). Artificial neural network as a tool for backbreak prediction. Geotechnical and Geological Engineering, 32(1), 21–30.
Monjezi, M., Rezaei, M., & Yazdian, A. (2010b). Prediction of backbreak in open-pit blasting using fuzzy set theory. Expert Systems with Applications, 37(3), 2637–2643.
Nguyen, H., Bui, X.-N., Bui, H.-B., & Mai, N.-L. (2018). A comparative study of artificial neural networks in predicting blast-induced air-blast overpressure at Deo Nai open-pit coal mine, Vietnam. Neural Computing and Applications, 32(8), 3939–3955.
Nguyen, H., & Bui, X.-N. (2020). Soft computing models for predicting blast-induced air over-pressure: A novel artificial intelligence approach. Applied Soft Computing, 92, 106292.
Nguyen, H., Bui, X.-N., Tran, Q.-H., Nguyen, D.-A., Hoa, L. T. T., Le, Q.-T., & Giang, L. T. H. (2021). Predicting Blast-Induced Ground Vibration in Open-Pit Mines Using Different Nature-Inspired Optimization Algorithms and Deep Neural Network. Natural Resources Research, 30(6), 4695–4717.
Nguyen, H., Drebenstedt, C., Bui, X. N., & Bui, D. T. (2020). Prediction of blast-induced ground vibration in an open-pit mine by a novel hybrid model based on clustering and artificial neural network. Natural Resources Research, 29(2), 691–709.
Pan, W.-T. (2012). A new fruit fly optimization algorithm: Taking the financial distress model as an example. Knowledge-Based Systems, 26, 69–74.
Pianosi, F., & Wagener, T. (2015). A simple and efficient method for global sensitivity analysis based on cumulative distribution functions. Environmental Modelling & Software, 67, 1–11.
Pianosi, F., & Wagener, T. (2018). Distribution-based sensitivity analysis from a generic input-output sample. Environmental Modelling & Software, 108, 197–207.
Pulliam, H. R. (1973). On the advantages of flocking. Journal of Theoretical Biology, 38, 419–422.
Qi, C., Fourie, A., Ma, G., Tang, X., & Du, X. (2018). Comparative study of hybrid artificial intelligence approaches for predicting hangingwall stability. Journal of Computing in Civil Engineering, 32(2), 04017086.
Roth, J. A. (1979). A model for the determination of flyrock range as a function of shot condition. US Bureau of Mines Contract J0387242, Management Science Associates, p. 61.
Ramesh Murlidhar, B., Yazdani Bejarbaneh, B., Jahed Armaghani, D., Mohammed, A. S., & Tonnizam Mohamad, E. (2021). Application of tree-based predictive models to forecast air overpressure induced by mine blasting. Natural Resources Research, 30(2), 1865–1887.
Saghatforoush, A., Monjezi, M., Faradonbeh, R. S., & Armaghani, D. J. (2016). Combination of neural network and ant colony optimization algorithms for prediction and optimization of flyrock and back-break induced by blasting. Engineering with Computers, 32(2), 255–266.
Sari, M., Ghasemi, E., & Ataei, M. (2014). Stochastic modeling approach for the evaluation of backbreak due to blasting operations in open pit mines. Rock Mechanics and Rock Engineering, 47(2), 771–783.
Sayadi, A., Monjezi, M., Talebi, N., & Khandelwal, M. (2013). A comparative study on the application of various artificial neural networks to simultaneous prediction of rock fragmentation and backbreak. Journal of Rock Mechanics and Geotechnical Engineering, 5(4), 318–324.
Shariati, M., Mafipour, M. S., Ghahremani, B., Azarhomayun, F., Ahmadi, M., Trung, N. T., & Shariati, A. (2020). A novel hybrid extreme learning machine–grey wolf optimizer (ELM-GWO) model to predict compressive strength of concrete with partial replacements for cement. Engineering with Computers, 1–23.
Shariati, M., Trung, N. T., Wakil, K., Mehrabi, P., Safa, M., & Khorami, M. (2019). Estimation of moment and rotation of steel rack connections using extreme learning machine. Steel and Composite Structures, 31(5), 427–435.
Sharma, M., Choudhary, B. S., & Agrawal, H. (2021). Prediction and assessment of back break by multivariate regression analysis, and random forest algorithm in hot strata/fiery seam of open-pit coal mine.
Shirani Faradonbeh, R., Monjezi, M., & Jahed Armaghani, D. (2016). Genetic programing and non-linear multiple regression techniques to predict backbreak in blasting operation. Engineering with Computers, 32(1), 123–133.
Trivedi, R., Singh, T. N., & Raina, A. K. (2014). Prediction of blast-induced flyrock in Indian limestone mines using neural networks. Journal of Rock Mechanics and Geotechnical Engineering, 6(5), 447–454.
Uyar, G. G., & Aksoy, C. O. (2019). Comparative review and interpretation of the conventional and new methods in blast vibration analyses. Geomechanics and Engineering, 18(5), 545–554.
Wang, M., Shi, X., & Zhou, J. (2018a). Charge design scheme optimization for ring blasting based on the developed Scaled Heelan model. International Journal of Rock Mechanics and Mining Sciences, 110, 199–209.
Wang, M., Shi, X., Zhou, J., & Qiu, X. (2018b). Multi-planar detection optimization algorithm for the interval charging structure of large-diameter longhole blasting design based on rock fragmentation aspects. Engineering Optimization, 50(12), 2177–2191.
Wang, X., Tang, Z., Tamura, H., Ishii, M., & Sun, W. D. (2004). An improved backpropagation algorithm to avoid the local minima problem. Neurocomputing, 56, 455–460.
Wang, S. M., Zhou, J., Li, C. Q., Armaghani, D. J., Li, X. B., & Mitri, H. S. (2021). Rockburst prediction in hard rock mines developing bagging and boosting tree-based ensemble techniques. Journal of Central South University, 28(2), 527–542.
Wilson, J. M., & Moxon, N. T. (1988). The development of low energy ammonium nitrate based explosives. Proceedings of the Australasian Institute of Mining and Metallurgy, Melbourne, Australia, 27–32.
Xie, C., Nguyen, H., Bui, X. N., Choi, Y., Zhou, J., & Nguyen-Trang, T. (2021a). Predicting rock size distribution in mine blasting using various novel soft computing models based on meta-heuristics and machine learning algorithms. Geoscience Frontiers, 12(3), 101108.
Xie, C., Nguyen, H., Bui, X. N., Nguyen, V. T., & Zhou, J. (2021b). Predicting roof displacement of roadways in underground coal mines using adaptive neuro-fuzzy inference system optimized by various physics-based optimization algorithms. Journal of Rock Mechanics and Geotechnical Engineering, 13(6), 1452–1465.
Xue, J., & Shen, B. (2020). A novel swarm intelligence optimization approach: Sparrow search algorithm. Systems Science & Control Engineering, 8(1), 22–34.
Xue, Y., Wu, Y. P., Miao, F. S., & Li, L. W. (2021). Back analysis of shear strength parameters of sliding surface by using combination method of random field and Bayes theory. Journal of ZheJiang University (Engineering Science), 55(6), 1118–1127.
Ye, J., Koopialipoor, M., Zhou, J., Armaghani, D. J., & He, X. (2021). A novel combination of tree-based modeling and Monte Carlo simulation for assessing risk levels of flyrock induced by mine blasting. Natural Resources Research, 30(1), 225–243.
Yong, W., Zhang, W., Nguyen, H., Bui, X.-N., Choi, Y., Nguyen-Thoi, T., Zhou, J., & Tran, T. T. (2022). Analysis and prediction of diaphragm wall deflection induced by deep braced excavations using finite element method and artificial neural network optimized by metaheuristic algorithms. Reliability Engineering & System Safety, 221, 108335.
Yu, Q., Monjezi, M., Mohammed, A. S., Dehghani, H., Armaghani, D. J., & Ulrikh, D. V. (2021). Optimized support vector machines combined with evolutionary random forest for prediction of back-break caused by blasting operation. Sustainability, 13(22), 12797.
Zhang, X., Nguyen, H., Choi, Y., Bui, X.-N., & Zhou, J. (2021a). Novel Extreme Learning Machine-Multi-Verse Optimization Model for Predicting Peak Particle Velocity Induced by Mine Blasting. Natural Resources Research, 30(6), 4735–4751.
Zhou, J., Asteris, P. G., Armaghani, D. J., & Pham, B. T. (2020a). Prediction of ground vibration induced by blasting operations through the use of the Bayesian Network and random forest models. Soil Dynamics and Earthquake Engineering, 139, 106390.
Zhou, J., Bejarbaneh, B. Y., Armaghani, D. J., & Tahir, M. M. (2020b). Forecasting of TBM advance rate in hard rock condition based on artificial neural network and genetic programming techniques. Bulletin of Engineering Geology and the Environment, 79, 2069–2084.
Zhou, J., Dai, Y., Khandelwal, M., Monjezi, M., Yu, Z., & Qiu, Y. (2021a). Performance of hybrid SCA-RF and HHO-RF models for predicting backbreak in open-pit mine blasting operations. Natural Resources Research, 30(6), 4753–4771.
Zhou, J., Li, C., Arslan, C. A., Hasanipanah, M., & Bakhshandeh Amnieh, H. (2021b). Performance evaluation of hybrid FFA-ANFIS and GA-ANFIS models to predict particle size distribution of a muck-pile after blasting. Engineering with Computers, 37(1), 265–274.
Zhou, J., Li, E., Yang, S., Wang, M., Shi, X., Yao, S., & Mitri, H. S. (2019). Slope stability prediction for circular mode failure using gradient boosting machine approach based on an updated database of case histories. Safety Science, 118, 505–518.
Zhou, J., Li, X., & Mitri, H. S. (2016). Classification of rockburst in underground projects: Comparison of ten supervised learning methods. Journal of Computing in Civil Engineering, 30(5), 04016003.
Zhou, J., Li, X., & Shi, X. (2012). Long-term prediction model of rockburst in underground openings using heuristic algorithms and support vector machines. Safety Science, 50(4), 629–644.
Zhou, J., Qiu, Y., Zhu, S., Armaghani, D. J., Li, C., Nguyen, H., & Yagiz, S. (2021c). Optimization of support vector machine through the use of metaheuristic algorithms in forecasting TBM advance rate. Engineering Applications of Artificial Intelligence, 97, 104015.
Zhou, J., Huang, S., & Qiu, Y. (2022a). Optimization of random forest through the use of MVO, GWO and MFO in evaluating the stability of underground entry-type excavations. Tunnelling and Underground Space Technology, 124, 104494.
Zhou, J., Huang, S., Zhou, T., Armaghani, D. J., & Qiu, Y. (2022b). Employing a genetic algorithm and grey wolf optimizer for optimizing RF models to evaluate soil liquefaction potential. Artificial Intelligence Review. https://doi.org/10.1007/s10462-022-10140-5
Zhou, J., Zhu, S., Qiu, Y., Armaghani, D. J., Zhou, A., & Yong, W. (2022c). Predicting tunnel squeezing using support vector machine optimized by whale optimization algorithm. Acta Geotechnica, 17(4), 1343–1366.
Acknowledgments
This research is partially supported by the National Natural Science Foundation Project of China (42177164) and the Innovation-Driven Project of Central South University (2020CX040). The first author was funded by China Scholarship Council (Grant No. 202106370038).
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions.
Ethics declarations
Conflict of Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Li, C., Zhou, J., Khandelwal, M. et al. Six Novel Hybrid Extreme Learning Machine–Swarm Intelligence Optimization (ELM–SIO) Models for Predicting Backbreak in Open-Pit Blasting. Nat Resour Res 31, 3017–3039 (2022). https://doi.org/10.1007/s11053-022-10082-3