Abstract
The sine cosine algorithm (SCA) is widely recognized for its efficacy in solving optimization problems, although it encounters challenges in striking a balance between exploration and exploitation. To address these limitations, a novel model, termed the novel sine cosine algorithm (nSCA), is introduced. In this advanced model, the roulette wheel selection (RWS) mechanism and opposition-based learning (OBL) techniques are integrated to augment its global optimization capabilities. A meticulous evaluation of nSCA's performance has been carried out in comparison with state-of-the-art optimization algorithms, including the multi-verse optimizer (MVO), salp swarm algorithm (SSA), moth-flame optimization (MFO), grasshopper optimization algorithm (GOA), and whale optimization algorithm (WOA), in addition to the original SCA. This comparative analysis was conducted across a wide array of 23 classical test functions and 29 CEC2017 benchmark functions, thereby facilitating a comprehensive assessment. Further validation of nSCA's utility has been achieved through its deployment in five distinct engineering optimization case studies, emphasizing its effectiveness and relevance in addressing real-world optimization issues. Across all conducted tests and practical applications, nSCA was found to consistently outperform its competitors, furnishing more effective solutions to both theoretical and applied optimization problems.
1 Introduction
1.1 Evolutionary Algorithm
In recent years, there has been a growing scholarly emphasis on the exploration of nature-inspired optimization algorithms, primarily due to their remarkable capabilities in addressing complex optimization challenges. Within this context, the work of Mirjalili, Mirjalili [1] introduced the MVO algorithm, which draws inspiration from cosmological concepts. Their study not only demonstrates its competitive performance across benchmark assessments but also within real-world engineering scenarios, highlighting its potential to tackle complex challenges characterized by intricate search spaces. Similarly, Mirjalili [2] presented SCA, showcasing its effectiveness through rigorous benchmark testing and the optimization of an aircraft wing’s cross-section. The study emphasizes its promise in resolving intricate real-world problems, particularly those constrained by both the complexity and the obscurity of their search domains.
It is noteworthy that the landscape of optimization algorithms includes the differential evolution (DE) method developed by Storn and Price [3], renowned for its simplicity and effectiveness in global optimization. Building on this trajectory, Mirjalili [4] introduced the MFO, inspired by the transverse orientation behavior of moths. The investigation demonstrates the algorithm’s competitiveness through comprehensive benchmark tests and applications in real-world engineering domains. Notably, Mirjalili and Lewis [5] pioneered the WOA, drawing inspiration from the intricate social behavior of humpback whales. Their work resonates with the algorithm’s competitive prowess, illustrated through exhaustive assessments of mathematical optimization landscapes and the intricacies of structural design problems.
Adding further to this diverse spectrum, Saremi, Mirjalili [6] proposed GOA, deriving insights from the collective behavior of grasshopper swarms. Their study provides compelling evidence of its efficacy in solving optimization challenges, supported by rigorous benchmarking exercises and practical applications to intricate structural optimization scenarios. Finally, the work of Mirjalili, Gandomi [7] unveiled SSA, inspired by the cooperative swarming behavior of salps. Their comprehensive exploration demonstrates the algorithm’s effectiveness in both single and multi-objective optimization landscapes, validated through mathematical function evaluations and real-world engineering design complexities.
In evolutionary algorithms reliant on population-based methods, the optimization process is commonly divided into two critical phases, irrespective of the algorithm’s specific characteristics [8, 9]. The initial phase, often referred to as exploration, is designed to scan the search landscape and identify high-potential regions. In this phase, significant shifts in directions are made, potentially leading to notable results. The subsequent phase, known as exploitation, focuses on refining the existing choices based on the data that has been gathered during the exploration phase. These data are employed to facilitate the algorithm’s convergence. Achieving a judicious balance between exploration and exploitation is considered essential for the effective accomplishment of comprehensive optimization by the algorithm.
The ongoing advancements in algorithmic design and optimization have captured significant scholarly attention [10]. This focus is substantiated by the commonly held belief that no single algorithm can universally address diverse optimization challenges. Consequently, a strong motivation has been observed among researchers to either augment existing methodologies or develop innovative algorithms capable of competing effectively with established solutions. In the specific area of multi-facility production scheduling, Pham, Trang [11] introduced an integration of the gray wolf optimizer (GWO) and the dragonfly algorithm (DA) to enhance optimization processes. In a similar vein, Son and Nguyen Dang [12] proposed an MVO model aimed at simultaneous time and cost optimization in small-scale scenarios. In the realm of environmental impact, Qiao, Lu [13] unveiled a hybrid algorithm that merges the lion swarm optimizer with a genetic algorithm (GA). The algorithm was found to improve both the stability and accuracy of carbon dioxide emissions forecasts, outperforming existing models. Regarding structural optimization, a study by Altay, Cetindemir [14] evaluated the SSA and introduced a modified version, termed modified SSA (MSSA), for optimizing truss system structures. The study found that, unlike SSA, MSSA effectively addresses convergence issues and proves especially effective for discrete problems. In the domain of construction, Pham and Soulisa [15] proposed a hybrid ant-lion optimizer (ALO) algorithm. This algorithm demonstrated improved capabilities for site layout planning by combining optimization techniques with heuristic methods. Meanwhile, Goksal, Karaoglan [16] introduced a heuristic solution for the vehicle routing problem, an NP-hard problem, by utilizing a particle swarm optimization (PSO) algorithm enhanced with variable neighborhood descent (VND) for local searches.
Furthermore, Son, Duy [17] introduced a novel optimization algorithm that merges the DA and PSO to control construction material costs effectively.
1.2 Sine Cosine Algorithm
Since its inception in 2016, the SCA has garnered significant attention as a potential optimization technique. Its applications span diverse fields, addressing an array of complex issues. For example, in the realm of engineering, Shang, Zhou [18] unveiled a modified SCA to expedite convergence speed and promote population diversity. This modification involved redefining the position update formula and incorporating a Levy random walk mutation strategy for solving intricate engineering design problems. In the field of electrical networks, Raut and Mishra [19] introduced an SCA variant specifically tailored for the power distribution network reconfiguration (PDNR) problem. The algorithm aimed to minimize power loss as its sole objective. In a similar vein, Reddy, Panwar [20] presented a binary SCA aimed at optimizing the profit-based unit commitment (PBUC) problem in competitive electricity markets, demonstrating enhanced solution quality and convergence rates compared to existing methods. Within the sphere of bioinformatics and environmental science, Sahlol, Ewees [21] employed an SCA-optimized neural network model to enhance the prediction accuracy of oxidative stress biomarkers in fish liver tissue. Specifically, the model demonstrated improved performance when assessing the impact of varying selenium nanoparticle concentrations. For community detection and system modelling, Zhao, Zou [22] presented a discrete SCA tailored for community detection in complex networks. The algorithm showed superior effectiveness compared to existing methods like FM, BGLL, and GA on real-world network data. Aydin, Gozde [23] utilized both WOA and SCA for estimating critical parameters in photovoltaic (PV) cell models, targeting improved accuracy in system analysis and electrical generation efficiency.
Given the diverse nature of optimization problems, it is widely acknowledged that there is no universally applicable optimization algorithm competent in addressing diverse optimization problems [10]. As a result, there have been numerous investigations aimed at improving the effectiveness of the SCA. For instance, Cheng and Duan [24] proposed a hybrid version that combines SCA and the cloud model to handle benchmark test functions with different dimensions. Bureerat and Pholdee [25] developed a hybrid model that combines SCA and DE for detecting structural damage. Turgut [26] proposed a model that integrates the SCA with the backtracking search algorithm to effectively address multi-objective problems in heat exchanger design. Bairathi and Gopalani [27] improved SCA by integrating the opposition-based mechanism to train multi-layer neural networks. Qu, Zeng [28] introduced an upgraded version of the SCA by incorporating a neighborhood search technique and a greedy Levy mutation. Son and Nguyen Dang [29] proposed a hybrid SCA model to simultaneously optimize time and cost in large-scale projects. Finally, Pham and Nguyen [30] proposed an integrated SCA version with tournament selection, OBL, and mutation and crossover methods to handle cement transport routing.
1.3 The Motivation of this Study
Since its introduction, the SCA has witnessed growing popularity across various scientific disciplines, a trend primarily attributed to its straightforward methodology. However, the algorithm has been criticized for its tendency toward premature convergence, a drawback often ascribed to an inadequately defined exploitation strategy within its search landscape [31]. As a result, academic interest has been piqued in the development of enhanced versions of the SCA framework, viewed as potential solutions for overcoming the intricate challenges frequently encountered in optimization tasks.
Numerous efforts have been undertaken to enhance the efficacy of the SCA, encompassing a range of strategies including its fusion with OBL [27], its integration with tournament selection [30], incorporation of the Levy flight approach [18, 28], and hybridizations with other algorithmic paradigms [25, 26, 28]. However, the integration of both the RWS and OBL methodologies to achieve a harmonious balance between the exploration and exploitation phases remains an underexplored area. This study addresses that gap by unifying the RWS and OBL techniques within the SCA framework, with the aim of improving its global optimization performance and providing a streamlined, efficient tool for tackling optimization challenges.
In the following section, the formulation of the nSCA is detailed. Section 3 is devoted to an exhaustive evaluation of the algorithm’s convergence properties, including an analysis of its performance metrics and behavioral patterns. Section 4 provides an empirical substantiation of the model’s efficacy, achieved through its application in five real-world optimization case studies. Finally, the key findings of the research are summarized in Sect. 5, where potential avenues for future academic inquiry are also delineated.
2 Novel Version of Sine Cosine Algorithm
2.1 Roulette Wheel Selection (RWS)
The RWS mechanism is extensively employed across various optimization algorithms, including cuckoo search (CS), PSO, DE, GA, and ant colony optimization (ACO), marking its prominence as a commonly adopted technique in optimization disciplines. Pandey, Kulhari [32] introduced a roulette wheel-based cuckoo search clustering method for sentiment analysis. This method was found to outperform existing clustering methods like K-means and GWO in terms of mean accuracy, precision, and recall across nine sentimental datasets. Zhu, Yang [33] introduced a ranking weight-based RWS method to enhance the performance of comprehensive learning PSO. Experimental results indicate that this method surpasses other selection techniques in overall optimization efficiency. Yu, Fu [34] presented an improved RWS method designed for GA, targeting the traveling salesman problem. The method showed enhanced result precision and faster convergence rates. Ho-Huu, Nguyen-Thoi [35] introduced ReDE, a variant of the DE algorithm enhanced with RWS and elitist techniques. This variant was aimed at optimizing truss structures with frequency constraints, and numerical results suggest it outperforms several existing optimization methods. Lloyd and Amos [36] conducted the first comprehensive analysis of Independent Roulette (I-Roulette), an alternative to standard RWS in parallel ACO. The study revealed its capability for dynamic adaptation and faster convergence, especially when implemented on high-performance parallel architectures like GPUs.
2.2 Opposition-Based Learning (OBL)
The OBL technique has garnered significant attention for its wide-ranging applicability and effectiveness in various optimization applications. Originally introduced by Tizhoosh [37] in 2005, OBL serves as a novel framework for computational intelligence, creating complementary solutions to existing ones. Subsequent work has extended the utility of OBL in different computational algorithms, thereby yielding promising results in terms of faster convergence and improved performance. For example, Verma, Aggarwal [38] proposed a modified firefly algorithm that incorporates OBL. This innovation not only enhances initial candidate solutions but also employs a dimension-based approach for updating the positions of individual fireflies. Experimental results confirmed faster convergence and superior performance in high-dimensional problems when compared to existing evolutionary algorithms. Similarly, Upadhyay, Kar [39] presented an opposition-based harmony search algorithm aimed at optimizing adaptive infinite impulse response system identification. They reported faster convergence rates and superior mean square error fitness values when compared to traditional optimization methods such as GA, PSO, and DE. In the realm of project management, Luong, Tran [40] introduced a novel algorithm termed opposition-based multiple objective differential evolution. This algorithm employs opposition numbers to address the time–cost-quality trade-off in construction projects, thereby improving both exploration and convergence rates. Wang, Wu [41] proposed an enhanced PSO algorithm named GOPSO, which incorporates generalized opposition-based learning along with Cauchy mutation. This approach was specifically designed to mitigate the problem of premature convergence in complex optimization scenarios. Ewees, Abd Elaziz [42] introduced OBLGOA, an enhanced GOA that incorporates OBL at two distinct stages. This implementation was shown to improve solution quality and reduce time complexity. 
The algorithm outperformed ten well-known optimization algorithms across twenty-three benchmark functions and four engineering problems. In summary, OBL has been effectively integrated into a variety of optimization algorithms, consistently offering advantages in terms of speed and performance.
2.3 Novel Version of SCA (nSCA)
In the nSCA algorithm, the location of each solution is specified by an array of variables. These arrays collectively constitute sets of solutions, which are systematically organized in a matrix format, as described in Eq. (1). Similarly, the sets of opposite solutions generated during the exploration stage are also presented in a matrix layout, as delineated in Eq. (2). These matrix-based representations facilitate the management and assessment of solutions within the algorithm, thereby enabling more effective exploration and optimization of the search landscape.
In the initial population generation phase, the OBL method is utilized to create opposite solutions, as illustrated in Fig. 1. The specific process for incorporating OBL within nSCA is outlined in the accompanying pseudocode presented in Table 1. Subsequently, a fitness function evaluates both the randomly generated solutions and their oppositional counterparts. This evaluation identifies superior and inferior solutions. The algorithm retains the more performant solutions while discarding the less effective ones, thereby ensuring a consistent population size throughout the optimization process.
The opposite solution \({s}^{*}\) of a solution \(s\in [{b}_{l},{b}_{u}]\) is defined as follows:

$${s}^{*}={b}_{l}+{b}_{u}-s$$
(3)

where bl and bu denote the lower and upper bounds of solution s, respectively.
Given a solution S characterized by d parameters, where each parameter is constrained within \([{b}_{l,j},{b}_{u,j}]\), an opposite solution \({S}^{*}=({s}_{1}^{*},{s}_{2}^{*},{s}_{3}^{*},\dots ,{s}_{d}^{*})\) is defined as follows:

$${s}_{j}^{*}={b}_{l,j}+{b}_{u,j}-{s}_{j},\quad j=1,2,\dots ,d$$
(4)

where bl,j and bu,j denote the lower and upper limits of the jth dimension, respectively.
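The opposition rule and the OBL-based population initialization described above can be sketched in a few lines of Python (an illustrative sketch, assuming minimization; function names are ours, not the paper's):

```python
import random

def opposite_solution(s, lb, ub):
    """Opposition-based learning: mirror each parameter across the
    midpoint of its bounds, s*_j = b_l,j + b_u,j - s_j."""
    return [lo + hi - x for x, lo, hi in zip(s, lb, ub)]

def obl_init(pop_size, lb, ub, fitness):
    """Generate a random population plus its opposites, then keep the
    pop_size fittest individuals so the population size is preserved."""
    pop = [[random.uniform(lo, hi) for lo, hi in zip(lb, ub)]
           for _ in range(pop_size)]
    pop += [opposite_solution(s, lb, ub) for s in pop]
    pop.sort(key=fitness)          # minimization: smaller is better
    return pop[:pop_size]
```

Evaluating both the random solutions and their opposites doubles the chance of starting near a promising region at the cost of one extra fitness evaluation per individual.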
Upon refreshing the solution set during the initial population creation phase, the solutions undergo sorting to identify the current best-performing candidate. Subsequently, each solution's normalized fitness score is computed. This computation is integral to the functioning of the RWS mechanism, as depicted in Fig. 2. The formula for calculating the normalized fitness score is articulated in Eq. (5), while the mathematical representation of the RWS mechanism is provided in Eq. (6). These computational processes and mechanisms are pivotal in guiding the algorithm's solution selection and subsequent exploratory activities.
In Eqs. (5) and (6), NF(Si) and F(Si) denote the normalized fitness value and the fitness value of the ith solution, Si, respectively. The notation \({s}_{i}^{j}\) represents the jth parameter of the ith solution, while \({s}_{1}^{j}\) refers to the jth parameter of the current best-performing solution. The variable σ2 is a random number that falls within the range of 0 to 1.
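A minimal sketch of the RWS step follows (assuming the normalized fitness scores NF(Si) are non-negative and larger scores mean more desirable solutions; σ2 plays the role of the spin):

```python
import random

def roulette_wheel_select(scores):
    """Pick an index with probability proportional to its normalized
    fitness score: spin sigma2 ~ U(0, 1), scale it to the wheel, and
    walk the cumulative distribution until the spin lands."""
    total = sum(scores)
    spin = random.random() * total      # sigma2 scaled to the wheel
    cumulative = 0.0
    for i, score in enumerate(scores):
        cumulative += score
        if spin <= cumulative:
            return i
    return len(scores) - 1              # guard against float rounding
```

Fitter solutions occupy wider slices of the wheel, so they are selected more often while weaker solutions still retain a nonzero chance, which preserves diversity during exploration.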
The partitioning of the optimization process into exploration and exploitation phases is a recurring theme in the existing literature, particularly in relation to population-based stochastic algorithms [8]. During the exploration phase, the optimization algorithm utilizes a higher degree of randomness to facilitate the combination of diverse solutions, swiftly identifying promising areas within the search space. In contrast, the exploitation phase concentrates on the refinement of existing solutions through incremental adjustments, exhibiting significantly reduced levels of stochastic variability relative to the exploration stage. Within the SCA framework, specific mathematical expressions, represented by Eq. (7), govern the updating of agent positions in both exploration and exploitation stages. These equations are pivotal as they guide the search mechanism of the SCA, thereby enabling efficient exploration and targeted exploitation of the search landscape.
$${s}_{j}^{t+1}=\left\{\begin{array}{ll}{s}_{j}^{t}+{\sigma }_{1}\times \mathrm{sin}\left({\sigma }_{4}\right)\times \left|{\sigma }_{5}{D}_{j}^{t}-{s}_{j}^{t}\right|,& {\sigma }_{3}<0.5\\ {s}_{j}^{t}+{\sigma }_{1}\times \mathrm{cos}\left({\sigma }_{4}\right)\times \left|{\sigma }_{5}{D}_{j}^{t}-{s}_{j}^{t}\right|,& {\sigma }_{3}\ge 0.5\end{array}\right.$$
(7)

where \({s}_{j}^{t}\) represents the position of the solution in the jth dimension at the tth iteration; σ1 defines the direction and amplitude of movement; σ3 is a uniformly distributed random variable ranging between 0 and 1 that switches between the sine and cosine components; σ4 serves as a stochastic variable that regulates the extent of movement toward or away from the target, while σ5 acts as a randomly determined weight for the destination; the position of the target solution in the jth dimension is denoted by \({D}_{j}^{t}\), and the absolute value is symbolized by \(|\cdot |\).
Figure 3 presents a detailed model to elucidate the efficacy of sine and cosine functions within the interval [− 2, 2]. These trigonometric functions serve as versatile tools for navigational purposes, either by confining movement within the ranges defined by them or by facilitating extensions beyond these boundaries. Such flexibility is conducive to steering toward the desired objectives effectively. Importantly, the figure delineates the dynamic ranges of the sine and cosine functions, which play a crucial role in updating the positions of potential solutions. Furthermore, Eq. (7) introduces a stochastic variable, denoted as σ4, with a range between 0 and 2π. The inclusion of this stochastic element imbues the algorithm with a degree of randomness, thereby enhancing its exploratory capabilities. This feature allows for a more thorough evaluation of potential solutions within the given search landscape.
During each iteration cycle, the range of the sine and cosine functions, as outlined in Eq. (7), is adaptively modified to achieve a balanced trade-off between exploration and exploitation, as further illustrated in Fig. 4. This dynamic adjustment is specifically engineered to identify promising regions within the search space, thus facilitating more efficient discovery of the optimal solution. The modification process is set forth in Eq. (8):

$${\sigma }_{1}=v-{I}_{cur}\frac{v}{{I}_{max}}$$
(8)

where the constant v is designated a value of 2, Icur symbolizes the current iteration count, and Imax represents the maximum number of iterations permitted.
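The position update of Eq. (7) together with the shrinking amplitude of Eq. (8) can be sketched as follows (a sketch, assuming σ5 is drawn from [0, 2] as in the original SCA; names are illustrative):

```python
import math
import random

def amplitude(i_cur, i_max, v=2.0):
    """Eq. (8): linearly shrink sigma1 from v down to 0 over the run,
    shifting the algorithm from exploration toward exploitation."""
    return v - i_cur * (v / i_max)

def sca_update(s, dest, sigma1):
    """One Eq. (7)-style update: move each coordinate of solution s
    toward/around the destination dest, switching between sine and
    cosine via sigma3."""
    new = []
    for sj, dj in zip(s, dest):
        sigma3 = random.random()                 # sine/cosine switch
        sigma4 = random.uniform(0, 2 * math.pi)  # extent of movement
        sigma5 = random.uniform(0, 2)            # destination weight (assumed range)
        wave = math.sin(sigma4) if sigma3 < 0.5 else math.cos(sigma4)
        new.append(sj + sigma1 * wave * abs(sigma5 * dj - sj))
    return new
```

As sigma1 decays to zero, the step sizes shrink and each agent oscillates ever more tightly around the destination, which is precisely the exploration-to-exploitation transition the text describes.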
In the exploitation stage, as detailed in the pseudocode for nSCA presented in Table 1, solution updates are carried out in accordance with Eq. (7). Following these updates, a jumping condition, denoted as JC in Eq. (9), is activated to dynamically generate an opposite solution in accordance with Eq. (10). It is noteworthy that this approach deviates from the methodology employed in the initial phase of population generation. Subsequent to the generation of opposite solutions, the objective function is applied to both the original solutions and the newly formed opposite solutions. The superior solution is retained, while the inferior one is eliminated. This process ensures that the population size remains constant, as mandated by Eq. (11).
where Si represents the ith solution while \({S}_{i}^{*}\) represents the opposite solution of the ith solution created by OBL; σ6 is a uniformly distributed random variable between 0 and 1.
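The exploitation-stage jump can be sketched as below. The exact form of the jumping condition JC in Eq. (9) is given in the paper; here we simply assume the jump fires when a uniform random number σ6 falls below a fixed jumping rate, and minimization is assumed:

```python
import random

def opposite_solution(s, lb, ub):
    """Mirror a solution across the midpoint of its bounds (OBL)."""
    return [lo + hi - x for x, lo, hi in zip(s, lb, ub)]

def obl_jump(pop, lb, ub, fitness, jc=0.3):
    """If sigma6 ~ U(0,1) falls below the jumping rate jc, pit each
    solution against its opposite and greedily keep the fitter one,
    so the population size stays constant (Eq. 11-style selection)."""
    if random.random() >= jc:       # sigma6: no jump this iteration
        return pop
    kept = []
    for s in pop:
        s_opp = opposite_solution(s, lb, ub)
        kept.append(min(s, s_opp, key=fitness))
    return kept
```

The greedy head-to-head comparison guarantees the population never degrades: an opposite solution replaces its original only when it is strictly fitter.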
3 Convergence Analysis
In the field of optimization, encompassing the application of evolutionary algorithms and metaheuristics, the validation of algorithmic effectiveness is critically dependent on the use of specialized test cases. This is particularly important given the inherently stochastic nature of these methodologies, where achieving optimal results requires the careful selection of a diverse and appropriate set of test functions. The aim of this section is to evaluate the performance of the nSCA algorithm, as substantiated through its application to 23 classical test functions, as well as the CEC2017 set. Each of these test functions has unique characteristics, designed to enable an in-depth assessment of the algorithm’s performance.
3.1 Convergence Analysis on Classical Benchmark Functions
The efficacy of the nSCA algorithm was rigorously assessed using an extensive set of 23 test functions [43,44,45]. These functions were grouped into three distinct categories, as outlined in Table 2: unimodal, multimodal, and fixed functions. The unimodal category consists of functions with a single global optimum and no local optimum, serving as a basis to evaluate the algorithm's capacity for rapid convergence and focused exploitation. In contrast, multimodal functions feature multiple local optima in addition to a global optimum, enabling a thorough assessment of the algorithm’s capability to navigate around local optima for effective exploration of the search space. Finally, the fixed category includes modified versions of both unimodal and multimodal functions, which are altered through operations such as rotation, shifting, and bias. These composite functions are designed to evaluate the algorithm's adaptability and performance in complex optimization landscapes.
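For concreteness, the sphere function (unimodal) and the Rastrigin function (multimodal) are typical members of this classical suite; minimal Python versions are shown below (illustrative only; Table 2 defines the exact set used):

```python
import math

def sphere(x):
    """Unimodal: a single global optimum at the origin with f(0) = 0,
    useful for probing convergence speed and exploitation."""
    return sum(v * v for v in x)

def rastrigin(x):
    """Multimodal: a regular lattice of local optima around the global
    optimum f(0) = 0, useful for probing escape from local optima."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v)
                             for v in x)
```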
To rigorously evaluate the performance capabilities of the nSCA algorithm in optimization tasks, an ensemble of 25 search agents was employed to locate the global optimum within a suite of 23 test functions. This experiment was conducted over a span of 300 iterations. The performance of nSCA was subsequently benchmarked against a selection of leading metaheuristic algorithms, including SSA, MVO, MFO, WOA, GOA, and the original SCA. Due to the stochastic components intrinsic to these algorithms, each was executed 30 times to ensure result reliability. Key statistical metrics, including average values (avg) and standard deviations (std), were calculated, and are presented in Tables 3, 4, 5 and 6. This comprehensive approach provides valuable insights into the comparative effectiveness of nSCA and other algorithms in optimization contexts.
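The per-algorithm statistics reported in the tables can be reproduced with a short helper (a sketch; `optimizer` stands for one complete run of any of the compared algorithms, returning its best fitness):

```python
import statistics

def summarize_runs(optimizer, runs=30):
    """Run a stochastic optimizer several independent times and report
    the mean (avg) and standard deviation (std) of its best fitness."""
    best = [optimizer() for _ in range(runs)]
    return statistics.mean(best), statistics.stdev(best)
```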
In the realm of unimodal test functions, as evidenced by the results in Table 3, nSCA holds a marked advantage over its competitors. Specifically, within the scope of unimodal optimization, nSCA’s exploitation capabilities surpass those of SCA, MFO, MVO, WOA, SSA, and GOA in most test functions. This data effectively emphasizes nSCA’s proficiency in handling unimodal optimization challenges. Regarding multimodal optimization, Table 4 provides data that confirm nSCA’s superior performance over SCA, MFO, MVO, WOA, SSA, and GOA in most test cases. This impressive showing reinforces nSCA’s capabilities in effectively navigating complex search spaces and avoiding local optima. Lastly, when examined in the context of fixed test functions, nSCA shows performance metrics that are on par with those of SCA, WOA, MVO, SSA, GOA, and MFO, as illustrated in Tables 5 and 6. These results lend further support to nSCA’s considerable versatility and competitive edge when compared to other state-of-the-art optimization algorithms.
Additional performance metrics such as the convergence curve, average solution fitness, trajectory of the first solution, and search history were scrutinized to provide a more nuanced assessment of nSCA’s effectiveness. The study employed a configuration of 300 iterations and 25 search agents to examine three representative test functions (f1, f9, and f21). Each of these functions represents a different category: unimodal, multimodal, and composite, as depicted in Fig. 5. Analysis of the convergence curve and average fitness reveals a consistent improvement in the quality of the search agents over successive iterations. This observation underscores nSCA’s capability to enhance the quality of initially randomized solutions in specific optimization tasks.
Analysis of the trajectory of the first solution underscores nSCA’s abilities in both convergence and local search optimization. This is supported by the notable fluctuations in average fitness levels during the exploration phase and the relatively stable metrics seen in the exploitation stage, as cited in reference [46]. Further, the search histories associated with functions f1, f9, and f21 substantiate nSCA’s aptitude for identifying and concentrating on high-potential regions within the search space. The incorporation of RWS and OBL mechanisms proves to be beneficial, facilitating initial exploration and contributing to the ultimate convergence of optimal solutions initially identified during the exploration phase.
Figures 6, 7 and 8 display the convergence patterns for the 23 test functions, obtained over 150 iterations employing 25 search agents. The findings suggest that more efficient convergence for the majority of the test functions analyzed is achieved by the nSCA in comparison to other algorithms such as the original SCA, MVO, MFO, SSA, GOA, and WOA.
3.2 CEC2017 Benchmark Test Functions
The CEC2017 test functions constitute a specialized set of benchmarks, introduced at the 2017 IEEE Congress on Evolutionary Computation (CEC), focusing on the optimization of real parameters. Building upon the groundwork established by previous benchmark suites, the CEC2017 collection is designed to present a diverse range of challenges to optimization algorithms. These functions are generally considered to provide more realistic problem scenarios in comparison to the traditional set of 23 benchmark functions.
Spanning both unimodal and multimodal optimization landscapes, the CEC2017 suite also encompasses separable and non-separable problem domains. Moreover, it incorporates shifted and rotated variations, thereby offering a comprehensive environment for the testing of optimization algorithms. This extensive array of test scenarios enables researchers to conduct in-depth evaluations, thereby discerning the merits and limitations of various optimization methods under different conditions.
The efficacy of nSCA is evaluated using the IEEE CEC2017 benchmark suites [47]. These test functions are categorized into four distinct groups: unimodal, multimodal, hybrid, and composition. Table 7 offers a comprehensive breakdown of the definitions associated with the CEC2017 benchmark challenges. To increase the level of complexity and rigorously assess the capabilities of the proposed method in handling complex optimization problems, all functions within the CEC2017 suite are configured as 30-dimensional problems.
Tables 8 and 9 provide an in-depth statistical comparison between nSCA and other swarm-based optimization algorithms such as SSA, MVO, MFO, WOA, GOA, and the original SCA. To ensure a rigorous and unbiased evaluation, each algorithm was executed 30 times on a variety of benchmark functions. Statistical metrics like mean values (avg) and standard deviations (std) were subsequently calculated from these multiple runs. For the purposes of this study, a cohort of 50 search agents was deployed, each limited to a maximum of 300 iterations. A careful analysis of the data presented in Tables 8 and 9 clearly shows that nSCA consistently outperforms its counterparts, specifically SSA, MVO, MFO, WOA, GOA, and the original SCA, in various benchmark categories including unimodal, multimodal, hybrid, and composition functions.
4 Engineering Optimization Challenges
The purpose of this section is to assess the performance of nSCA as evidenced through its deployment in five real-world technical optimization problems, each characterized by varying inequality constraints. The primary focus lies in evaluating the capability of the algorithm to manage these constraints effectively throughout the optimization process.
4.1 Cantilever Beam Design Challenge
The objective of this optimization task is to achieve minimization of the weight of a cantilever beam, which is constructed from hollow square blocks. The structure consists of five such blocks, with the first block being fixed in position and the fifth subjected to a vertical load. A visual representation of the five parameters that determine the cross-sectional geometry of the blocks is provided in Fig. 9. Detailed formulations for addressing this problem can be found in Appendix 1.
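As a sketch of the problem, the standard cantilever-beam formulation from the literature is shown below (Appendix 1 gives the exact equations used in this study; the feasibility rule assumed here is g(x) ≤ 0):

```python
def cantilever_weight(x):
    """Beam weight for the five block heights x1..x5 (standard
    formulation: 0.0624 * (x1 + x2 + x3 + x4 + x5))."""
    return 0.0624 * sum(x)

def cantilever_constraint(x):
    """Tip-deflection constraint under the vertical end load;
    the design is feasible when g(x) <= 0."""
    coeffs = [61, 37, 19, 7, 1]
    return sum(c / v ** 3 for c, v in zip(coeffs, x)) - 1
```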
The findings from an exhaustive analysis of this task are summarized in Table 10, which provides a comprehensive breakdown of key performance indicators. The data convincingly demonstrate that the nSCA algorithm consistently yields results that are either commensurate with or superior to those of leading optimization algorithms such as COA [52], RFO [51], GOA [6], MVO [1], ALO [50], CS [48] and SOS [49]. These findings strongly substantiate the algorithm’s capability to address and optimize complex, constraint-bound problems effectively. Additionally, the results underscore the algorithm's aptitude for real-world engineering applications, highlighting its proficiency in navigating intricate problem landscapes.
4.2 Pressure Vessel Design Challenge
The primary objective of this optimization task is the reduction of manufacturing costs associated with the fabrication of a pressure vessel. A representation of the vessel’s unique design, featuring one flat and one hemispherical end, is illustrated in Fig. 10. The variables subject to optimization encompass the inner radius (R), shell thickness (Ts), length of the cylindrical section exclusive of the head (L), and the head's thickness (Th). These variables are pivotal in establishing the optimal design of the vessel. Specific mathematical equations and constraints have been formulated to encapsulate the dual aim of cost minimization and design requirement adherence. Comprehensive formulations for this task can be found in Appendix 1.
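Appendix 1 holds the exact equations; the sketch below uses the cost function and constraints in their standard literature form for this benchmark, so the coefficients are the commonly used ones rather than a transcription of the appendix.

```python
import math

def pressure_vessel_cost(Ts, Th, R, L):
    """Manufacturing cost of the vessel in the standard formulation
    (material, forming and welding terms)."""
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def pressure_vessel_constraints(Ts, Th, R, L):
    """Inequality constraints g_i <= 0 of the standard formulation."""
    return [
        -Ts + 0.0193 * R,                                  # shell thickness
        -Th + 0.00954 * R,                                 # head thickness
        -math.pi * R**2 * L - (4 / 3) * math.pi * R**3 + 1296000,  # volume
        L - 240,                                           # length limit
    ]

# Near the best known design reported in the literature:
cost = pressure_vessel_cost(0.8125, 0.4375, 42.0984, 176.6366)
```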
The outcomes of a comprehensive evaluation of this problem are summarized in Table 11, which offers a detailed analysis of various performance metrics. The data presented in this table confirm the reliable effectiveness of the nSCA algorithm, often matching or even surpassing other well-established optimization methods such as SCSO [58], RFO [51], AOA [57], GSA [1], MVO [1], ACO [56], ES [55], DE [54], and PSO [53]. These results robustly endorse the capabilities of nSCA in proficiently navigating the search space, an ability further augmented by the integration of roulette wheel selection (RWS) and opposition-based learning (OBL). Additionally, the findings underscore the algorithm’s versatility, demonstrating its suitability for application in engineering contexts, particularly in instances where the attributes of the search domain are either ambiguous or poorly defined.
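The two operators credited here have compact standard definitions; how the paper wires them into the SCA position update is covered in its methodology section, so the snippet below shows only the operators themselves, in their textbook form.

```python
import random

def opposite(x, lb, ub):
    """Opposition-based learning (Tizhoosh, 2005): the opposite of
    candidate x within per-dimension bounds [lb, ub]."""
    return [l + u - xi for xi, l, u in zip(x, lb, ub)]

def roulette_wheel(weights, rng=random):
    """Fitness-proportional (roulette wheel) selection of an index."""
    r = rng.uniform(0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1  # guard against floating-point round-off
```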
4.3 Three-Bar Truss Design Challenge
The primary objective of this challenge is the weight reduction of the truss structure, to be achieved within the boundaries of various constraints. Successful truss design necessitates the consideration of essential limitations, including those related to stress, deflection, and buckling factors. The engineering characteristics pertinent to this issue are illustrated in Fig. 11. Although the objective function may appear straightforward, it is governed by multiple intricate constraints, rendering the achievement of an optimal solution notably challenging. Detailed formulations relevant to this problem are provided in Appendix 1.
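The formulation in Appendix 1 can be sketched using the standard literature form of this benchmark, with the constants l = 100, P = 2, and σ = 2 given there; the code is a sketch under that assumption, not a copy of the appendix.

```python
import math

L_BAR, P, SIGMA = 100.0, 2.0, 2.0  # constants given in Appendix 1

def truss_weight(x1, x2):
    """Structure weight (2*sqrt(2)*x1 + x2) * l in the standard form."""
    return (2 * math.sqrt(2) * x1 + x2) * L_BAR

def truss_constraints(x1, x2):
    """Stress constraints g_i <= 0 of the standard formulation."""
    s2 = math.sqrt(2)
    denom = s2 * x1**2 + 2 * x1 * x2
    return [
        (s2 * x1 + x2) / denom * P - SIGMA,
        x2 / denom * P - SIGMA,
        P / (x1 + s2 * x2) - SIGMA,
    ]

# Near the best known design (x1 ~ 0.7887, x2 ~ 0.4083):
w = truss_weight(0.78867, 0.40825)
```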
Table 12 provides an exhaustive comparison between the nSCA and various state-of-the-art optimization methods, including GOA [6], MVO [1], ALO [50], MBA [63], CS [48], PSO-DE [62], DEDS [61], as well as models put forth by Ray and Saini [61] and Tsai [62]. The data strongly suggest that nSCA consistently performs at a level comparable to the best algorithms in the field, thereby establishing itself as a formidable competitor in achieving optimal outcomes.
4.4 Gear Train Design Challenge
The objective of this task, illustrated in Fig. 12, is to minimize the gear ratio by optimizing four discrete variables: the tooth counts of gears nA, nB, nC, and nD. The gear ratio measures the relationship between the angular speeds of the output and input shafts. Each discrete variable changes in increments of one tooth, and the problem formulation places constraints on the permissible range of these variables. Detailed specifications for this challenge are given in Appendix 1.
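This objective has a compact standard form in the literature (the target ratio is 1/6.931, with integer tooth counts in [12, 60]); the sketch below assumes that form rather than transcribing Appendix 1.

```python
def gear_ratio_error(nA, nB, nC, nD):
    """Squared deviation of the gear ratio (nB*nD)/(nA*nC) from the
    target ratio 1/6.931 used in the standard formulation."""
    return (1 / 6.931 - (nB * nD) / (nA * nC)) ** 2

# Best known integer solution reported in the literature
# (tooth counts are integers in [12, 60]):
err = gear_ratio_error(49, 16, 43, 19)
```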
Table 13 presents an in-depth comparison between the nSCA and a range of well-known optimization techniques. The data in this table highlight a remarkable similarity in the performance of nSCA to that of leading optimization methods, including MVO [1], ISA [66], CS [48], MBA [63], ABC [63], as well as models developed by Deb and Goyal [65] and Kannan and Kramer [64]. These results strongly affirm the effectiveness of the proposed nSCA algorithm, demonstrating its capabilities even when faced with challenges involving discrete variables. The proficiency of nSCA in managing discrete variables expands its range of applicability and emphasizes its suitability for addressing a diverse array of optimization problems across various disciplines.
4.5 Welded Beam Design Challenge
The overarching aim of this engineering task is the minimization of manufacturing costs associated with a welded beam. An overview of the system and structural parameters relevant to this challenge is provided in Fig. 13, emphasizing four principal design variables: the length of the attached bar (l), weld thickness (h), the thickness of the bar (b), and the height of the bar (t). For the design to be considered feasible, the beam must satisfy seven specific constraints when subjected to a top-applied load. These constraints encompass various factors, such as side constraints, end deflection of the beam (δ), shear stress (τ), bending stress in the beam (θ), and the buckling load on the bar (Pc). Comprehensive formulations pertinent to this task are outlined in Appendix 1.
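Appendix 1 carries the full formulation; the cost function alone, in its standard literature form, is compact enough to sketch here. The coefficients are the commonly used ones, and the seven constraints are deliberately omitted.

```python
def welded_beam_cost(h, l, t, b):
    """Fabrication cost of the standard welded beam formulation:
    weld material plus bar material. The seven constraints of the
    problem are omitted here (see Appendix 1)."""
    return 1.10471 * h**2 * l + 0.04811 * t * b * (14.0 + l)

# Near the best known design reported in the literature:
cost = welded_beam_cost(0.2057, 3.4705, 9.0366, 0.2057)
```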
Table 14 presents a comprehensive comparison between the nSCA and various other cutting-edge optimization techniques. The findings presented in the table offer compelling evidence that the nSCA consistently achieves superior outcomes when juxtaposed with established algorithms, including SSA [68], RFO [51], MVO [1], GSA [1], CPSO [1], HS [53], and GA [67]. The outcomes elucidated in Table 14 distinctly illustrate that the nSCA proficiently identifies optimal solutions even within the confines of complex constrained challenges.
The strong performance of the nSCA in navigating intricate problem spaces underscores its potential for practical engineering applications marked by multifaceted constraints and confirms its role as a valuable instrument in engineering optimization. Its capabilities offer promising avenues for improving problem-solving strategies and supporting effective decision-making.
5 Conclusion
This study introduces an approach that merges the roulette wheel selection (RWS) mechanism with opposition-based learning (OBL) to enhance the efficacy of the sine cosine algorithm (SCA) in navigating intricate search spaces. This integration gives rise to a new variant of the SCA, referred to as nSCA. The performance of nSCA is assessed through comparative experiments against a range of state-of-the-art algorithms, including MVO, MFO, SSA, WOA, GOA, and the original SCA, on 23 classical benchmark functions and 29 CEC2017 test functions. Additionally, the practical effectiveness of nSCA is demonstrated by successfully addressing five distinct engineering optimization problems. The outcomes underscore the superiority of nSCA over alternative evolutionary computation approaches, highlighting its ability to generate highly competitive solutions across both benchmark functions and real-world engineering optimization challenges. These findings position nSCA as a valuable tool for engineering optimization, offering a robust approach well-equipped to address intricate optimization problems encountered in real-world scenarios.
Data Availability
The data, models, and code that support the findings of this study are available from the corresponding author upon reasonable request.
References
Mirjalili, S., Mirjalili, S.M., Hatamlou, A.: Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput. Appl. 27(2), 495–513 (2016)
Mirjalili, S.: SCA: a sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 96, 120–133 (2016)
Storn, R., Price, K.: Differential evolution-a simple and efficient heuristic for global optimization over continuous spaces. J. Global Optim. 11(4), 341 (1997)
Mirjalili, S.: Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 89, 228–249 (2015)
Mirjalili, S., Lewis, A.: The whale optimization algorithm. Adv. Eng. Softw. 95, 51–67 (2016)
Saremi, S., Mirjalili, S., Lewis, A.: Grasshopper optimisation algorithm: theory and application. Adv. Eng. Softw. 105, 30–47 (2017)
Mirjalili, S., et al.: Salp Swarm Algorithm: a bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 114, 163–191 (2017)
Črepinšek, M., Liu, S.-H., Mernik, M.: Exploration and exploitation in evolutionary algorithms: a survey. ACM Comput. Surv. (CSUR) 45(3), 1–33 (2013)
Lin, L., Gen, M.: Auto-tuning strategy for evolutionary algorithms: balancing between exploration and exploitation. Soft. Comput. 13(2), 157–168 (2009)
Wolpert, D.H., Macready, W.G.: No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1(1), 67–82 (1997)
Pham, V.H.S., Trang, N.T.N., Dat, C.Q.: Optimization of production schedules of multi-plants for dispatching ready-mix concrete trucks by integrating grey wolf optimizer and dragonfly algorithm. Eng. Construct. Architect. Manage. (2023). https://doi.org/10.1108/ECAM-12-2022-1176
Son, P.V.H., Nguyen Dang, N.T.: Optimizing time and cost simultaneously in projects with multi-verse optimizer. Asian J. Civ. Eng. (2023). https://doi.org/10.1007/s42107-023-00652-y
Qiao, W., et al.: A hybrid algorithm for carbon dioxide emissions forecasting based on improved lion swarm optimizer. J. Clean. Prod. 244, 118612 (2020)
Altay, O., Cetindemir, O., Aydogdu, I.: Size optimization of planar truss systems using the modified salp swarm algorithm. Eng. Optimiz. (2023). https://doi.org/10.1080/0305215X.2022.2160449
Pham, V.H.S., Soulisa, F.V.: A hybrid ant lion optimizer (ALO) algorithm for construction site layout optimization. J. Soft Comput. Civ. Eng. 7(4), 50–71 (2023)
Goksal, F.P., Karaoglan, I., Altiparmak, F.: A hybrid discrete particle swarm optimization for vehicle routing problem with simultaneous pickup and delivery. Comput. Ind. Eng. 65(1), 39–53 (2013)
Son, P.V.H., Duy, N.H.C., Dat, P.T.: Optimization of construction material cost through logistics planning model of dragonfly algorithm—particle swarm optimization. KSCE J. Civ. Eng. 25(7), 2350–2359 (2021)
Shang, C., Zhou, T.-T., Liu, S.: Optimization of complex engineering problems using modified sine cosine algorithm. Sci. Rep. 12(1), 20528 (2022)
Raut, U. and Mishra, S.: Power distribution network reconfiguration using an improved sine–cosine algorithm-based meta-heuristic search. In: Soft Computing for Problem Solving: SocProS 2017, Volume 1. 2019. Springer
Reddy, K.S., et al.: A new binary variant of sine–cosine algorithm: development and application to solve profit-based unit commitment problem. Arab. J. Sci. Eng. 43, 4041–4056 (2018)
Sahlol, A.T., et al.: Training feedforward neural networks using Sine-Cosine algorithm to improve the prediction of liver enzymes on fish farmed on nano-selenite. In: 2016 12th International Computer Engineering Conference (ICENCO). 2016. IEEE
Zhao, Y., Zou, F. and Chen, D.: A discrete sine cosine algorithm for community detection. In: Intelligent Computing Theories and Application: 15th International Conference, ICIC 2019, Nanchang, China, August 3–6, 2019, Proceedings, Part I 15. 2019. Springer
Aydin, O., et al.: Comparative parameter estimation of single diode PV-cell model by using sine-cosine algorithm and whale optimization algorithm. In: 2019 6th International Conference on Electrical and Electronics Engineering (ICEEE). 2019. IEEE
Cheng, J., Duan, Z.: Cloud model based sine cosine algorithm for solving optimization problems. Evol. Intel. 12, 503–514 (2019)
Bureerat, S. and Pholdee, N.: Adaptive sine cosine algorithm integrated with differential evolution for structural damage detection. In: International Conference on Computational Science and Its Applications. 2017. Springer
Turgut, O.E.: Thermal and economical optimization of a shell and tube evaporator using hybrid backtracking search—sine–cosine algorithm. Arab. J. Sci. Eng. 42(5), 2105–2123 (2017)
Bairathi, D. and Gopalani, D.: Opposition-based sine cosine algorithm (OSCA) for training feed-forward neural networks. In: 2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS). 2017. IEEE
Qu, C., et al.: A modified sine-cosine algorithm based on neighborhood search and greedy levy mutation. Comput. Intell. Neurosci. (2018). https://doi.org/10.1155/2018/4231647
Son, P.V.H., Nguyen Dang, N.T.: Solving large-scale discrete time–cost trade-off problem using hybrid multi-verse optimizer model. Sci. Rep. 13(1), 1987 (2023)
Pham, V.H.S., Nguyen, V.N.: Cement transport vehicle routing with a hybrid sine cosine optimization algorithm. Adv. Civ. Eng. 2023, 2728039 (2023)
Abualigah, L., Diabat, A.: Advances in sine cosine algorithm: a comprehensive survey. Artif. Intell. Rev. 54(4), 2567–2608 (2021)
Pandey, A.C., Kulhari, A., Shukla, D.S.: Enhancing sentiment analysis using roulette wheel selection based cuckoo search clustering method. J. Ambient. Intell. Humaniz. Comput. 13(1), 1–29 (2022)
Zhu, Y.-P., et al.: A ranking weight based roulette wheel selection method for comprehensive learning particle swarm optimization. In: 2022 IEEE International Conference on Systems, Man, and Cybernetics (SMC). 2022. IEEE
Yu, F., et al.: Improved roulette wheel selection-based genetic algorithm for TSP. In: 2016 International Conference on Network and Information Systems for Computers (ICNISC). 2016. IEEE
Ho-Huu, V., et al.: An improved differential evolution based on roulette wheel selection for shape and size optimization of truss structures with frequency constraints. Neural Comput. Appl. 29, 167–185 (2018)
Lloyd, H. and Amos, M.: Analysis of independent roulette selection in parallel ant colony optimization. In: Proceedings of the Genetic and Evolutionary Computation Conference. 2017
Tizhoosh, H.R.: Opposition-based learning: a new scheme for machine intelligence. In: International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC'06). 2005. IEEE
Verma, O.P., Aggarwal, D., Patodi, T.: Opposition and dimensional based modified firefly algorithm. Expert Syst. Appl. 44, 168–176 (2016)
Upadhyay, P., et al.: A novel design method for optimal IIR system identification using opposition based harmony search algorithm. J. Franklin Inst. 351(5), 2454–2488 (2014)
Luong, D.-L., Tran, D.-H., Nguyen, P.T.: Optimizing multi-mode time-cost-quality trade-off of construction project using opposition multiple objective difference evolution. Int. J. Constr. Manag. 21(3), 271–283 (2021)
Wang, H., et al.: Enhancing particle swarm optimization using generalized opposition-based learning. Inf. Sci. 181(20), 4699–4714 (2011)
Ewees, A.A., Abd-Elaziz, M., Houssein, E.H.: Improved grasshopper optimization algorithm using opposition-based learning. Expert Syst. Appl. 112, 156–172 (2018)
Yao, X., Liu, Y., Lin, G.: Evolutionary programming made faster. IEEE Trans. Evol. Comput. 3(2), 82–102 (1999)
Yang, X.-S.: Test problems in optimization. arXiv preprint arXiv:1008.0549 (2010)
Digalakis, J.G., Margaritis, K.G.: On benchmarking functions for genetic algorithms. Int. J. Comput. Math. 77(4), 481–506 (2001)
Van den Bergh, F., Engelbrecht, A.P.: A study of particle swarm optimization particle trajectories. Inf. Sci. 176(8), 937–971 (2006)
Wu, G., Mallipeddi, R. and Suganthan, P.N.: Problem definitions and evaluation criteria for the CEC 2017 competition on constrained real-parameter optimization. National University of Defense Technology, Changsha, Hunan, PR China and Kyungpook National University, Daegu, South Korea and Nanyang Technological University, Singapore, Technical Report, 2017.
Gandomi, A.H., Yang, X.-S., Alavi, A.H.: Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng. Comput. 29, 17–35 (2013)
Cheng, M.-Y., Prayogo, D.: Symbiotic organisms search: a new metaheuristic optimization algorithm. Comput. Struct. 139, 98–112 (2014)
Mirjalili, S.: The ant lion optimizer. Adv. Eng. Softw. 83, 80–98 (2015)
Połap, D., Woźniak, M.: Red fox optimization algorithm. Expert Syst. Appl. 166, 114107 (2021)
Jia, H., et al.: Crayfish optimization algorithm. Artif. Intell. Rev. (2023). https://doi.org/10.1007/s10462-023-10567-4
Lee, K.S., Geem, Z.W.: A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput. Methods Appl. Mech. Eng. 194(36–38), 3902–3933 (2005)
Li, L.-J., et al.: A heuristic particle swarm optimizer for optimization of pin connected structures. Comput. Struct. 85(7–8), 340–349 (2007)
Mezura-Montes, E., Coello, C.A.C.: An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int. J. Gen. Syst. 37(4), 443–473 (2008)
Kaveh, A., Talatahari, S.: An improved ant colony optimization for constrained engineering design problems. Eng. Comput. (2010). https://doi.org/10.1108/02644401011008577
Abualigah, L., et al.: The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 376, 113609 (2021)
Seyyedabbasi, A., Kiani, F.: Sand Cat swarm optimization: a nature-inspired algorithm to solve global optimization problems. Eng. Comput. 39(4), 2627–2651 (2023)
Ray, T., Saini, P.: Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng. Optim. 33(6), 735–748 (2001)
Tsai, J.-F.: Global optimization of nonlinear fractional programming problems in engineering design. Eng. Optim. 37(4), 399–409 (2005)
Zhang, M., Luo, W., Wang, X.: Differential evolution with dynamic stochastic selection for constrained optimization. Inf. Sci. 178(15), 3043–3074 (2008)
Liu, H., Cai, Z., Wang, Y.: Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl. Soft Comput. 10(2), 629–640 (2010)
Sadollah, A., et al.: Mine blast algorithm: a new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 13(5), 2592–2612 (2013)
Kannan, B., Kramer, S.N.: An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J. Mech. Des. 116(2), 405–411 (1994)
Deb, K., Goyal, M.: A combined genetic adaptive search (GeneAS) for engineering design. Comput. Sci. Inform. 26, 30–45 (1996)
Gandomi, A.H.: Interior search algorithm (ISA): a novel approach for global optimization. ISA Trans. 53(4), 1168–1183 (2014)
Coello Coello, C.A.: Constraint-handling using an evolutionary multiobjective optimization technique. Civ. Eng. Syst. 17(4), 319–346 (2000)
Hashim, F.A., Hussien, A.G.: Snake Optimizer: a novel meta-heuristic optimization algorithm. Knowl.-Based Syst. 242, 108320 (2022)
Acknowledgements
We acknowledge Ho Chi Minh City University of Technology (HCMUT), VNU-HCM for supporting this study.
Funding
This research did not receive dedicated funding from public, commercial, or non-profit grant agencies.
Contributions
All authors (VHSP, NTND, and VNN) jointly contributed to writing the main manuscript, prepared all figures and tables, and reviewed and approved the final version prior to submission.
Ethics declarations
Conflict of Interest
The authors declare that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices
Appendix 1
Cantilever Beam Design Problem
Consider:
Minimize:
Subject to:
Variable range:
Pressure Vessel Design Problem
Consider:
Minimize:
Subject to:
Variable range:
Three-Bar Truss Design Problem
Consider:
Minimize:
Subject to:
Variable range:
where l = 100 cm, P = 2 kN/cm², σ = 2 kN/cm².
Gear Train Design Problem
Consider:
Minimize:
Variable range:
Welded Beam Design Problem
Consider:
Minimize:
Subject to:
Variable range:
where
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Pham, V.H.S., Nguyen Dang, N.T. & Nguyen, V.N. Hybrid Sine Cosine Algorithm with Integrated Roulette Wheel Selection and Opposition-Based Learning for Engineering Optimization Problems. Int J Comput Intell Syst 16, 171 (2023). https://doi.org/10.1007/s44196-023-00350-2