Nature Inspired Computing: An Overview and Some Future Directions
This paper presents an overview of significant advances made in the emerging field of nature-inspired computing (NIC) with a focus on the physics- and biology-based approaches and algorithms. A parallel development in the past two decades has been the emergence of the field of computational intelligence (CI) consisting primarily of the three fields of neural networks, evolutionary computing and fuzzy logic. It is observed that NIC and CI intersect. The authors advocate and foresee more cross-fertilisation of the two emerging fields.
Keywords: Nature-inspired computing; Physics-based algorithms; Biology-based algorithms; Meta-heuristic algorithms; Search and optimisation
Inspiration from Nature
Nature does things in an amazing way. Behind the visible phenomena lie innumerable, often hidden, invisible causes. For centuries, philosophers and scientists have been observing these phenomena in nature and trying to understand, explain, adapt and replicate them in artificial systems. There are innumerable agents and forces within the living and non-living world, most of which are unknown, and their underlying complexity is beyond human comprehension as a whole. These agents act in parallel, and very often against each other, giving form and feature to nature and regulating the harmony, beauty and vigour of life. This is seen as the dialectics of nature, which lies in the concept of the evolution of the natural world. The evolution of complexity in nature follows a distinctive order. There is also information processing in nature, performed in a distributed, self-organised and optimal manner without any central control. This whole series of forms (mechanical, physical, chemical, biological and social) is distributed according to complexity from lower to higher, and the sequence expresses its mutual dependence and relationship in terms of structure and history, with activities changing under changed circumstances. All these phenomena, known or partially known so far, are emerging as new fields of science, technology and computing that study problem-solving techniques inspired by nature, and that attempt to understand the underlying principles and mechanisms by which natural, physical, chemical and biological systems perform complex tasks in a befitting manner with limited resources and capability.
Science is a dialogue between scientists and nature which has evolved over the centuries, enriched with new concepts, methods and tools, and developed into well-defined disciplines of scientific endeavour. Mankind has been trying to understand nature ever since by developing new tools and techniques. The field of nature-inspired computing (NIC) is interdisciplinary, combining computing science with knowledge from different branches of science (e.g. physics, chemistry, biology, mathematics and engineering) to allow the development of new computational tools such as algorithms, hardware or wetware for problem-solving, and the synthesis of patterns, behaviours and organisms [3, 4]. This keynote paper presents an overview of significant advances made in the emerging field of NIC with a focus on the physics- and biology-based approaches and algorithms.
Search and Optimisation
All the living and non-living world, including the planetary, galactic and stellar systems and other heavenly bodies in the universe, belongs to nature. One common aspect can be observed throughout nature, be it physical, chemical or biological: nature maintains its equilibrium by means known or unknown to us. A simplified explanation of this state of equilibrium is the idea of optimum seeking in nature. There is optimum seeking in all spheres of life and nature [5, 6, 7]. In all optimum seeking, there are goals or objectives to be achieved and constraints to be satisfied within which the optimum has to be found [8, 9, 10, 11]. This optimum seeking can be formulated as an optimisation problem [12, 13, 14, 15], that is, it is reduced to finding the best solution as measured by a performance index, often known as the objective function in many areas of computing and engineering, which varies from problem to problem [16, 17, 18, 19].
On the other hand, nondeterministic or stochastic methods exhibit some randomness and produce different solutions in different runs. The advantage is that these methods explore several regions of the search space at the same time and have the ability to escape from local optima, giving a better chance of reaching the global optimum. They are therefore better suited to NP-hard problems (i.e. problems for which no polynomial-time algorithm is known). There is a variety of derivative-free stochastic optimisation algorithms, which are of two types: heuristic algorithms (HA) and meta-heuristic algorithms (MHA) (Fig. 1).
Heuristic means to find or discover by means of trial and error. Alan Turing was one of the first to use heuristic algorithms, during the Second World War, and called his search methods heuristic search. Glover possibly revived the use of heuristic algorithms in the 1970s. The general problem with heuristic algorithms (e.g. scatter search) is that there is no guarantee that optimal solutions are reached, though quality solutions are found in a reasonable amount of time. The second generation of optimisation methods, meta-heuristics, was proposed to solve more complex problems and very often provides better solutions than heuristic algorithms. The 1980s and 1990s saw a proliferation of meta-heuristic algorithms. Recent meta-heuristic algorithms are stochastic algorithms with a certain trade-off between randomised and local search. Every meta-heuristic method consists of a group of search agents that explore the feasible region based on both randomisation and specified rules. These methods rely extensively on repeated evaluations of the objective function and use heuristic guidelines for estimating the next search direction. The guidelines used are often simple, and the rules are usually inspired by natural phenomena or laws. Glover and Kochenberger present a review of the field of meta-heuristics up to 2003.
There are different classifications of meta-heuristic algorithms reported in the literature [24, 25]. They can be classified as population based (PB) and neighbourhood or trajectory based (TB) (Fig. 1). Neighbourhood-based meta-heuristics such as simulated annealing and tabu search evaluate only one potential solution at a time; the successive steps or moves trace a trajectory through the search space, with a nonzero probability that this trajectory reaches the global optimum. In population-based meta-heuristics, a set of potential solutions moves towards goals simultaneously. For example, genetic algorithm (GA) [28, 29] and particle swarm optimisation (PSO) [30, 31] are population-based algorithms that use a population of solutions.
Nature-Inspired Computing Paradigm
NIC approaches are typically appropriate when one or more of the following hold:
- The problem is complex and nonlinear, involves a large number of variables or potential solutions, or has multiple objectives.
- The problem cannot be suitably modelled using conventional approaches, e.g. complex pattern recognition and classification tasks.
- Finding an optimal solution using traditional approaches is impossible, difficult or cannot be guaranteed, but a quality measure exists that allows comparison of candidate solutions.
- The problem lends itself to a diversity of solutions, or a diversity of solutions is desirable.
The physics-based algorithms can be grouped by their source of inspiration:
- Inspired by Newton’s laws of motion, e.g. Colliding Bodies Optimisation (CBO);
- Inspired by Newton’s gravitational force, e.g. Gravitational Search Algorithm (GSA), Central Force Optimisation (CFO), Space Gravitational Optimisation (SGO) and Gravitational Interaction Optimisation (GIO);
- Inspired by celestial mechanics and astronomy, e.g. Big Bang–Big Crunch search (BB–BC), Black Hole Search (BHS), Galaxy-based Search Algorithm (GbSA), Artificial Physics-based Optimisation (APO) and Integrated Radiation Search (IRS);
- Inspired by electromagnetism, e.g. Electromagnetism-like Optimisation (EMO), Charged System Search (CSS) and Hysteretic Optimisation (HO);
- Inspired by optics, e.g. Ray Optimisation (RO);
- Inspired by acoustics, e.g. Harmony Search Algorithm (HSA);
- Inspired by thermodynamics, e.g. Simulated Annealing (SA);
- Inspired by hydrology and hydrodynamics, e.g. Water Drop Algorithm (WDA), River Formation Dynamics Algorithm (RFDA) and Water Cycle Algorithm (WCA).
The earliest of all these algorithms was the Simulated Annealing (SA) algorithm, based on the principles of thermodynamics. The algorithm simulates the cooling process of a material by gradually lowering the temperature of the system until it converges to a steady state. The idea of using simulated annealing to search for feasible solutions and converge to an optimal solution was very stimulating and led researchers to explore other areas of physics.
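The acceptance rule at the heart of SA can be sketched in a few lines. The following is a minimal, hypothetical one-dimensional implementation; the geometric cooling schedule and step size are illustrative choices, not prescribed by the original algorithm:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.95, steps=2000, step_size=0.5):
    """Minimise f by accepting worse moves with probability exp(-delta/T)."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        # Propose a random neighbour of the current solution.
        cand = x + random.uniform(-step_size, step_size)
        fc = f(cand)
        delta = fc - fx
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # gradually lower the temperature
    return best, fbest

random.seed(3)
best, fbest = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=0.0)
```

At high temperature the Boltzmann rule accepts uphill moves freely, which lets the search escape local optima; as the temperature falls the rule becomes effectively greedy.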
An idea from the field of sound and acoustics led to the development of HSA, inspired by a phenomenon commonly observed in music. The concept behind the HSA is to find a perfect state of harmony determined by aesthetic estimation. Siddique and Adeli review harmony search algorithms and their variants, present hybrid harmony search algorithms, and review applications of HSA in engineering.
Zaránd et al. proposed a method of optimisation inspired by demagnetisation, called hysteretic optimisation (HO). This is a process similar to simulated annealing, in which the material achieves a stable state as the temperature is slowly decreased; finding the ground states of magnetic samples is analogous to finding the optimal point in the search process. Based on the principles of electromagnetism, Birbil and Fang introduced the electromagnetism-like optimisation (EMO) algorithm, which imitates the attraction–repulsion mechanism of electromagnetism theory in order to solve unconstrained or bound-constrained global optimisation problems. A solution in the EMO algorithm is seen as a charged particle in the search space, and its charge relates to the objective function value.
Motivated by natural physical forces, Spears et al. introduced Artificial Physics Optimisation (APO), where particles are seen as solutions sampled from the feasible region of the problem space. Particles move towards higher-fitness regions and cluster around the optimal region over time. A heavier mass represents a higher fitness value and attracts masses of lower fitness; the individual with the best fitness attracts all others, while individuals with lower fitness values repel each other. This means that the individual with the best fitness has the biggest mass and moves with a lower velocity than the others. The attractive–repulsive rule can thus be treated as the search strategy of the optimisation algorithm, ultimately leading the population to search the better-fitness regions of the problem. In the initial state, individuals are randomly generated within the feasible region. In APO, mass is defined in terms of the fitness function of the optimisation problem in question, so a suitable definition of the mass of the individuals is necessary.
Central Force Optimisation (CFO) uses a population of probes that are distributed across a search space. The basic concept of CFO is the search for the biggest mass, which has the strongest force to attract all other masses distributed within the decision space towards it, and which is considered the global optimum of the problem at hand. A review of articles on CFO and its applications to various problems is presented in a recent article by Siddique and Adeli.
Gravitational Search Algorithm (GSA) is a population-based search algorithm inspired by the law of gravity and mass interaction. The algorithm considers agents as objects with different masses. All agents move under the gravitational attraction forces acting between them, and the progress of the algorithm directs the movements of all agents globally towards the agents with heavier masses. Gravitational Interactions Optimisation (GIO) is also inspired by Newton’s law. It has some similarities with GSA and was introduced around the same time, independently of GSA. The gravitational constant G in GSA decreases linearly with time, whereas GIO uses a hypothetical gravitational constant G that remains constant. GSA uses a set of best individuals to reduce computation time, while GIO allows all masses to interact with each other.
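The mass–attraction mechanics of GSA can be sketched as follows. This is a simplified one-dimensional illustration, not the full published algorithm: the division of force by the agent's own inertial mass is omitted, and the linearly decaying G and parameter values are illustrative assumptions:

```python
import random

def gsa(f, bounds, n=20, iters=200, g0=10.0):
    """One-dimensional Gravitational Search sketch (returns best-ever solution)."""
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(n)]
    v = [0.0] * n
    best = min(x, key=f)
    for t in range(iters):
        fit = [f(xi) for xi in x]
        fb, fw = min(fit), max(fit)
        # Better (lower) fitness maps to a heavier normalised mass.
        raw = [(fw - fi) / (fw - fb + 1e-12) for fi in fit]
        total = sum(raw) + 1e-12
        m = [r / total for r in raw]
        g = g0 * (1.0 - t / iters)  # gravitational "constant" decays over time
        for i in range(n):
            # Net acceleration towards the other agents, weighted by their masses.
            acc = sum(random.random() * g * m[j] * (x[j] - x[i])
                      / (abs(x[j] - x[i]) + 1e-12)
                      for j in range(n) if j != i)
            v[i] = random.random() * v[i] + acc
            x[i] = min(hi, max(lo, x[i] + v[i]))
            if f(x[i]) < f(best):
                best = x[i]
    return best

random.seed(0)
best_gsa = gsa(lambda x: (x - 2.0) ** 2, (-5.0, 5.0))
```

Heavy (fit) agents pull the swarm towards themselves, while the decaying G shifts the balance from exploration to exploitation, mirroring the description above.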
Based on the simple principle of continuous collision between bodies, Kaveh and Mahdavi proposed the Colliding Bodies Optimisation (CBO). Hsiao et al. proposed an optimal searching approach, called Space Gravitational Optimisation (SGO), using the notion of space gravitational curvature inspired by Einstein’s equivalence principle; SGO can be seen as an embryonic form of CFO. Based on the notion of the Big Bang and the shrinking phenomenon of the Big Crunch, Erol and Eksin proposed the Big Bang–Big Crunch (BB–BC) algorithm. In the Big Bang phase, a population of masses is generated with respect to the centre of mass; in the Big Crunch phase, all masses collapse into one centre of mass. Thus, the Big Bang phase explores the solution space, while the Big Crunch phase performs the necessary exploitation and convergence. Chuang and Jiang proposed Integrated Radiation Optimisation (IRO), inspired by gravitational radiation in the curvature of space–time. Hosseini proposed the Galaxy-based Search Algorithm (GbSA), inspired by the way the spiral arm of a spiral galaxy searches its surroundings; GbSA uses a spiral-like movement in each dimension of the search space with the help of chaotic steps and constant rotation around the initial solution. The spiral optimisation (SpO) is a multipoint search for continuous optimisation problems whose model is composed of plural logarithmic spirals and their common centre.
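The alternating Big Bang (scatter) and Big Crunch (collapse) phases are easy to sketch. The following hypothetical one-dimensional version uses a 1/k radius shrink and inverse-cost weights as illustrative choices; the published BB–BC algorithm parameterises these differently:

```python
import random

def bb_bc(f, centre=0.0, radius=5.0, n=30, iters=50):
    """Big Bang-Big Crunch sketch minimising f in one dimension."""
    for k in range(1, iters + 1):
        # Big Bang: scatter candidates around the current centre, with a
        # search radius that shrinks as the iteration count k grows.
        pop = [centre + random.uniform(-radius, radius) / k for _ in range(n)]
        # Big Crunch: collapse the population to its fitness-weighted
        # centre of mass (lower cost -> larger weight).
        w = [1.0 / (f(p) + 1e-12) for p in pop]
        centre = sum(wi * p for wi, p in zip(w, pop)) / sum(w)
    return centre

random.seed(1)
c_bbbc = bb_bc(lambda x: (x - 1.0) ** 2)
```

The scatter phase supplies exploration while the weighted collapse supplies exploitation, matching the division of labour described above.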
Inspired by the phenomenon of the black hole, Hatamlou proposed the Black Hole (BH) algorithm, where candidate solutions are considered stars and the best solution is selected as the black hole. At each iteration, the black hole attracts the stars around it. If a star gets too close to the black hole, it is swallowed, and a new star (candidate solution) is randomly generated and placed in the search space to start a new search.
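The attract-and-swallow cycle can be sketched as below. This is a hedged one-dimensional illustration; the event-horizon formula follows the common fitness-ratio definition, but the parameter values are illustrative:

```python
import random

def black_hole(f, bounds, n=20, iters=100):
    """Black Hole sketch: stars drift towards the best solution; stars that
    cross the event horizon are replaced by random newcomers."""
    lo, hi = bounds
    stars = [random.uniform(lo, hi) for _ in range(n)]
    bh = min(stars, key=f)
    for _ in range(iters):
        # Event-horizon radius: black-hole fitness relative to the whole swarm.
        radius = f(bh) / (sum(f(s) for s in stars) + 1e-12)
        for i in range(n):
            # Move each star a random fraction of the way towards the black hole.
            stars[i] += random.random() * (bh - stars[i])
            if f(stars[i]) < f(bh):
                bh = stars[i]
            elif abs(stars[i] - bh) < radius:
                stars[i] = random.uniform(lo, hi)  # swallowed: respawn a new star
    return bh

random.seed(2)
bh_best = black_hole(lambda x: (x - 1.0) ** 2, (-5.0, 5.0))
```

The respawn step is what preserves exploration: without it, all stars would simply collapse onto the current black hole.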
The basic idea of Snell’s law is utilised in Ray Optimisation (RO), proposed by Kaveh and Khayatazad, where a solution consisting of a vector of variables is simulated by a ray of light passing through space treated as media with different refractive indices. Based on the principles of hydrodynamics and water cycles, the Intelligent Water Drop (IWD) algorithm was proposed by Shah-Hosseini. Considering the natural phenomenon of river formation through land erosion and sediment deposition, Rabanal et al. proposed River Formation Dynamics (RFD). Eskandar et al. proposed the Water Cycle Algorithm (WCA), based on the water cycle that forms streams and rivers, where all rivers flow to the sea, the ultimate destination, corresponding to the optimal solution in optimisation terms.
The fundamental idea of evolutionary algorithms is based on Darwin’s theory of evolution, which gained momentum in the late 1950s, nearly a century after the publication of ‘On the Origin of Species’. Fraser first conducted a simulation of genetic systems, representing organisms by binary strings. Box proposed an evolutionary operation for optimising industrial production. Friedberg proposed an approach to evolve computer programs. The fundamental works of Lawrence Fogel in evolutionary programming, John Holland in genetic algorithms, and Ingo Rechenberg and Hans-Paul Schwefel in evolution strategies had great influence on the development of evolutionary algorithms and computation as a general concept for problem-solving and as a powerful tool for optimisation. Since its formative years in the 1960s, the field has evolved into three main branches: evolution strategies, evolutionary programming and genetic algorithms. The 1990s brought a further wave of developments: Koza developed genetic programming, Reynolds developed cultural algorithms, and Storn and Price developed differential evolution. Evolutionary algorithms have now found widespread applications in almost all branches of science and engineering [68, 69, 70]. Different variants of EAs, such as Evolutionary Programming (EP), Evolution Strategies (ES) [72, 73], Genetic Algorithm (GA) [74, 75, 76], Genetic Programming (GP), Differential Evolution (DE) and Cultural Algorithm (CA), are discussed in the book by Siddique and Adeli.
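Of these variants, differential evolution has a particularly compact core. The following is a minimal DE/rand/1 sketch in one dimension (in higher dimensions the crossover step recombines vectors component-wise); the parameter values are conventional defaults, not requirements:

```python
import random

def differential_evolution(f, bounds, n=20, f_w=0.8, cr=0.9, iters=100):
    """DE/rand/1 sketch minimising f over [lo, hi] in one dimension."""
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(n)]
    for _ in range(iters):
        for i in range(n):
            # Pick three distinct population members other than i.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Mutation: perturb a base vector by a scaled difference vector.
            mutant = a + f_w * (b - c)
            # Crossover (trivial in one dimension): keep the mutant with
            # probability cr, otherwise keep the current individual.
            trial = mutant if random.random() < cr else pop[i]
            trial = min(hi, max(lo, trial))
            # Greedy selection: the better of trial and current survives.
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

random.seed(0)
best_de = differential_evolution(lambda x: (x + 1.0) ** 2, (-5.0, 5.0))
```

The difference vector b − c scales mutation to the population's own spread, so step sizes shrink automatically as the population converges.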
The biology-inspired algorithms (BIA) are based on commonly observed phenomena in animal species and the movement of organisms. Flocks of birds, herds of quadrupeds and schools of fish are often cited as fascinating examples of self-organised coordination [1, 78]. Particle Swarm Optimisation (PSO) simulates the social behaviour of swarms such as bird flocking and fish schooling in nature [79, 80, 81]. Particles make use of the best positions they have encountered and the best position of their neighbours to move towards an optimum solution. There are now some 20 different variants of PSO [83, 84, 85, 86].
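The personal-best/global-best mechanism of PSO can be sketched as follows. This is a textbook global-best variant in one dimension; the inertia weight and acceleration coefficients are conventional illustrative values:

```python
import random

def pso(f, bounds, n=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimise f with a global-best PSO sketch in one dimension."""
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(n)]
    v = [0.0] * n
    pbest = x[:]           # each particle's best position so far
    gbest = min(x, key=f)  # best position seen by the whole swarm
    for _ in range(iters):
        for i in range(n):
            # Velocity blends inertia, pull towards the personal best,
            # and pull towards the global best.
            v[i] = (w * v[i]
                    + c1 * random.random() * (pbest[i] - x[i])
                    + c2 * random.random() * (gbest - x[i]))
            x[i] = min(hi, max(lo, x[i] + v[i]))
            if f(x[i]) < f(pbest[i]):
                pbest[i] = x[i]
            if f(x[i]) < f(gbest):
                gbest = x[i]
    return gbest

random.seed(0)
best_pso = pso(lambda x: (x - 4.0) ** 2, (0.0, 10.0))
```

The two stochastic pull terms are exactly the "best position encountered" and "best position of the neighbours" described above.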
Bird Flocking (BF) is the coherent manoeuvring of a group of individuals, which offers advantages in protection and defence from predators, in searching for food, and in social and mating activities. Natural flocks maintain two balanced behaviours: a desire to stay close to the flock and a desire to avoid collisions within it. Reynolds developed a model that mimics the flocking behaviour of birds using three simple rules: collision avoidance with flockmates, velocity matching with nearby flockmates and flock centring to stay close to the flock [90, 91]. Fish Schools (FS) show very interesting behavioural features; about half of all fish species are known to form schools at some stage in their lives. Fish schools are observed to be self-organised systems consisting of individual autonomous agents [92, 93] and come in many different shapes and sizes [87, 94, 95].
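Reynolds' three rules can be sketched as a single synchronous update step. This hypothetical one-dimensional version uses global neighbourhoods and illustrative rule weights; real boids act on local neighbourhoods in two or three dimensions:

```python
def boid_step(positions, velocities, dt=1.0, w_sep=0.05, w_align=0.05, w_coh=0.01):
    """One synchronous update of Reynolds' three flocking rules in one dimension."""
    n = len(positions)
    new_v = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        if not others:
            new_v.append(velocities[i])
            continue
        centre = sum(positions[j] for j in others) / len(others)
        mean_v = sum(velocities[j] for j in others) / len(others)
        # Rule 1: collision avoidance - steer away from very close flockmates.
        sep = sum(positions[i] - positions[j]
                  for j in others if abs(positions[i] - positions[j]) < 1.0)
        # Rule 2: velocity matching - steer towards the neighbours' mean velocity.
        align = mean_v - velocities[i]
        # Rule 3: flock centring - steer towards the neighbours' centre.
        coh = centre - positions[i]
        new_v.append(velocities[i] + w_sep * sep + w_align * align + w_coh * coh)
    return [p + v * dt for p, v in zip(positions, new_v)], new_v
```

Two distant boids drift together under flock centring, while two boids closer than the avoidance threshold push apart, reproducing the balance of behaviours described above.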
MacArthur and Wilson developed mathematical models of biogeography that describe how species migrate from one island to another, how new species arise and how species become extinct. Since the 1960s, biogeography has become a major area of research that studies the geographical distribution of biological species. Based on the concept of biogeography, Simon proposed Biogeography-Based Optimisation (BBO). Based on the principles of biological immune systems, models of Artificial Immune Systems (AIS) were proposed by Farmer et al. in the 1980s, which modelled the interaction between antibodies mathematically. In 1968, Lindenmayer introduced a formalism for simulating the development of multi-cellular organisms, initially known as Lindenmayer systems and subsequently named L-systems, which attracted the interest of theoretical computer scientists. Aono and Kunii and Smith used L-systems to create realistic-looking images of trees and plants. There are other bio-inspired search and optimisation algorithms reported in the literature which have not attracted much attention in the research community, such as the atmosphere clouds model, dolphin echolocation, Japanese tree frogs calling, Egyptian vulture, flower pollination algorithm, great salmon run, invasive weed optimisation, paddy field algorithm, roach infestation algorithm and shuffled frog leaping algorithm.
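An L-system rewrites every symbol of a string in parallel at each step. Lindenmayer's original algae model (A → AB, B → A) needs only a few lines:

```python
def lsystem(axiom, rules, n):
    """Apply L-system production rules to every symbol in parallel, n times."""
    s = axiom
    for _ in range(n):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Lindenmayer's original model of algae growth: A -> AB, B -> A.
algae = lsystem("A", {"A": "AB", "B": "A"}, 4)  # "ABAABABA"
```

The string lengths follow the Fibonacci sequence, and graphical interpretations of such strings (turtle graphics over symbols like F, +, −) are what produce the realistic plant images mentioned above.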
The swarm intelligence algorithms (SIA) are based on the collective behaviour of insects living in colonies, such as ants, bees, wasps and termites. Researchers are interested in this way of achieving a form of collective intelligence, called swarm intelligence. SIA are also advanced as a computational intelligence technique built around the study of collective behaviour in decentralised, self-organised systems. The inspiring source of Ant Colony Optimisation (ACO) is the foraging behaviour of real ant colonies [103, 104]. While moving, ants deposit a chemical pheromone trail on the ground. When choosing their way, they tend to choose paths marked by strong pheromone concentrations, so the pheromone trails guide other ants to the food source. It has been shown that this indirect communication between ants via pheromone trails enables them to find the shortest paths between their nest and food sources.
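The pheromone feedback loop can be illustrated on the classic two-path (double bridge) setting. This is a deliberately tiny sketch, not a full ACO implementation; the evaporation rate and deposit constant are illustrative:

```python
import random

def aco_two_paths(lengths=(1.0, 2.0), ants=10, iters=50, rho=0.1, q=1.0):
    """Ants repeatedly choose between two paths; pheromone reinforcement
    concentrates the colony on the shorter one."""
    tau = [1.0, 1.0]  # initial pheromone on each path
    for _ in range(iters):
        counts = [0, 0]
        for _ in range(ants):
            # Choose a path with probability proportional to its pheromone.
            p0 = tau[0] / (tau[0] + tau[1])
            counts[0 if random.random() < p0 else 1] += 1
        for k in (0, 1):
            # Evaporate, then deposit pheromone inversely proportional to length.
            tau[k] = (1.0 - rho) * tau[k] + counts[k] * q / lengths[k]
    return tau

random.seed(0)
tau = aco_two_paths()
```

Because the shorter path earns more deposit per ant, its pheromone grows faster, attracting more ants in turn; this positive feedback is the shortest-path mechanism described above.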
Honey bees search for food sources and collect food by foraging in promising flower patches. This simple mechanism of the honey bees inspired researchers to develop a new search algorithm, called the Bees Algorithm [105, 106]. Similarly, the Artificial Bee Colony (ABC) algorithm was proposed by Karaboga, and the virtual bee algorithm was proposed by Yang. The Bat Algorithm (BatA) is based on the echolocation behaviour of bats. The capability of micro-bats is fascinating: they use a type of sonar, called echolocation, to detect prey, avoid obstacles and locate their roosting crevices in the dark. Yang simulated this echolocation behaviour of bats. Quite a number of cuckoo species engage in obligate brood parasitism, laying their eggs in the nests of host birds of other species. Yang and Deb describe the Cuckoo Search (CS) algorithm, based on the breeding behaviour of such cuckoo species. The flashing of fireflies in the summer sky in tropical regions has attracted naturalists and researchers for many years. The rhythm, rate and duration of flashing form part of the signalling system that brings two fireflies together. Based on some idealised rules of this flashing behaviour, Yang proposed the Firefly Algorithm (FA).
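The attractiveness mechanism of the FA can be sketched as below. This is a hedged one-dimensional illustration of Yang's idealised rules (dimmer flies move towards brighter ones, with attractiveness decaying over distance); the decaying random-walk factor is an illustrative convergence aid:

```python
import math
import random

def firefly(f, bounds, n=15, iters=100, beta0=1.0, gamma=1.0, alpha=0.2):
    """Firefly Algorithm sketch: dimmer flies move towards brighter ones."""
    lo, hi = bounds
    x = [random.uniform(lo, hi) for _ in range(n)]
    for _ in range(iters):
        bright = [-f(xi) for xi in x]  # brightness = negated cost (minimisation)
        for i in range(n):
            for j in range(n):
                if bright[j] > bright[i]:
                    # Attractiveness decays with the squared distance.
                    beta = beta0 * math.exp(-gamma * (x[i] - x[j]) ** 2)
                    x[i] += beta * (x[j] - x[i]) + alpha * random.uniform(-0.5, 0.5)
                    x[i] = min(hi, max(lo, x[i]))
                    bright[i] = -f(x[i])
        alpha *= 0.97  # shrink the random walk so the swarm settles
    return min(x, key=f)

random.seed(0)
best_fa = firefly(lambda x: x * x, (-3.0, 3.0))
```

The distance-dependent attractiveness lets the swarm split into subgroups around several bright regions, which is why the FA is often advocated for multimodal problems.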
Individuals and groups of bacteria forage for nutrients; a classic example is the chemotactic (foraging) behaviour of E. coli bacteria. Based on this concept, Passino proposed the Bacterial Foraging Optimisation Algorithm (BFOA). There are many other swarm intelligence-based search and optimisation algorithms reported in the literature which have not attracted much attention in the research community, such as wolf search, cat swarm optimisation, fish swarm optimisation, eagle strategy, krill herd, monkey search and weightless swarm algorithms.
It is obvious from this review that the field of nature-inspired computing is large and expanding. This invited paper provided a brief summary of significant advances made in this exciting area of research with a focus on the physics- and biology-based approaches and algorithms.
A parallel development in the past two decades has been the emergence of the field of computational intelligence (CI), consisting mainly of neural networks [113, 114, 115, 116, 117, 118, 119, 120], evolutionary computing and fuzzy logic [122, 123, 124, 125], starting with the seminal book of Adeli and Hung, which demonstrated how a multi-paradigm approach integrating the three CI computing paradigms can lead to more effective solutions of complicated and intractable pattern recognition and learning problems. It is observed that NIC and CI intersect. Some researchers have argued that swarm intelligence provides a form of computational intelligence. The authors advocate and foresee more cross-fertilisation of the two emerging fields; evolving neural networks is an example of such cross-fertilisation of two domains [127, 128].
- 2.Prigogine I. The end of certainty. New York: The Free Press; 1996.
- 5.Arango C, Cortés P, Onieva L, Escudero A. Simulation–optimisation models for the dynamic berth allocation problem. Comput Aided Civil Infrastruct Eng. 2013;28(10):769–79.
- 7.Adeli H, Park HS. Neurocomputing for design automation. Boca Raton: CRC Press; 1998.
- 9.Jia L, Wang Y, Fan L. Multiobjective bilevel optimization for production-distribution planning problems using hybrid genetic algorithm. Integr Comput Aided Eng. 2014;21(1):77–90.
- 10.Faturechi R, Miller-Hooks E. A mathematical framework for quantifying and optimizing protective actions for civil infrastructure systems. Comput Aided Civil Infrastruct Eng. 2014;29(8):572–89.
- 16.Adeli H. Advances in design optimization. London: Chapman and Hall; 1994.
- 21.Lin M-H, Tsai J-F, Yu C-S. A review of deterministic optimization methods in engineering and management. Math Probl Eng. 2012;2012:article ID 756023.
- 24.Fister I Jr, Yang X-S, Fister I, Brest J, Fister D. A brief review of nature-inspired algorithms for optimisation. Elektroteh Vestn. 2013;80(3):1–7.
- 28.Hejazi F, Toloue I, Noorzaei J, Jaafar MS. Optimization of earthquake energy dissipation system by genetic algorithm. Comput Aided Civil Infrastruct Eng. 2013;28(10):796–810.
- 35.Siddique N, Adeli H. Hybrid harmony search algorithms. Int J Artif Intell Tools. 2015;24(6):1–16.
- 36.Siddique N, Adeli H. Applications of harmony search algorithms in engineering. Int J Artif Intell Tools. 2015;24(6):1–15.
- 41.Siddique N, Adeli H. Central force metaheuristic optimization. Sci Iran Trans A Civil Eng. 2015;22(6):2015.
- 45.Hsiao YT, Chuang CL, Jiang JA, Chien CC. A novel optimization algorithm: space gravitational optimization. In: Proceedings of 2005 IEEE international conference on systems, man and cybernetics, Oct 2005, vol. 3, p. 2323–8.
- 46.Kenyon IR. General relativity. Oxford: Oxford University Press; 1990.
- 48.Chuang C, Jiang J. Integrated radiation optimization: inspired by the gravitational radiation in the curvature of space–time. IEEE Congr Evolut Comput (CEC). 2007;25–28:3157–64.
- 50.Tamura K, Yasuda K. Spiral dynamics inspired optimisation. J Adv Comput Intell Intell Inform. 2011;15(8):1116–22.
- 54.Rabanal P, Rodríguez I, Rubio F. Using river formation dynamics to design heuristic algorithms. In: Unconventional computation, UC’07, LNCS 4618, Springer, 2007, p. 163–77.
- 56.Fraser AS. Simulation of genetic systems by automatic digital computers, I. Introduction. Aust J Biol Sci. 1957;10:484–91.
- 59.Fogel LJ. Autonomous automata. Ind Res. 1962;4:14–9.
- 61.Rechenberg I. Cybernetic solution path of an experimental problem. Royal Aircraft Establishment, library translation no. 1122, Farnborough, Hants, UK; 1965.
- 62.Schwefel H-P. Projekt MHD-Strausstrhlrohr: Experimentelle Optimierung einer Zweiphasenduese, Teil I. Technischer Bericht 11.034/68, 35, AEG Forschungsinstitut, Berlin, Germany; 1968.
- 63.De Jong KA. Evolutionary computation: a unified approach. Cambridge: The MIT Press; 2006.
- 64.Reyes O, Morell C, Ventura S. Evolutionary feature weighting to improve the performance of multi-label lazy algorithms. Integr Comput Aided Eng. 2014;21(4):339–54.
- 65.Koza JR. Genetic programming: on the programming of computers by means of natural selection. Cambridge: The MIT Press; 1992.
- 66.Reynolds RG. An overview of cultural algorithms: advances in evolutionary computation. New York: McGraw Hill Press; 1999.
- 68.Molina-García M, Calle-Sánchez J, González-Merino C, Fernández-Durán A, Alonso JI. Design of in-building wireless networks deployments using evolutionary algorithms. Integr Comput Aided Eng. 2014;21(4):367–85.
- 70.Adeli H, Kumar S. Distributed computer-aided engineering for analysis, design, and visualization. Boca Raton: CRC Press; 1999.
- 71.Badawy R, Yassine A, Heßler A, Hirsch B, Albayrak S. A novel multi-agent system utilizing quantum-inspired evolution for demand side management in the future smart grid. Integr Comput Aided Eng. 2013;20(2):127–41.
- 72.Campomanes-Álvareza BR, Cordón O, Damasa S. Evolutionary multi-objective optimization for mesh simplification of 3D open models. Integr Comput Aided Eng. 2013;20(4):375–90.
- 73.Joly MM, Verstraete T, Paniagua G. Integrated multifidelity, multidisciplinary evolutionary design optimization of counterrotating compressors. Integr Comput Aided Eng. 2014;21(3):249–61.
- 78.Camazine S, Deneubourg J-L, Franks NR, Sneyd J, Theraulaz G, Bonabeau E. Self-organization in biological systems. New Jersey: Princeton University Press; 2001.
- 80.Kennedy J, Eberhart R. Swarm intelligence. San Francisco: Morgan Kaufmann Publishers Inc; 2001.
- 81.Wu JW, Tseng JCR, Tsai WN. A hybrid linear text segmentation algorithm using hierarchical agglomerative clustering and discrete particle swarm optimization. Integr Comput Aided Eng. 2014;21(1):35–46.
- 88.Shaw E. Fish in schools. Nat History. 1975;84(8):40–6.
- 90.Momen S, Amavasai BP, Siddique NH. Mixed species flocking for heterogenous robotic swarms. In: The international conference on computer as a tool (EUROCON 2007), Piscataway, NJ. IEEE Press; 2007, p. 2329–36.
- 93.Pinto T, Praça I, Vale Z, Morais H, Sousa TM. Strategic bidding in electricity markets: an agent-based simulator with game theory for scenario analysis. Integr Comput Aided Eng. 2013;20(4):335–46.
- 96.MacArthur R, Wilson E. Theory of biogeography. Princeton: Princeton University Press; 1967.
- 101.Smith AR. Plants, fractals, and formal languages. In: Proceedings of SIGGRAPH’84 in computer graphics, ACM SIGGRAPH, Minneapolis, Minnesota, July 22–27, 1984, p. 1–10.
- 102.Chen J, Wu T. A computational intelligence optimization algorithm: cloud drops algorithm. Integr Comput Aided Eng. 2014;21(2):177–88.
- 106.Pham DT, Ghanbarzadeh A, Koc E, Otri S, Rahim S, Zaidi M. The bees algorithm, technical note. Manufacturing Engineering Centre, Cardiff University, UK; 2005.
- 107.Karaboga D. An idea based on honey bee swarm for numerical optimisation, technical report TR06. Erciyes University, Turkey; 2005.
- 110.Yang XS, Deb S. Engineering optimisation by cuckoo search. Int J Math Modell Numer Optim. 2010;1(4):330–43.
- 111.Yang X-S. Firefly algorithms for multimodal optimization. In: Stochastic algorithms: foundations and applications, SAGA 2009. Lect Notes Comput Sci. 2009;5792:169–78.
- 126.Adeli H, Hung SL. Machine learning—neural networks, genetic algorithms, and fuzzy sets. New York: Wiley; 1995.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.