Artigo Elsevier 2014 Gray Wolf Optimizer

March 17, 2018 | Author: Andrey Lopes | Category: Metaheuristic, Pack (Canine), Mathematical Optimization, Applied Mathematics, Mathematics


Comments



Description

Advances in Engineering Software 69 (2014) 46–61Contents lists available at ScienceDirect Advances in Engineering Software journal homepage: www.elsevier.com/locate/advengsoft Grey Wolf Optimizer Seyedali Mirjalili a,⇑, Seyed Mohammad Mirjalili b, Andrew Lewis a a b School of Information and Communication Technology, Griffith University, Nathan Campus, Brisbane QLD 4111, Australia Department of Electrical Engineering, Faculty of Electrical and Computer Engineering, Shahid Beheshti University, G.C. 1983963113, Tehran, Iran a r t i c l e i n f o Article history: Received 27 June 2013 Received in revised form 18 October 2013 Accepted 11 December 2013 Available online 21 January 2014 Keywords: Optimization Optimization techniques Heuristic algorithm Metaheuristics Constrained optimization GWO a b s t r a c t This work proposes a new meta-heuristic called Grey Wolf Optimizer (GWO) inspired by grey wolves (Canis lupus). The GWO algorithm mimics the leadership hierarchy and hunting mechanism of grey wolves in nature. Four types of grey wolves such as alpha, beta, delta, and omega are employed for simulating the leadership hierarchy. In addition, the three main steps of hunting, searching for prey, encircling prey, and attacking prey, are implemented. The algorithm is then benchmarked on 29 well-known test functions, and the results are verified by a comparative study with Particle Swarm Optimization (PSO), Gravitational Search Algorithm (GSA), Differential Evolution (DE), Evolutionary Programming (EP), and Evolution Strategy (ES). The results show that the GWO algorithm is able to provide very competitive results compared to these well-known meta-heuristics. The paper also considers solving three classical engineering design problems (tension/compression spring, welded beam, and pressure vessel designs) and presents a real application of the proposed method in the field of optical engineering. The results of the classical engineering design problems and real application prove that the proposed algorithm is applicable to challenging problems with unknown search spaces. Ó 2013 Elsevier Ltd. All rights reserved. 1. Introduction Meta-heuristic optimization techniques have become very popular over the last two decades. Surprisingly, some of them such as Genetic Algorithm (GA) [1], Ant Colony Optimization (ACO) [2], and Particle Swarm Optimization (PSO) [3] are fairly well-known among not only computer scientists but also scientists from different fields. In addition to the huge number of theoretical works, such optimization techniques have been applied in various fields of study. There is a question here as to why meta-heuristics have become remarkably common. The answer to this question can be summarized into four main reasons: simplicity, flexibility, derivation-free mechanism, and local optima avoidance. First, meta-heuristics are fairly simple. They have been mostly inspired by very simple concepts. The inspirations are typically related to physical phenomena, animals’ behaviors, or evolutionary concepts. The simplicity allows computer scientists to simulate different natural concepts, propose new meta-heuristics, hybridize two or more meta-heuristics, or improve the current meta-heuristics. Moreover, the simplicity assists other scientists to learn metaheuristics quickly and apply them to their problems. Second, flexibility refers to the applicability of meta-heuristics to different problems without any special changes in the structure of the algorithm. Meta-heuristics are readily applicable to different ⇑ Corresponding author. Tel.: +61 434555738. E-mail addresses: seyedali.mirjalili@griffithuni.edu.au (S. Mirjalili), [email protected] (S.M. Mirjalili), a.lewis@griffith.edu.au (A. Lewis). 0965-9978/$ - see front matter Ó 2013 Elsevier Ltd. All rights reserved. http://dx.doi.org/10.1016/j.advengsoft.2013.12.007 problems since they mostly assume problems as black boxes. In other words, only the input(s) and output(s) of a system are important for a meta-heuristic. So, all a designer needs is to know how to represent his/her problem for meta-heuristics. Third, the majority of meta-heuristics have derivation-free mechanisms. In contrast to gradient-based optimization approaches, meta-heuristics optimize problems stochastically. The optimization process starts with random solution(s), and there is no need to calculate the derivative of search spaces to find the optimum. This makes meta-heuristics highly suitable for real problems with expensive or unknown derivative information. Finally, meta-heuristics have superior abilities to avoid local optima compared to conventional optimization techniques. This is due to the stochastic nature of meta-heuristics which allow them to avoid stagnation in local solutions and search the entire search space extensively. The search space of real problems is usually unknown and very complex with a massive number of local optima, so meta-heuristics are good options for optimizing these challenging real problems. The No Free Lunch (NFL) theorem [4] is worth mentioning here. This theorem has logically proved that there is no meta-heuristic best suited for solving all optimization problems. In other words, a particular meta-heuristic may show very promising results on a set of problems, but the same algorithm may show poor performance on a different set of problems. Obviously, NFL makes this field of study highly active which results in enhancing current approaches and proposing new meta-heuristics every year. This also In this field of study different ecosystems (habitats or territories) are investigated for finding the relations between different species (habitants) in terms of immigration. This movement is implemented. using gravitational force. Ray Optimization (RO) [29] algorithm. and SI algorithms. whereas Evolutionary Algorithms (EA) discard the information of the previous generations. This algorithm was proposed by Holland in 1992 [13] and simulates Darwnian evolution concepts. However. The second main branch of meta-heuristics is physics-based techniques. emigration. Galaxy-based Search Algorithm (GbSA) [31]. the bigger the attractive force. in that a random set of search agents communicate and move throughout search space according to physical rules. The rest of the paper is organized as follows: Section 2 presents a literature review of SI techniques. and Curved Space Optimization (CSO) [32].  SI algorithms often utilize memory to save the best solution obtained so far. The engineering applications of GA were extensively investigated by Goldberg [14]. The inspirations of SI techniques originate mostly from natural colonies. PSO [3]. herds. Regardless of the differences between the meta-heuristics. The evolution of ecosystems (considering different kinds of species such as predator and prey) over migration and mutation to reach a stable situation was the main inspiration of the BBO algorithm. SI is ‘‘The emergent collective intelligence of groups of simple agents’’. The most popular SI technique is PSO. Populationbased meta-heuristics. EAs are usually 47 inspired by the concepts of evolution in nature. The search agents of BBBC are scattered from a point in random directions in a search space according to the principles of the big bang theory. An algorithm needs to have stochastic operators to randomly and globally search the search space in order to support this phase. 2. The PSO algorithm was proposed by Kennedy and Eberhart [3] and inspired from the social behavior of birds flocking. the masses are attracted to each other by the gravitational forces between them. however. or millennia. This can guarantee that the initial random population is optimized over the course of generations. Central Force Optimization (CFO) [26]. Generally speaking. [1].  SI algorithms are easy to implement. lands.  Multiple candidate solutions assist each other to avoid locally optimal solutions. In the former class (Simulated Annealing [5] for instance) the search process starts with one candidate solution. Mirjalili et al. attracts the other masses in proportion to their distances. mutation. and schools. The most popular algorithm in this branch is GA. The heavier the mass. Finding a proper balance between these two phases is considered a challenging task due to the stochastic nature of meta-heuristics. ray casting. or even continents over decades. In this case the search process starts with a random initial population (multiple solutions). Evolutionary Programing (EP) [16. The case studies might include different islands. Populationbased meta-heuristics have some advantages compared to single solution-based algorithms:  Multiple candidate solutions share information about the search space which results in sudden jumps toward the promising part of search space. Some of the advantages of SI algorithms are:  SI algorithms preserve information about the search space over the course of iteration. electromagnetic force. Big-Bang Big-Crunch (BBBC) [23]. centuries. For example. Some of the most popular algorithms are Gravitational Local Search (GLSA) [22]. elitism. and a real application are presented in Sections 4-6. but the search agents navigate using the simulated collective and social intelligence of creatures. herds. Section 3 outlines the proposed GWO algorithm. Gravitational Search Algorithm (GSA) [24]. One of the interesting branches of the population-based metaheuristics is Swarm Intelligence (SI).  SI algorithms usually have fewer parameters to adjust. and Artificial Bee Colony (ABC) [7]. Some of the most popular SI techniques are ACO [2]. During iteration. physics-based. The third subclass of meta-heuristics is the SI methods. and so on).19]. The results and discussion of benchmark functions. The basic physical theory from which GSA is inspired is Newton’s law of universal gravitation. According to Bonabeau et al. and Biogeography-Based Optimizer (BBO) [21]. inertia force. Artificial Chemical Reaction Optimization Algorithm (ACROA) [27]. Some of the EAs are Differential Evolution (DE) [15]. A comprehensive literature review of the SI algorithms is provided in the next section. the new population is likely to be better than the previous generation(s). This work proposes a new SI technique with inspiration from the social hierarchy and hunting behavior of grey wolf packs. Since the best individuals have higher probability of participating in generating the new population. for example. Therefore. and this population is enhanced over the course of iterations. the BBO algorithm was first proposed by Simon in 2008 [21]. Charged System Search (CSS) [25]. the BBBC algorithm was inspired by the big bang and big crunch theories. The exploration phase refers to the process of investigating the promising area(s) of the search space as broadly as possible. This single candidate solution is then improved over the course of iterations.  SI algorithms have less operators compared to evolutionary approaches (crossover. / Advances in Engineering Software 69 (2014) 46–61 motivates our attempts to develop a new meta-heuristic with inspiration from grey wolves. The PSO algorithm employs multiple particles that chase the position of the best particle and their . flock. which is possibly close to the global optimum. Such optimization algorithms typically mimic physical rules. Literature review Meta-heuristics may be classified into three main classes: evolutionary. They search randomly and then gather in a final point (the best point obtained so far) according to the principles of the big crunch theory. and so on. The mechanism is almost similar to physics-based algorithm. the optimization is done by evolving an initial random solution in EAs. weights. or schools of creatures in nature. The basic idea of this algorithm has been inspired by biogeography which refers to the study of biological organisms in terms of geographical distribution (over time and space). semi-real problems. Genetic Programming (GP) [20]. perform the optimization using a set of solutions (population). As an example. and Evolution Strategy (ES) [18.17]. respectively. exploitation refers to the local search capability around the promising regions obtained in the exploration phase. The GSA algorithm performs search by employing a collection of agents that have masses proportional to the value of a fitness function. the heaviest mass. Finally. The concepts of SI was first proposed in 1993 [6].S. and mutation.  Population-based meta-heuristics generally have greater exploration compared to single solution-based algorithms. Small-World Optimization Algorithm (SWOA) [30]. The mechanism of these algorithms is different from EAs. Each new population is created by the combination and mutation of the individuals in the previous generation. Section 7 concludes the work and suggests some directions for future studies. a common feature is the division of the search process into two phases: exploration and exploitation [8–12]. GSA is another physics-based algorithm. These algorithms mostly mimic the social behavior of swarms. Generally speaking. flocks. Black Hole (BH) [28] algorithm. meta-heuristics can be divided into two main classes: single-solution-based and population-based. encircling. however. There are three types of bees in ABS: scout. Of particular interest is that they have a very strict social dominant hierarchy as shown in Fig.  Krill Herd (KH) in 2012 [44]. The beta reinforces the alpha’s commands throughout the pack and gives feedback to the alpha. been observed. sentinels. Another popular SI algorithm is ACO. According to Muro et al.  Bird Mating Optimizer (BMO) in 2012 [43]. This algorithm was inspired by the social behavior of ants in an ant colony. inspired by the echolocation behavior of bats. meaning that they are at the top of the food chain.48 S. In some cases the omega is also the babysitters in the pack. he/she is called subordinate (or delta in some references).  Dolphin Partner Optimization (DPO) in 2009 [41]. the social intelligence of ants in finding the shortest path between the nest and a source of food is the main inspiration of ACO. and approaching the prey. and caretakers belong to this category. They are different in terms of size and weight. Interestingly. 1. elders. in 2006 [2].  Bee Collecting Pollen Algorithm (BCPA) in 2008 [39]. but it has been observed that the whole pack face internal fighting and problems in case of losing the omega. hunters. some kind of democratic behavior has also Fig. This assists satisfying the entire pack and maintaining the dominance structure. the caretakers are responsible for caring for the weak.  Pursuing. This motivated our attempt to mathematically model the social behavior of grey wolves. Scouts. many of them inspired by hunting and search behaviors. and harassing the prey until it stops moving. [47] the main phases of grey wolf hunting are as follows:  Tracking. but they dominate the omega. The omega plays the role of scapegoat. mimicking the collective behavior of bees in finding food sources. The beta wolf should respect the alpha.  Termite Algorithm in 2005 [36].  Cuckoo Search (CS) in 2009 [40]. onlooker. / Advances in Engineering Software 69 (2014) 46–61 own best positions obtained so far. The alpha is mostly responsible for making decisions about hunting. in which an alpha follows the other wolves in the pack. a particle is moved considering its own best solution as well as the best solution the swarm has obtained.  Attack towards the prey. They are the last wolves that are allowed to eat. and so on. the alpha is not necessarily the strongest member of the pack but the best in terms of managing the pack. Grey wolves are considered as apex predators. The group size is 5–12 on average. To the best of our knowledge.  Artificial Fish-Swarm Algorithm (AFSA) in 2003 [35]. Bats utilize natural sonar in order to do this. In other words.  Firefly Algorithm (FA) in 2010 [42]. . The rest of the SI techniques proposed so far are as follows:  Marriage in Honey Bees Optimization Algorithm (MBO) in 2001 [34]. 1. whereas onlooker and employed bees exploit the promising solutions found by scout bees. proposed by Dorigo et al.1. the mathematical model is provided. sleeping place. Finally. Elders are the experienced wolves who used to be alpha or beta. Scouts are responsible for watching the boundaries of the territory and warning the pack in case of any danger. In this work this hunting technique and the social hierarchy of grey wolves are mathematically modeled in order to design GWO and perform optimization. The lowest ranking grey wolf is omega. Grey wolves mostly prefer to live in a pack. but they all have quite similar behaviors when navigating and hunting. In gatherings. well known for their pack hunting. Then. and wounded wolves in the pack. called alphas. The ABC is another popular algorithm. The alpha’s decisions are dictated to the pack. Mirjalili et al. This behavior has been mathematically modeled for the BA algorithm. A pheromone matrix is evolved over the course of iteration by the candidate solutions. or omega. Inspiration Grey wolf (Canis lupus) belongs to Canidae family. It may seem the omega is not an important individual in the pack. This list shows that there are many SI techniques proposed so far. and he/she is probably the best candidate to be the alpha in case one of the alpha wolves passes away or becomes very old. If a wolf is not an alpha. Sentinels protect and guarantee the safety of the pack. Finally. chasing. the entire pack acknowledges the alpha by holding their tails down. This is due to the venting of violence and frustration of all wolves by the omega(s). There are many types of bats in the nature. and investigate its abilities in solving benchmark and real problems. propose a new SI algorithm inspired by grey wolves. has been proposed recently [33]. ill. The scout bees are responsible for exploring the search space. The two main characteristics of bats when finding prey have been adopted in designing the BA algorithm. Grey Wolf Optimizer (GWO) In this section the inspiration of the proposed method is first discussed.  Fruit fly Optimization Algorithm (FOA) in 2012 [45]. This shows that the organization and discipline of a pack is much more important than its strength.  Wasp Swarm Algorithm in 2007 [37]. Omega wolves always have to submit to all the other dominant wolves. In addition to the social hierarchy of wolves. The alpha wolves are only allowed to mate in the pack. However. The beta wolf can be either male or female. beta. Hunters help the alphas and betas when hunting prey and providing food for the pack. These steps are shown in Fig. In fact. The alpha wolf is also called the dominant wolf since his/her orders should be followed by the pack [46]. It plays the role of an advisor to the alpha and discipliner for the pack. Bats tend to decrease the loudness and increase the rate of emitted ultrasonic sound when they chase prey. and employed bees. group hunting is another interesting social behavior of grey wolves. Hierarchy of grey wolf (dominance decreases from top down). but commands the other lower-level wolves as well. 2. there is no SI technique in the literature mimicking the leadership hierarchy of grey wolves. time to wake. The second level in the hierarchy of grey wolves is beta. The leaders are a male and a female. the Bat-inspired Algorithm (BA). 3. Delta wolves have to submit to alphas and betas. The betas are subordinate wolves that help the alpha in decision-making or other pack activities. 3.  Monkey Search in 2007 [38]. 2. 1]. we consider the fittest solution as the alpha (a). The possible updated positions of a grey wolf in 3D space are depicted in Fig. Then the GWO algorithm is outlined. Hunting behavior of grey wolves: (A) chasing. grey wolves encircle prey during the hunt. we save the first three best solutions obtained so far and oblige the other search agents (including the omegas) to update their positions according to the position of the best search agents. 3.1. beta. beta. ~ Xb  ~ Db Þ. Y) can update its position according to the position of the prey (X. 3(b). (X–X.S. ~ Db ¼ j~ Xb  ~ Xj. and ~ X indicates the position vector of a grey wolf. and tracking prey (B–D) pursuiting. Social hierarchy In order to mathematically model the social hierarchy of wolves when designing GWO. . In order to mathematically simulate the hunting behavior of grey wolves. the next position of a search agent can be in any position Fig. In order to mathematically model encircling behavior the following equations are proposed: ~ D ¼ j~ X p ðtÞ  ~ XðtÞj C ~ ~ Xðt þ 1Þ ¼ ~ X p ðtÞ  ~ D A~ ð3:1Þ 49 points illustrated in Fig. When random values of ~ A are in [1.2.2).3. ~ Da ¼ j~ Xa  ~ Xj.1) and (3.2. Therefore. Hunting Grey wolves have the ability to recognize the location of prey and encircle them.2. 3. The hunt is usually guided by the alpha. Consequently. Y) can be reached by setting ~ A ¼ ð1. approaching. r2 are random vectors in [0. Attacking prey (exploitation) As mentioned above the grey wolves finish the hunt by attacking the prey when it stops moving. and encircling (E) stationary situation and attack [47]. ~ Xd  ~ Dd Þ X2 ¼ ~ X3 ¼ ~ A1  ð~ A2  ð~ A3  ð~ ð3:6Þ ~ X1 þ ~ X2 þ ~ X3 ~ Xðt þ 1Þ ¼ 3 ð3:7Þ ð3:2Þ where t indicates the current iteration. 3. encircling. 3(a). It can be observed that the final position would be in a random place within a circle which is defined by the positions of alpha. As can be seen in this figure. and d. (3. In order to mathematically model approaching the prey we decrease the value of ~ a. tracking. The x wolves follow these three wolves. 3. and delta in a 2D search space.2. a two-dimensional position vector and some of the possible neighbors are illustrated in Fig. Note that the random vectors r1 and r2 allow wolves to reach any position between the Fig. 4 shows how a search agent updates its position according to alpha. in an abstract search space we have no idea about the location of the optimum (prey). Different places around the best agent can be reached with respect to the current position by adjusting the value of ~ A and ~ C vectors. and attacking prey are provided. In other words alpha. Note that the fluctuation range of ~ a. 2. Encircling prey As mentioned above. the second and third best solutions are named beta (b) and delta (d) respectively. X p is the position vector of the prey. 1Þ. For instance. ~ A and ~ C are coefficient vec~ tors. To see the effects of Eqs. we suppose that the alpha (best candidate solution) beta. Mathematical model and algorithm In this subsection the mathematical models of the social hierarchy. and the grey wolves will move in hyper-cubes (or hyper-spheres) around the best solution obtained so far.1) and (3.2. The beta and delta might also participate in hunting occasionally. and delta in the search space. harassing. 2a] where a is decreased from 2 to 0 over the course of iterations. beta. b. The rest of the candidate solutions are assumed to be omega (x). 1]. The same concept can be extended to a search space with n dimensions.4. ~ Dd ¼ j~ Xd  ~ Xj C1  ~ C2  ~ C3  ~ ð3:5Þ ~ X1 ¼ ~ Xa  ~ Da Þ.2). and delta estimate the position of the prey. So a grey wolf can update its position inside the space around the prey in any random location by using Eqs. However. Y). The following formulas are proposed in this regard. The vectors ~ A and ~ C are calculated as follows: ~ a ~ a A ¼ 2~ r1  ~ ð3:3Þ ~ r2 C ¼ 2 ~ ð3:4Þ a are linearly decreased from 2 to 0 over the where components of ~ course of iterations and r1. (3. and other wolves updates their positions randomly around the prey. 0Þ and ~ C ¼ ð1. 3. a grey wolf in the position of (X. In the GWO algorithm the hunting (optimization) is guided by a. Mirjalili et al. and delta have better knowledge about the potential location of prey. In other words ~ A is also decreased by ~ A is a random value in the interval [2a. / Advances in Engineering Software 69 (2014) 46–61 3. This assists GWO to show .Z) X*-X (X*-X.Y.4).5.Z*) (X*.Y) (X*-X.Y.Z*-Z) (X.Y. / Advances in Engineering Software 69 (2014) 46–61 (X*-X. and delta. a1 C1 a2 C2 R Dalpha Dbeta Move Ddelta a3 C3 or any other hunters Estimated position of the prey Fig. Another component of GWO that favors exploration is ~ C.Z*) (X*. This emphasizes exploration and allows the GWO algorithm to search globally. we utilize ~ A with random values greater than 1 or less than -1 to oblige the search agent to diverge from the prey. 3. Fig.Z*-Z) (X.Y.Y*.Y. They diverge from each other to search for prey and converge to attack prey. 4.Y*-Y. 1 If |> |A 1 If (a) |< |A (b) Fig. As may be seen in Eq.Y*-Y. Attacking prey versus searching for prey.Y*) (X.Y) (X.Y*-Y.Y*.Z*) (X. and delta.Z*-Z) (X.Z) Y*-Y (X*.Z*) (X*.Z-Z*) (X. Position updading in GWO. the ~ C vector contains random values in [0.1). between its current position and the position of the prey.Y. 2]. Mirjalili et al. This component provides random weights for prey in order to stochastically emphasize (C > 1) or deemphasize (C < 1) the effect of prey in defining the distance in Eq. 5(a) shows that |A| < 1 forces the wolves to attack towards the prey.Z*-Z) (X. 2D and 3D position vectors and their possible next locations. the GWO algorithm is prone to stagnation in local solutions with these operators. In order to mathematically model divergence. but GWO needs more operators to emphasize exploration.Y. With the operators proposed so far.Z*-Z) (b) Fig.Y*-Y) (X*-X.Y) (X*. 3.Y.2. (3. (3. Search for prey (exploration) Grey wolves mostly search according to the position of the alpha.Y*. beta. and attack towards the prey.Z) (X. the GWO algorithm allows its search agents to update their position based on the location of the alpha.Z*-Z) (a) (X. It is true that the encircling mechanism proposed shows exploration to some extent.Y.Y*. 5(b) also shows that |A| > 1 forces the grey wolves to diverge from the prey to hopefully find a fitter prey.Y*.Y*) (X*-X.Y.Z*-Z) (X*.Z*) (X*-X.Y*-Y) (X*-X. However. 5.Z*-Z) (X.Z) (X*.Y*-Y) (X*.Y*-Y. Fig.Z*) (X*-X.50 S.Y*.Y*) (X.Z) (X*. beta. For verifying the results.  The GWO has only two main parameters to be adjusted (a and C). the GWO algorithm is terminated by the satisfaction of an end criterion. Fast Evolutionary Programing (FEP) [16].au/matlabcentral/fileexchange/44974. and F7. There are possibilities to integrate mutation and other evolutionary operators to mimic the whole life cycle of grey wolves. we have chosen these test functions to be able to compare our results to those of the current meta-heuristics. the search process starts with creating a random population of grey wolves (candidate solutions) in the GWO algorithm. Results and discussion In this section the GWO algorithm is benchmarked on 29 benchmark functions. Over the course of iterations. the obstacles in nature appear in the hunting paths of wolves and in fact prevent them from quickly and conveniently approaching prey. Function P f1 ðxÞ ¼ ni¼1 x2i P Q f2 ðxÞ ¼ ni¼1 jxi j þ ni¼1 jxi j Pn Pi 2 f3 ðxÞ ¼ i¼1 ð j1 xj Þ Fig. Generally speaking.  The proposed hunting method allows candidate solutions to locate the probable position of the prey. Generally speaking. The parameter a is decreased from 2 to 0 in order to emphasize exploration and exploitation. The first 23 benchmark functions are the classical functions utilized by many researchers [16. 7–10 illustrate the 2D versions of the benchmark functions used. Finally. This algorithm outperforms all others in F1. To sum up. Exploration analysis In contrast to the unimodal functions. In addition. the benchmark functions used are minimization functions and can be divided into four groups: unimodal. 1Þ Dim Range 30 [100. Therefore. the GWO algorithm is compared to PSO [3] as an SI-based technique and GSA [24] as a physics-based algorithm. fixed-dimension multimodal. where Dim indicates dimension of the function. Mirjalili et al.com. This is exactly what the vector C does.mathworks.  With decreasing A. To see how GWO is theoretically able to solve optimization problems. multimodal. F2.2.com/GWO. 100] 0 0 30 30 [100. 4. It may be noted that the unimodal functions are suitable for benchmarking exploitation. rotated. half of the iterations are devoted to exploration (|A| P 1) and the other half are dedicated to exploitation (|A| < 1).28. respectively. expanded. and combined variants of the classical functions which offer the greatest complexity among the current benchmark functions [53]. and Evolution Strategy with Covariance Matrix Adaptation (CMA-ES) [18]. and delta wolves estimate the probable position of the prey. 100] 0 30 [1. It is worth mentioning here that C is not linearly decreased in contrast to A. Tables 4 lists the CEC 2005 test functions. Exploitation analysis According to the results of Table 5. 6.  Exploration and exploitation are guaranteed by the adaptive values of a and A. The source codes of this algorithm can be found in http://www. 4. The GWO algorithm was run 30 times on each benchmark function. This makes them suitable for benchmarking the Table 1 Unimodal benchmark functions. 100] [30. and fmin is the optimum.  The adaptive values of parameters a and A allow GWO to smoothly transition between exploration and exploitation. some points may be noted:  The proposed social hierarchy assists GWO to save the best solutions obtained so far over the course of iteration. especially in the final iterations. and composite functions. We deliberately require C to provide random values at all times in order to emphasize exploration not only during initial iterations but also final iterations.82]. the GWO algorithm is compared with three EAs: DE [15]. Depending on the position of a wolf. Candidate solutions tend to diverge from the prey when j~ Aj > 1 and converge towards the prey when j~ Aj < 1.  The random parameters A and C assist candidate solutions to have hyper-spheres with different random radii.28] 0 .1. Note that a detailed descriptions of the composite benchmark functions are available in the CEC 2005 technical report [52]. Range is the boundary of the function’s search space.51 S. Pseudo code of the GWO algorithm. Figs.48–51. Despite the simplicity. alpha. Range is the boundary of the function’s search space. multimodal functions have many local optima with the number increasing exponentially with dimension. The pseudo code of the GWO algorithm is presented in Fig. or vice versa. 10] [100. Each candidate solution updates its distance from the prey. 30] 0 0 30 [100. 4. 6.  The proposed encircling mechanism defines a circle-shaped neighborhood around the solutions which can be extended to higher dimensions as a hyper-sphere. This is due to the proposed exploitation operators previously discussed. However. The other test beds that we have chosen are six composite benchmark functions from a CEC 2005 special session [52]. it can randomly give the prey a weight and make it harder and farther to reach for wolves. These benchmark functions are listed in Tables 1–3 where Dim indicates dimension of the function. we have kept the GWO algorithm as simple as possible with the fewest operators to be adjusted. Such mechanisms are recommended for future work. 1. The statistical results (average and standard deviation) are reported in Tables 5–8. GWO is able to provide very competitive results.alimirjalili. These benchmark functions are the shifted. f4 ðxÞ ¼ maxi fjxi j. This component is very helpful in case of local optima stagnation. 1 6 i 6 ng P 2 2 2 f5 ðxÞ ¼ n1 i¼1 ½100ðxiþ1  xi Þ þ ðxi  1Þ  Pn 2 f6 ðxÞ ¼ i¼1 ð½xi þ 0:5Þ P f7 ðxÞ ¼ ni¼1 ix4i þ random½0. beta.html and http:// www. 100] fmin 0 30 30 [10. favoring exploration and local optima avoidance. The C vector can be also considered as the effect of obstacles to approaching prey in nature. / Advances in Engineering Software 69 (2014) 46–61 a more random behavior throughout optimization. these results show the superior performance of GWO in terms of exploiting the optimum. and fmin is the optimum. 50] 0 30 [50. This superior capability is due to the adaptive value of A. 50] 0 30 [0. 5] 0. This demonstrates that GWO shows a good balance between exploration and exploitation that results in high local optima avoidance. 100. Function pffiffiffiffiffiffiffi P F 8 ðxÞ ¼ ni¼1  xi sinð jxi jÞ Pn F 9 ðxÞ ¼ i¼1 ½x2i  10 cosð2pxi Þ þ 10 qffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffi ffi   P  P F 10 ðxÞ ¼ 20 exp 0:2 1n ni¼1 x2i  exp 1n ni¼1 cosð2pxi Þ þ 20 þ e   Pn 2 Q n xiffi 1 p þ1 F 11 ðxÞ ¼ 4000 i¼1 xi  i¼1 cos i Pn1 P p F 12 ðxÞ ¼ n f10 sinðpy1 Þ þ i¼1 ðyi  1Þ2 ½1 þ 10 sin2 ðpyiþ1 Þ þ ðyn  1Þ2 g þ ni¼1 uðxi . So. the search history and trajectory of the first search agent in its first dimension are illustrated in Fig. exploration and exploitation can be simultaneously benchmarked by the composite functions. Moreover. the local optima avoidance of an algorithm can be examined due to the massive number of local optima in such test functions.12. mÞ ¼ 30 [20. 2 [5. 100. Convergence behavior analysis In this subsection the convergence behavior of GWO is investigated. According to Table 8. 4Þ     2m 2 P i:x F 14 ðxÞ ¼  ni¼1 sinðxi Þ  sin pi . [54]. generally very challenging test beds for meta-heuristic algorithms. m ¼ 10 Pn 2 i Q h Pn 2m F 15 ðxÞ ¼ e i¼1 ðxi =bÞ  2e i¼1 xi  ni¼1 cos2 xi . Function F 14 ðxÞ ¼ F 15 ðxÞ ¼  1 500 P11 þ i¼1 P25 j¼1 jþ P2 1 ðxi aij Þ6 1 Dim Range fmin 2 [65. p] 4. The GWO algorithm also provides very competitive results on the remaining composite benchmark functions.3.1532 4 [0.32 4 [0. This mechanism assists GWO to provide very good exploration. It can be seen that there are abrupt changes in the initial steps of iterations which are decreased gradually over the course of iterations.398 2 [2. / Advances in Engineering Software 69 (2014) 46–61 Table 2 Multimodal benchmark functions. The second column of Fig. These results show that the GWO algorithm has merit in terms of exploration. 11. and outperforms them occasionally. It may be observed that the search agents of GWO tend to extensively search promising regions of the search spaces and exploit the best one. in which changes of the first search agent in its first dimension can be observed. and we used six search agents to find the optima. this behavior can guarantee that a SI algorithm eventually convergences to a point in search space. Moreover. these changes should be reduced to emphasize exploitation at the end of optimization. 1] 3. 5] 1. 11 depicts the search history of the search agents. GWO is able to provide very competitive results on the multimodal benchmark functions as well. 600] 0 30 [50. and exploitation simultaneously. 5] 0. Then. 10] 10. 10] 1 Table 3 Fixed-dimension multimodal benchmark functions. This algorithm outperforms PSO and GSA on the majority of the multimodal functions. 32] 0 30 [600. According to Berg et al. 10] 10. Mirjalili et al. 10. In order to observe the convergence behavior of the GWO algorithm. GWO outperforms all others on half of the composite benchmark functions. GWO shows very competitive results compare to DE and FEP. the results verify the performance of the GWO algorithm in solving various benchmark functions compared to wellknown meta-heuristics.4.86 6 [0. 5. 3] 3.687 xi þ1 4 8 xi > a < kðxi  aÞm 0 a < xi < a : m xi < a kðxi  aÞ P P F 13 ðxÞ ¼ 0:1fsin2 ð3px1 Þ þ ni¼1 ðxi  1Þ2 ½1 þ sin2 ð3pxi þ 1Þ þ ðxn  1Þ2 ½1 þ sin2 ð2pxn Þg þ ni¼1 uðxi . 4Þ yi ¼ 1 þ Dim Range fmin 30 [500.5363 4.4028 4 [0.12] 0 30 [32. According to Berg et al. local minima avoidance. 500] 418. The animated versions of this figure can be found in Supplementary Materials. 11 shows the trajectory of the first particle. 10] 10. To further investigate the performance of . there should be abrupt changes in the movement of search agents over the initial steps of optimization. Local minima avoidance The fourth class of benchmark functions employed includes composite functions. half of the iterations are devoted to exploration (|A| P 1) and the rest to exploitation (|A| < 1).0316 2 [5. 2] 3 3 [1. To sum up. 4. As mentioned above.52 S. 65] 1 4 [5. In addition.00030 i¼1 2 x ðb2 þb x Þ ai  12 i i 2 bi þbi x3 þx4 F 16 ðxÞ ¼ 4x21  2:1x41 þ 13 x61 þ x1 x2  4x22 þ 4x42  2   2 5 F 17 ðxÞ ¼ x2  45:1 þ 10 1  81p cos x1 þ 10 p2 x1 þ p x1  6 F 18 ðxÞ ¼ ½1 þ ðx1 þ x2 þ 1Þ2 ð19  14x1 þ 3x21  14x2 þ 6x1 x2 þ 3x22 Þ  ½30 þ ð2x1  3x2 Þ2  ð18  32x1 þ 12x21 þ 48x2  36x1 x2 þ 27x22 Þ P P F 19 ðxÞ ¼  4i¼1 ci expð 3j¼1 aij ðxj  pij Þ2 Þ P4 P6 F 20 ðxÞ ¼  i¼1 ci expð j¼1 aij ðxj  pij Þ2 Þ P 1 F 21 ðxÞ ¼  5i¼1 ½ðX  ai ÞðX  ai ÞT þ ci  P7 1 T F 22 ðxÞ ¼  i¼1 ½ðX  ai ÞðX  ai Þ þ ci  P10 1 T F 23 ðxÞ ¼  i¼1 ½ðX  ai ÞðX  ai Þ þ ci  exploration ability of an algorithm. m ¼ 5 pffiffiffiffiffiffiffi P P P 2 2 F 16 ðxÞ ¼ f½ ni¼1 sin ðxi Þ  expð ni¼1 x2i Þg  exp½ ni¼1 sin jxi j uðxi . 20] 1 30 [10. 5.9829  5 30 [5. [54]. a. This assists a meta-heuristic to explore the search space extensively. According to the results of Tables 6 and 7. the fourth column of Fig. k. Note that the benchmark functions are shifted in this section. 0:8. .3 . . / Advances in Engineering Software 69 (2014) 46–61 Table 4 Composite benchmark functions. 0. 1. 0:9. f4 = Weierstras’s Function f5. 0:7. f10 = Griewank’s Function ½. 1/5. the proposed algorithm. . GWO for classical engineering problems In this section three constrained engineering design problems: tension/compression spring..5. 1] F27(CF4): f1. k2. .10  ¼ ½1. k10] = [1. .2 .5. 1. . . . are employed.5  5/100.2 . k10] = [5/100. These problems have several equality and inequality constraints. f8 = Griewank’s Function f9. 1 [k1. 5/0. k10] = [5/32. . . however. f2. k10] = [0. 5] 0 10 [5. . . . k10] = [5/100. . 1 [k1. 5/32. 0:4. .1 . 5. 5] 0 10 [5.3 . f10 = Griewank’s Function ½. three classical engineering design problems and a real problem in optical engineering are employed in the following sections. . . f6 = Griewank’s Function f7.1 . 5/100] F25(CF2): f1.10  ¼ ½1. f4 = Rastrigin’s Function f5. 5/32.8  5/32.1 . welded beam.2  1/5.2 . Since the search agents of the proposed GWO algorithm update their positions with respect .2 .. f2 = Rastrigin’s Function f3. k2. . . . . 1. k3.5. . . 1 [k1.10  ¼ ½0:1.5.3 . k3 . 0. . . . 1. f2. 0. any kind of constraint handling can be employed without the need to modify the mechanism of the algorithm (GA and PSO for instance). . . f2 = Rastrigin’s Function f3. 0. . . f3. . 5] 0 10 [5. Mirjalili et al. k3. 5/100. 5/100. k3.. 1  5/100] (F1) (F2) (F5) (F3) (F6) Dim Range fmin 10 [5. .6  5/100. . k3.5. 1. . . so the GWO should be equipped with a constraint handling method to be able to optimize constrained problems as well. 1. . . . . . . 1 [k1. f3. . 1. . 1 [k1.10  ¼ ½1.. 1. 5/100.10  ¼ ½1. 5] 0 (F4) (F7) Fig. . f10 = Sphere Function ½.2 . 0. .2 . f10 = Sphere Function ½. 0:2. .. 5/0. 5/100. .53 S. . k2. 0. 7.. 5] 0 10 [5. . 5/100] F26(CF3): f1. 5/100] F28(CF5): f1. k2. 5] 0 10 [5. f4 = Weierstras’s Function f5. . For the fitness independent algorithms. f6 = Griewank’s Function f7. .1 . . 5/100. f8 = Ackley’s Function f9. . 1. k10] = [1/5. The GWO algorithm is also compared with well-known techniques to confirm its results. f10 = Sphere Function ½. . 2-D versions of unimodal benchmark functions. 1. . . . 1 [k1. f2 = Ackley’s Function f3. and pressure vessel designs. Function F24(CF1): f1. . .. . . . . . . 0:3.3 .10  ¼ ½1.3 . . . 5/100. .1 .5. . . . .1 . 1. . ..3  5/0. Generally speaking. 0. . f6 = Weierstras’s Function f7. . . .. 1. 0:5. 5/32. .3 .. f3. 5/100. . . 1. . k2.1  1/5. 5/0.. .9  5/100. . 0:6.4  5/0. . . constraint handling becomes very challenging when the fitness function directly affects the position updating of the search agents (GSA for instance). 5/100] f29(CF6): f1. k2. . . k3. 0. 5/100. 1. f8 = Ackley’s Function f9. .7  5/32. f2. 5/100.. 5/100. . . . . . f10 = Sphere Function ½. . . . 5/0. 042144 70.000126 0.9559 1. there is no direct relation between the search agents and the fitness function.3522 to the alpha.194074 318.816579 0.00077 0.9E14 9. beta.87 0 0. (F24) (F25) (F27) (F26) (F28) (F29) Fig. 10. In this case. 8.67E17 0.54 S. penalty functions.81258 0.53E16 0. 2-D versions of composite benchmark functions. Table 5 Results of unimodal benchmark functions. they .22534 1.055655 896.4E11 0 0 0 0.100286 0. or delta violate constraints.1415 0.18E17 3.11924 0.089441 9. So the simplest constraint handling method.317039 60.045421 22.122854 0.59E28 7.8E11 0 0 0 0. where search agents are assigned big objective function values if they violate any of the constraints.0081 0. Mirjalili et al. if the alpha.5E09 6.54309 2.014 0.002213 6.315088 69.9E10 7.3 5.741452 62.00057 0.0012 0.044957 2.04339 8.5347 7.35487 67.000202 0.29E06 5.00463 5.34E05 0.000102 0. 2-D version of fixed-dimension multimodal benchmark functions.029014 79. and delta locations. 2-D versions of multimodal benchmark functions.61E07 26.2E14 1.11559 8.74E16 0.12562 1.086481 96.14958 1.5E16 0.90499 0. F F1 F2 F3 F4 F5 F6 F7 GWO PSO GSA DE FEP Ave Std Ave Std Ave Std Ave Std Ave Std 6. (F14) (F16) (F17) (F18) Fig. can be employed effectively to handle constraints in GWO. 9.016 0.00013 0.000136 0.06 0 0.5 5.28E05 0. / Advances in Engineering Software 69 (2014) 46–61 (F8) (F9) (F10) (F12) (F11) (F13) Fig. beta.71832 0. Table 9 suggests that GWO finds a design with the minimum weight for this problem. 1 4x2 x1 x2 g2 ð~ xÞ ¼ 3 4 2 x1 x1 Þ 4x22 x1 x2 12566ðx2 x31 x41 Þ 1 þ 5108x 2 6 0.0021 0.29 46.560828 0. g 2 ð~ xÞ ¼ dð~ xÞ  dmax 6 0.397887 3 N/A N/A 10.12 3.03163 0.9E09 2E15 N/A N/A 0.17E15 2.060516 3.59 2.1514 10.000625 1. Subject to 2 3 g 1 ð~ xÞ ¼ 1  71785x 4 6 0. length of attached part of bar (l). ð5:1Þ 1 g3 ð~ xÞ ¼ 1  140:45x 6 0. g 4 ð~ xÞ ¼ P  P c ð~ xÞ 6 0.53 6.45653 9.5 0.95512 9.782786 5. f ð~ xÞ ¼ ðx3 þ 2Þx2 x21 . In this case.00032 4.1 0. and Differential Evolution (DE) [62] algorithms have also been employed as heuristic optimizers for this problem. We used simple.03163 0. 0:1 6 x3 6 10. and the number of active coils (N).0375 7.96841 0.007724 0. F F8 F9 F10 F11 F12 F13 GWO PSO GSA DE FEP Ave Std Ave Std Ave Std Ave Std Ave Std 6123.016 9.000337 1. Harmony Search (HS) [61].001647 4.2 9.018 0. mean coil diameter (D).23628 5.397887 3 3. Side constraints.1532 10.397887 3 3.006675 1152.57 0.1E14 574.003673 1. Buckling load on the bar (Pc).11 0. beta.58441 8.3E16 0.654464 4087.56 0.03 0. The mathematical formulation is as follows: x3 x 2 g2 ð~ xÞ ¼ 12566ðx mathematical approaches that have been adopted to solve this problem are the numerical optimization technique (constraints correction at constant cost) [55] and mathematical optimization technique [56].26634 6.627168 0. and thickness of the bar (b).07 25.397887 3 3. The mathematical formulation of this problem is as follows: Consider Minimize ~ x ¼ ½x1 x2 x3  ¼ ½dDN.5343 4.58E15 0.077835 0.019644 3.06E13 0. The objective of this problem is to minimize the fabrication cost of a welded beam as shown in Fig.470068 0.014088 2.004485 0.0000025 3.6E15 0.059 1.95291 2.899084 493. 1 þ 1 5108x21 6 0.03163 0.70154 1.009215 0.22 0.55 S. and minimum deflection.68447 10.737079 2. Mirjalili et al.087094 1.55899 3.397889 3.33E15 2.022 3. / Advances in Engineering Software 69 (2014) 46–61 Table 6 Results of multimodal benchmark functions.046 0. if the penalty makes the alpha. 0:25 6 x2 6 1:30. the height of the bar (t).310521 1.814 11.44 47.25056 9. 2:00 6 x3 6 15:0 This problem has been tackled by both mathematical and heuristic approaches.000014 0.000577 1.062087 27.5364 3.9E07 1.126241 11080.276015 0.859838 0. ðf ~ g 1 ð~ xÞ ¼ sð~ xÞ  smax 6 0. Welded beam design 5.70423 0.86278 3. xÞ ¼ x11:5 g4 ð~ Variable range 0:05 6 x1 6 2:00. The Evolution Strategy (ES) [59].998004 4. End deflection of the beam (d). x2 x 2 3 þx2  1 6 0. 13 [60].86278 3. The constraints are as follows:      Shear stress (s).31778 5.29E15 0.9E07 1.004474 4841. xÞ ¼ rð~ xÞ  rmax 6 0.8 4.008907 2821.00033 3.000222 6.1.5E14 1. scalar penalty functions for the rest of problems except the tension/compression spring design problem which uses a more complex penalty function. The minimization process is subject to some constraints such as shear stress.95114 7. it is automatically replaced with a new search agent in the next iteration.4029 10.86 3.28654 10. 0:1 6 x4 6 2 ð5:2Þ .042493 0.88E16 0 4.14015 8.020734 0.1 69.50901 0. g 3 ð~ xÞ ¼ x1  x4 6 0.012 0.006659 0. xÞ ¼ 1:10471x21 x2 þ 0:04811x3 x4 ð14:0 þ x2 Þ.9E07 1.1E13 9. There are three variables in this problem: wire diameter (d).35612 0.8E14 12554.4015 10.2E06 0. 5.000073 Table 7 Results of fixed-dimension multimodal benchmark functions.14 are automatically replaced with a new search agent in the next iteration. g 5 ð~ xÞ ¼ 0:125  x1 6 0 g 6 ð~ xÞ ¼ 1:10471x21 þ 0:04811x3 x4 ð14:0 þ x2 Þ  5:0 6 0 g 7 ð~ Variable range 0:1 6 x1 6 2.398 3.252799 0.7E08 0 7. Ha and Wang tried to solve this problem using PSO [58]. or delta less fit than any other wolves.52 5.00016 52.006917 0.9E15 5.6E06 0.799617 8.02 3. Bending stress in the beam (h).86278 3.27 5.053438 0.03163 0.040343 0. F F14 F15 F16 F17 F18 F19 F20 F21 F22 F23 GWO PSO GSA DE FEP Ave Std Ave Std Ave Std Ave Std Ave Std 4. The Consider Minimize Subject to ~ x ¼ ½x1 x2 x3 x4  ¼ ½hltb.2E08 0 8E15 4.6 0. 0:1 6 x2 6 10. surge frequency.7 38.831299 0. The comparison of results of these techniques and GWO are provided in Table 9.25E16 0 1.03163 0.5364 3. Note that we use a similar penalty function for GWO to perform a fair comparison [63].000028 3. GA [60]. Tension/compression spring design The objective of this problem is to minimize the weight of a tension/compression spring as illustrated in Fig. Any kind of penalty function can readily be employed in order to penalize search agents based on their level of violation.5E07 0.023081 3.8651 8. 12 [55–57].026301 0.2. This problem has four variables such as thickness of weld (h).62938 0.86263 3.0005 1. 91 172.56 151 74.176 32. Inner radius (R).3 900. 14. s ¼ J . ð5:3Þ Variable range 0 6 x1 6 99. 4G dmax ¼ 0:25 in: .11E01 8.68816 163. g3 ð~ ~ g4 ðxÞ ¼ x4  240 6 0.72087 91.401 14. J ¼ 2 2x1 x2 4 þ x1 þx 2 R¼     3 6PL ~ rð~xÞ ¼ x6PL 2 . Length of the cylindrical section without considering the head (L).26 8. L ¼ 14 in:.86146 95.461 100 161.06 616. Subject to g 1 ð~ xÞ ¼ x3 þ 0:00954x3 6 0. Griffith and Stewart’s successive linear approximation are the mathematical approaches that have been adopted by Ragsdell and Philips [68] for this problem.83544 91.066 101.3 83. Richardson’s random method.14045 88.41 324.5518 68. qffiffiffiffi E . Davidon-Fletcher-Powell.78E17 67.32726 47. E ¼ 30  16 psi. Search history and trajectory of the first particle in the first dimension.6277 19.63E17 200.11 125.75E02 28. Both ends of the vessel are capped. This problem is subject to four constraints.94 1.43776 123.66] employed GA. 0 P 00 MR s ¼ pffiffi2x1 x2 .42 81. Minimize f ð~ xÞ ¼ 0:6224x1 x3 x4 þ 1:7781x2 x23 þ 3:1661x21 x4 þ 19:84x21 x3 .45 861. There are four variables in this problem: 2 x2 sð~xÞ ¼ ðs0 Þ2 þ 2s0 s00 2R þ ðs00 Þ . whereas Lee and Geem [67] used HS to solve this problem. . 10 6 x3 6 200. Simplex method.1429 43. smax ¼ 13600 psi. and welding of a cylindrical vessel as 10 6 x4 6 200.9937 81.87141 6. Thickness of the shell (Ts). Thickness of the head (Th).181 671. The results show that GWO finds a design with the minimum cost compared to others. dðxÞ ¼ Ex2 x4 4x 3 Pc ð~ xÞ ¼ qffiffiffiffiffiffi 4:013E L2 x2 x6 3 4 36 3  1  x2L3 P ¼ 6000 lb. M ¼ PðL þ x22 Þ.G ¼ 12  106 psi. and the head has a hemi-spherical shape.92 168.759 144. These constraints and the problem are formulated as follows: Consider ~ x ¼ ½x1 x2 x3 x4  ¼ ½T s T h RL. rmax ¼ 30000 psi Coello [64] and Deb [65.4 358.6202 180 170 200 142.99 214.03 314.81 6.89366 82. Mirjalili et al. forming.80086 61.604 39.25536 84.0906 2. Fig.784 2. Pressure vessel design 0 6 x2 6 99.48573 100 155. The objective of this problem is to minimize the total cost consisting of material.1235 102.14261 69.26 188. The comparison results are provided in Table 10. / Advances in Engineering Software 69 (2014) 46–61 Table 8 Results of composite benchmark functions. 5. qffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffi  2 x22 3 þ x1 þx .32E02 qffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffiffi in Fig.789 490. g2 ð~ xÞ ¼ px23 x4  43 px33 þ 1296000 6 0. xÞ ¼ x1 þ 0:0193x3 6 0. 11.65 13.769 20. F GWO F24 F25 F26 F27 F28 F29 where PSO GSA DE CMA-ES Ave Std Ave Std Ave Std Ave Std Ave Std 43.86 10.56 S.3. 4 2 npffiffiffi h2  2 io x2 3 . DE [62]. 11 (continued) Fig. ES [59]. and ACO [70].S. The results of this problem are provided in Table 11. The heuristic methods that have been adopted to optimize this problem are: PSO [58]. According to this table. the results on the three classical engineering problems demonstrate that GWO shows high performance in solving challenging problems. This is again due to the operators that are designed to allow GWO to avoid local optima successfully and converge towards the optimum quickly. 12. . GWO is again able to find a design with the minimum cost. GA [57.69]. In summary. / Advances in Engineering Software 69 (2014) 46–61 57 Fig. (b) stress heatmap (c) displacement heatmap.60. Mathematical methods used are augmented Lagrangian Multiplier [71] and branch-and-bound [72]. The next section probes the performance of the GWO algorithm in solving a recent real problem in the field of optical engineering. Mirjalili et al. This problem has also been popular among researchers and optimized in various studies. Tension/compression spring: (a) shematic. an optical buffer is one of the main components of optical CPUs. The radii of holes and shape of the line defect yield different slow light characteristics.1789 8.051728 0.2444 4.250000 0. and wh. and wh define the shape of the slot and have an impact on the final dispersion and slow light properties as well.05169 0.2443 1.356737 0.00000 N/A N/A 8.2915 0.051154 0.1854000 0. C is the velocity of light in free space.0128334 6.0127048 0.3815 There are two metrics for comparing the performance of slow light devices: Delay-Bandwidth Product (DBP) and Normalized DBP (NDBP). Structure of welded beam design (a) shematic (b) stress heatmap (c) displacement heatmap.1185 2. wl. The Bragg slot structure allows the BSPCW to have precise control of dispersion and slow light properties.354714 0. / Advances in Engineering Software 69 (2014) 46–61 Table 9 Comparison of results for tension/compression spring design problem. The first five holes adjacent to the slot have the highest impact on slow light properties.6.3807 0.050276 0. and x0 is the normalized central frequency of light wave.244543 10.632201 12.2444 4. k indicates the wave vector.48. Dx is the normalized bandwidth.051609 0.0127303 0.2444 0.357644 0.28885 13. l.410831 9. Mirjalili et al. and shows the group index.2792 0.72624 1. As may be seen in Fig. Optimum variables Optimum cost h l t b 0. in 2011 [73].349871 0.0126810 0.8245 2. This problem has several constraints.0126702 0. it should be averaged as follows: ng ¼ Z xH n g ð xÞ xL dx Dx ð6:4Þ The bandwidth of a PCW refers to the region of the ng curve where ng has an approximately constant value with a maximum fluctua- Fig. To compare devices with different lengths and operating frequencies. Varying radii and line defects provides different environments for refracting the light in the waveguide.2189 5.) Mathematical optimization (Belegundu) Constraint correction (Arora) Optimum variables Table 10 Comparison results of the welded beam design problem.3800 2.03681 10. The researchers in this field try to manipulate the radii of holes and pins of line defect in order to achieve desirable optical buffering characteristics.890522 11. Since NDBP has a direct relation to the group index (ng). The most popular device to do this is a Photonic Crystal Waveguide (PCW). BSPCW structure was first proposed by Caer et al.363965 0. The structure of BSPCWs is illustrated in Fig.012666 0.879952 1.478377 3. as discussed in [73].323680 0.0126747 0.7512 8.051480 0. 13.205676 0.0853 7.5307 2. Optimum weight Algorithm GWO GSA GA (Coello) GA (Deb) GA (Deb) HS (Lee and Geem) Random Simplex David APPROX d D N 0.2231 9.399180 11. The background slab is silicon with a refractive index equal to 3.2489 0. In fact.4331 2.1730 6.053396 0.2915 0. 15.58 S.2552 6.351661 0.2442 3. This is achieved by increasing the length of the device (L).3841 2.0126706 0.051989 0.076432 11.4575 0.525410 11. various dispersion and slow light properties can be achieved by manipulating the radii of holes. The optical buffer slows the group velocity of light and allows the optical CPUs to process optical packets or adjust its timing. Dt should be increased in order to increase DBP. Obviously. . Algorithm GWO GSA PSO (Ha and Wang) ES (Coello and Montes) GA (Coello) HS (Mahdavi et al. so we utilize the simplest constraint handling method for GWO in this section as well.6256 6. Since ng is changing in the bandwidth range.2796 0.2533 0. The slot and holes are filled by a material with a refractive index of 1.6600 0. wl. Real application of GWO in optical engineering (optical buffer design) The problem investigated in this section is called optical buffer design.315900 14.182129 N/A N/A 0. which are defined as follows [74]: DBP ¼ Dt  Df ð6:1Þ where Dt indicates the delay and Df is the bandwidth of the slow light device. There are also different types of PCW that are suitable for specific applications. 15.050000 0.2915 8.856979 N/A N/A 6. PCWs mostly have a lattice-shaped structure with a line defect in the middle.0127022 0. NDBP is a better choice [75]: NDBP ¼ ng  Dx=x0 ð6:2Þ where ng is the average of the group index.) DE (Huang et al.202376 N/A N/A 0. can be formulated as follows [76]: ng ¼ C vg ¼C dk dx ð6:3Þ where x is the dispersion. In this section the structure of a PCW called a Bragg Slot PCW (BSPCW) is optimized by the GWO algorithm.205778 0.2434 0. So. In slow light devices the ultimate goal is to achieve maximum transmission delay of an optical pulse with highest PCW bandwidth.7313 5. l. 437500 0.637690 176.098411 42.0428 8129.000000 176.7445 6059.8359 6061.103624 58.812500 1.701000 Super cell Consider : ~ x ¼ ½x1 x2 x3 x4 x5 x6 x7 x8  ¼ Maximize : n Dx f ð~ xÞ ¼ NDBP ¼ gx0 .291000 47.323900 42. Table 11 Comparison results for pressure vessel design problem.746500 200.812500 0.437500 0.7340 6059.679000 176.0777 6288.5639 8538.097398 48.812500 0. / Advances in Engineering Software 69 (2014) 46–61 Fig.812500 0.) ACO (Kaveh and Talataheri) Lagrangian Multiplier (Kannan) Branch-bound (Sandgren) a Optimum cost Ts Th R L 0.437500 0.937500 0.089181 55.1036 R 1 a R2 R3 R4 R5 l wh wl a a a a a a a .7456 6059. Pressure vessel (a) shematic (b) stress heatmap (c) displacement heatmap.091266 40.434500 0.125000 1.700000 176.500000 0.812500 1.59 S.3811 6059.0888 7198.6900000 117.9463 6410.125000 0. 14.437500 0.437500 0.434500 0.654050 112.625000 0.812500 0.9886598 42.625000 42. Mirjalili et al.640518 176.625000 0.572656 43.758731 84.125000 0. 6051.812500 0. Algorithm Optimum variables GWO GSA PSO (He and Wang) GA (Coello) GA (Coello and Montes) GA (Deb and Gene) ES (Montes and Coello) DE (Huang et al.329000 42.4542025 176.098087 42. 29498a 0. This again demonstrated the high performance of the GWO algorithm in solving real problems. Note that the algorithm was run by 24 CPUs on a Windows HPC cluster at Griffith University.6 33. xL > maxðxdown band Þ.26 0. kn > knH ! xGuided mode > xH . The second to fifth constraints avoid band mixing.43 Dk(nm) Order of magnitude of b2 (a/2pc2) NDBP Variable range : 0 6 x15 6 0:5.33235a 0. In addition.8 6 1. 93% and 65% improvement in bandwidth (Dk) and NDBP utilizing the GWO algorithm.2014a 0. kn < knL ! xGuided mode < xL . Detailed information about PCWs can be found in [77–80]. nbackground = 3. 15. 2R5 2R4 Si 2R3 2R2 2R1 wl Filled xH < minðxup band Þ. tion rage of ±10% [75]. . 16(b) and 17.9 103 0. Fig. The GWO algorithm was run 20 times on this problem and the best results obtained are reported in Table 12. BSPCW structure with super cell. Subject to : maxðjb2 ðxÞjÞ < 106 a=2pc2 .26837a 0. Structural parameter Wu et al. the problem is mathematically formulated for GWO as follows: Table 12 Structural parameters and calculation results. This table shows that there is a substantial. This comprehensive study shows that the proposed GWO algorithm has merit among the current meta-heuristics. First.24952a 0. Note that we consider five constraints for the GWO algorithm.48 and nfilled = 1. a ¼ x0  1550ðnmÞ. l kn ¼ 2kap Dx ¼ xH  xL . Finally. [81] GWO R1 R2 R3 R4 R5 l Wh Wl a(nm) g n – – – – – – – – 430 23 17. the corresponded group index and optimized super cell are shown in Figs.6. To handle feasibility. These figures show that the optimized structure has a very good bandwidth without band mixing as well. we assign small negative objective function values (100) to those search agents that violate the constraints.60073a 343 19. 0 6 x6 6 1.7437a 0. xL ¼ xðknL Þ ¼ xð0:9ng0 Þ. The photonic band structure of the BSPCW optimized is shown in Fig.34992a 0. 0 6 x7. 16(a). ð6:5Þ wh where : xH ¼ xðknH Þ ¼ xð1:1ng0 Þ.6 103 0. the re- . 13:157–68. [6] Beni G. Dorronsoro B. 1993. local optima avoidance. 374–77. challenging search spaces. Science. 2006. IEEE Trans 1997. Evolut Comput. [5] Kirkpatrick S. exploitation. Engelbrecht AP. IEEE Trans 2005. IEEE international conference on. 16. Twenty nine test functions were employed in order to benchmark the performance of the proposed algorithm in terms of exploration. 1999. the results on the unimodal functions showed the superior exploitation of the GWO algorithm. IEEE 2006.2013. [2] Dorigo M. exploitation. showing the applicability of the proposed algorithm in solving real problems. the results of the engineering design problems also showed that the GWO algorithm has high performance in unknown. [4] Wolpert DH. Optimization by simulated annealing. 1942–1948. p.1:67–82. Fi- Supplementary data associated with this article can be found. Dorigo M. 2010 international conference on. 1995. p. 1128–34. the convergence analysis of GWO confirmed the convergence of this algorithm. CEC 2008 (IEEE World Congress on Computational Intelligence).35 0. Conclusion This work proposed a novel SI optimization algorithm inspired by grey wolves.60 S. Optimized super cell of BSPCW. 703–12. [7] Basturk B. [10] Lin L. in Neural Networks. [3] Kennedy J. Supplementary material Fig. Finally. Soft Comput 2009. and convergence.4 0. Hashim SZM. Macready WG. Swarm intelligence: from natural to artificial systems: OUP USA.advengsoft. Moreover. Stutzle T. Birattari M. An artificial bee colony (ABC) algorithm for numeric function optimization.2 60 Δω 40 ωH 20 0. vol. 2008. Second. Measuring exploration/exploitation in particle swarms using swarm diversity. EP. Theraulaz G.216 0. [8] Olorunda O. in the online version. The results on this problem showed a substantial improvement of NDBP compared to current approaches. p. The GWO algorithm was finally applied to a real problem in optical engineering. ed.9:126–42. 1983. [11] Mirjalili S.22 0.26 0. Eberhart R. It may be noted that the results on semi-real and real problems also proved that GWO can show high performance not only on unconstrained problems but also on constrained problems.007. 1995.1016/j.22 Guided mode 0. 7. 220. GSA. the results of the classical engineering problems show the superior performance of the proposed algorithm in solving semi-real constrained problems. Wang J. [9] Alba E.45 0. p. Jr.1:28–39. Ant colony optimization. Gen M. IEEE Congress on.3 0. R5=120 nm R4=101 nm R3=92 nm R2=84 nm R1=114 nm wl=69 nm wh=206 nm l=255 nm nally. Appendix A.18 ω0 ωL 0.16 0. In: Computer and information application (ICCIA). the results of the optical buffer design problem show the ability of the GWO algorithm in solving the real problems. In: IEEE swarm intelligence symposium. 17. Second. Mirjalili et al. Auto-tuning strategy for evolutionary algorithms: balancing between exploration and exploitation. DE. Particle swarm optimization. . Evolut Comput.222 0. 12–4. Springer. A new hybrid PSOGSA algorithm for function optimization. the exploration ability of GWO was confirmed by the results on multimodal functions. In: Robots and biological systems: towards a new bionics?. we are going to develop binary and multiobjective versions of the GWO algorithm. 2008. p. Comput Intell Magaz. p. at http://dx. the results of the composite functions showed high local optima avoidance.25 0.24 80 Light line of SiO2 The Group Index—ng Normalized Frequency (ωa/2πc=a/λ) 0. First. Karaboga D. In: Proceedings.org/10. Third. DG.218 0. For future work. In: Evolutionary computation. No free lunch theorems for optimization. 671–80. The proposed method mimicked the social hierarchy and hunting behavior of grey wolves. 2010. 12.224 Normalized Frequency (ω a/2π c=a/ ) (b) Fig. The exploration/exploitation tradeoff in dynamic cellular genetic algorithms. References [1] Bonabeau E. The results showed that GWO was able to provide highly competitive results compared to wellknown heuristics such as PSO. Vecchi MP. and convergence. / Advances in Engineering Software 69 (2014) 46–61 100 0.5 Wavevector--ka/2π (a) 0 0. (a) Photonic band structure of the optimized BSPCW structure (b) The group index (ng) of the optimized BSPCW structure. local optima avoidance.214 0. sults of the unconstrained benchmark functions demonstrate the performance of the GWO algorithm in terms of exploration.doi. and ES. Swarm intelligence in cellular robotic systems. [81] Wu J. [74] Baba T. p. [25] Kaveh A. Lewis A. Adv Eng Inform 2002. p. Li Y. Evolutionary programming made faster. An improved ant colony optimization for constrained engineering design problems. Curved space optimization: a random search based on general relativity theory. Slow light engineering in polyatomic photonic crystal waveguides based on square lattice.3:82–102. 2005. [65] Deb K. In: Advanced intelligent computing theories and applications. Knowl-Based Syst 2012. Use of a self-adaptive penalty approach for engineering optimization problems. dominance. Mirjalili S. Can J Zool 1999.188:1567–79.77:1196–203. Chen YP. 1988. Test problems in optimization. Müller SD. In: Proceedings of the 2003 international conference on information and knowledge engineering (IKE’03). Wideband and low dispersion slow light in slotted photonic crystal waveguide. [73] Caer C. [32] Moghaddam FF.11:1–18.37:106–11.16:193–203. In: Presented at the Dasgupta D. Mirjalili SM. Civil Eng Syst 2000. An effective co-evolutionary particle swarm optimization for constrained engineering design problems. [71] Kannan B. 95–105. [56] Belegundu AD. [44] Gandomi AH. Optics Commun 2010. 2011. [39] Lu X. p. Koumoutsakos P. [61] Mahdavi M.12:702–13. 518–25. Int J Comput Sci Eng 2011.29:3083–90. In: Intelligent systems.23:1298–300. Appl Math Comput 2007.27:155–82. A new metaheuristic bat-inspired algorithm. 2009. Arora JS. Int J Bio-Inspired Comput 2010.41:113–27.17:319–46. Marris-Morini D. Artificial intelligence through simulated evolution. Yuan L. Photonics Technol Lett. Behav Process 2011. Comput Methods Appl Mech Eng 2005. et al. Margaritis K.116:405. Photonics Technol Lett IEEE 2014. 207–214. [27] Alatas B. Dispersion engineering of wide slot photonic crystal waveguides by Bragg-like corrugation of the slot. Light property and optical buffer performance enhancement using Particle Swarm Optimization in Oblique Ring-ShapeHole Photonic Crystal Waveguide.1007/s00521-013-1525-5. Alpha status. Proceedings 2005 IEEE. [30] Du H. 210–14. http://www. ACROA: artificial chemical reaction optimization algorithm for global optimization. 205–11. Mezura Montes E. Evolution strategy. Addison Wesley. Mirjalili S.2:78–84. Int J Numer Meth Eng 1985. [36] Roth M. [15] Storn R. / Advances in Engineering Software 69 (2014) 46–61 [12] Mirjalili S. Lewis A. Peng C. Nevada. Wu X. 68–75. Liang JJ.186:311–38. Problem definitions and evaluation criteria for the CEC 2005 special session on realparameter optimization. Prog Electromag Res 2007. [57] Coello Coello CA. Appl Math Comput 2012. [45] Pan W-T. [70] Kaveh A. Seref O. Nanyang Technological University. [50] Yang X-S. [67] Lee KS. [68] Ragsdell K. [17] Fogel D.2:465–73. Mirjalili S. MBO: Marriage in honey bees optimization – a haplometrosis polygynous swarming approach.179:2232–48. Deb K. 2009. Lin G. In: Photonics global conference (PGC). A new intelligent optimization-artificial fish swarm algorithm.: Springer. 1997. Optimal design of a welded beam via genetic algorithms.112:283–94. In: Nature inspired cooperative strategies for optimization (NICSO 2010). Yang X. Test functions for optimization needs. A tri-objective particle swarm optimizer for designing line defect photonic crystal waveguides. An effective co-evolutionary differential evolution for constrained optimization. Talatahari S.9:1–14. Nonlinear integer and discrete programming in mechanical design. [13] Holland JH. and division of labor in wolf packs. Kramer SN.20:89–99.38:13170–80. ed. Escobedo R. [47] Muro C. Le Roux X. COELLO C. GSA: a gravitational search algorithm. Comput Ind 2000. Light Technol J 2011. Vivien L. China. A novel multi-objective optimization framework for designing photonic crystal waveguides. [38] Mucherino A.186:340–56.218:11125–37. [75] Zhai Y. A new optimization method: big bang–big crunch. p. A local search optimization algorithm based on natural principles of gravitation. [58] He Q. 2009. [63] Yang XS. In: Evolutionary computation. An efficient constraint handling method for genetic algorithms. Nature-inspired metaheuristic algorithms. Constraint-handling using an evolutionary multiobjective optimization technique. A study of particle swarm optimization particle trajectories. 2012. Gene AS. [20] Koza JR. New York. [16] Yao X.2214. 2007. Khayatazad M. ed. [52] Liang J. World Congress on.77:481–506.284:5829–32. Coello CAC. [29] Kaveh A. Price K. in press. Abedi K. Chen S.1. Smith JE. Zhang J. Liu Y. Fesanghary M. 1992. p. [35] Li X. stochastic test functions and design optimisation.S. [62] Huang F. Phillips D. 2003. [21] Simon D. Binary bat algorithm. Academic Press. [59] Mezura-Montes E. S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization. Commun Nonlinear Sci Numer Simul 2012.edu. Doctor thesis. Central force optimization: a new metaheuristic with applications in applied electromagnetics. [51] Mirjalili S. Auger A. DOI: 10. 2009. [37] Pinto PC. Alavi AH. Neural Comput Appl. 264–73.267:66–72. [64] Carlos A.176:937–71. 1–4 [2012].26:146–9. 1989. [19] Rechenberg I. [53] Suganthan PN. Mirjalili S. Springer. Technical Report. Black hole: a new heuristic optimization approach for data clustering. . [14] Goldberg D. Ji Y.sg/home/EPNSugan. Swarm Evolut Comput 2013. [66] Deb K.6:132–40. A new meta-heuristic method: ray optimization. Tian H. Deb S. Monkey search: a novel metaheuristic search for global optimization. p. [49] Molga M. [79] Mirjalili SM. preprint arXiv:1208. A novel heuristic optimization method: charged system search. 2005. [76] Wang D. Saryazdi S. Las Vegas. Wang Z. [33] Yang X-S. Jianjun J. Guangxing Y. arXiv. [55] Arora JS. [69] Deb K. [43] Askarzadeh A. Comput Methods Appl Mech Eng 2000. Wasp swarm algorithm for dynamic MAX-SAT problems. A robust optimal design technique for mechanical component design.77:425–91. USA. 2012.88:192–7. NaBIC 2009. Training feedforward neural networks using hybrid particle swarm optimization and gravitational search algorithm.ntu. An empirical study about the usefulness of evolution strategies to solve constrained optimization problems.21:1583–99. 2007.29:2013–5. With Aspects of Artificial Intelligence. [82] Mirjalili S. A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. et al. 2003 Introduction to evolutionary computing. 124–28. Deb K. Moghaddam RF. 2010. p.: Springer. Evolutionary algorithms in engineering applications. Lecture notes in computer science. preprint arXiv:1008. In: Adaptive and Natural Computing Algorithms. Runkler TA. Coppinger R. ASME J Eng Indust 1976. Appl Math Comput 2007. [41] Shiqin Y. Principal components analysis by the galaxy-based search algorithm: a novel metaheuristic for continuous optimisation. Evolut Comput 2003. Singapore. 2010. Mirjalili et al.124:5989–93. [28] Hatamlou A. Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMAES). 162. Comput Intel Imitat Life 1994. p. 2008.11:341–59. 2004.194:3902–33. Eng Comput Int J Comput-Aided Eng 2010. ed. He Q. WRI Global Congress on. Optical buffer performance enhancement using Particle Swarm Optimization in Ring-Shape-Hole Photonic Crystal Waveguide. Evolut Comput IEEE Trans 2008.283:2815–9. 61 [48] Digalakis J. 905. Abedi K. search and machine learning. p. arXiv. 65–74. [78] Mirjalili SM. Eksin I. Cuckoo search via Lévy flights. 2001. editors.26:69–74. [80] Mirjalili SM. Optics Commun 2011. 2006. An improved harmony search algorithm for solving optimization problems. Genetic Algorithms in optimization. 350–57. Genetic programming. 2001. A new fruit fly optimization algorithm: taking the financial distress model as an example. A Study of mathematical programming methods for structural optimization.: Springer. Slow light property improvement and optical buffer capability in ring-shape-hole photonic crystal waveguide. A novel global convergence algorithm: bee collecting pollen algorithm. Expert Syst Appl 2011. Rezazadeh A. p. Optimal design of a class of welded structures using geometric programming. Geem ZW. Introduction to optimum design. Lei J. Termite: a swarm intelligent routing algorithm for mobile wireless ad-hoc networks. On benchmarking functions for genetic algorithms. AIAA J 1991. Luniver Press. [26] Formato RA. Lewis A. In: Swarm intelligence symposium. Zhou Y. [23] Erol OK. Michalewicz Z. Gao D. Zhejiang University of Zhejiang. [40] Yang X-S. In: Advances in Natural Computation. Int J Comput Math 2001. Spector L. Firefly algorithm. 2005. GCIS’09. IEEE 2011. Talatahari S. Wang L. Smutnicki C. Han J. Bernhard PJ. Damangir E. 2005. Wiley-IEEE Press. Optik – Int J Light Elect Optics 2013. Adv Eng Softw 2006. Slow light in photonic crystals. In: Eiben AE. [72] Sandgren E. [77] Mirjalili SM. Jacq J. Small-world optimization algorithm for function optimization. Krill Herd: a new bio-inspired optimization algorithm. 2003.98:1021–5. Mohd Hashim SZ. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. ed. Sci Am 1992. Wang L. Comput Struct 2012. SIS 2005. J Global Optim 1997. [60] Coello Coello CA. Test functions for optimization needs. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. p. 2005. A dolphin partner optimization. Izard N. Acta Mech 2010. editors. [18] Hansen N. p. Inf Sci 2009.: Springer. Biogeography-based optimization.37:443–73. p. Wolf-pack (Canis lupus) hunting strategies emerge from simple rules in computational simulations. Genetic algorithms. Nat Photonics 2008. [22] Webster B. Roux C (1995) Registration of nonsegmented images using a genetic algorithm. Moradian Sardroudi H. A new heuristic optimization algorithm for modeling of proton exchange membrane fuel cell: bird mating optimizer. Eng Appl Artif Intell 2007. 2009. [42] Yang X-S. [31] Shah-Hosseini H. Part I: Theory. Berlin. Novel composition test functions for numerical global optimization. Constraint-handling in genetic algorithms through the use of dominance-based tournament selection.213:267–89. [46] Mech LD. Int J Gen Syst 2008. vol. et al. Nezamabadi-Pour H. J Mech Des 1994. In: Nature & Biologically Inspired Computing. Proceedings of the 2001 congress on. 255–61. [54] van den Bergh F. Inf Sci 2012. IEEE Trans 1999. Int J Energy Res 2012.0549. Cheriet M. Sousa JM. [34] Abbass HA. Photonics and Nanostructures – Fundamentals and Applications. Inf Sci 2006. In: AIP conference proceedings. Zhuang J. Evolut Comput. Suganthan P. Hansen N. [24] Rashedi E. Engelbrecht A.
Copyright © 2024 DOKUMEN.SITE Inc.