[{"@context":"http:\/\/schema.org\/","@type":"BlogPosting","@id":"https:\/\/wiki.edu.vn\/en\/wiki14\/evolutionary-algorithm-wikipedia\/#BlogPosting","mainEntityOfPage":"https:\/\/wiki.edu.vn\/en\/wiki14\/evolutionary-algorithm-wikipedia\/","headline":"Evolutionary algorithm – Wikipedia","name":"Evolutionary algorithm – Wikipedia","description":"before-content-x4 Subset of evolutionary computation after-content-x4 In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation,[1]","datePublished":"2021-12-21","dateModified":"2021-12-21","author":{"@type":"Person","@id":"https:\/\/wiki.edu.vn\/en\/wiki14\/author\/lordneo\/#Person","name":"lordneo","url":"https:\/\/wiki.edu.vn\/en\/wiki14\/author\/lordneo\/","image":{"@type":"ImageObject","@id":"https:\/\/secure.gravatar.com\/avatar\/44a4cee54c4c053e967fe3e7d054edd4?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/44a4cee54c4c053e967fe3e7d054edd4?s=96&d=mm&r=g","height":96,"width":96}},"publisher":{"@type":"Organization","name":"Enzyklop\u00e4die","logo":{"@type":"ImageObject","@id":"https:\/\/wiki.edu.vn\/wiki4\/wp-content\/uploads\/2023\/08\/download.jpg","url":"https:\/\/wiki.edu.vn\/wiki4\/wp-content\/uploads\/2023\/08\/download.jpg","width":600,"height":60}},"image":{"@type":"ImageObject","@id":"https:\/\/wikimedia.org\/api\/rest_v1\/media\/math\/render\/svg\/c3c9a2c7b599b37105512c5d570edc034056dd40","url":"https:\/\/wikimedia.org\/api\/rest_v1\/media\/math\/render\/svg\/c3c9a2c7b599b37105512c5d570edc034056dd40","height":"","width":""},"url":"https:\/\/wiki.edu.vn\/en\/wiki14\/evolutionary-algorithm-wikipedia\/","wordCount":12466,"articleBody":" (adsbygoogle = window.adsbygoogle || []).push({});before-content-x4Subset of evolutionary computation (adsbygoogle = window.adsbygoogle || []).push({});after-content-x4In computational intelligence (CI), an evolutionary algorithm (EA) is a subset of evolutionary computation,[1] a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions (see also loss function). Evolution of the population then takes place after the repeated application of the above operators.Evolutionary algorithms often perform well approximating solutions to all types of problems because they ideally do not make any assumption about the underlying fitness landscape. Techniques from evolutionary algorithms applied to the modeling of biological evolution are generally limited to explorations of microevolutionary processes and planning models based upon cellular processes. In most real applications of EAs, computational complexity is a prohibiting factor.[2] In fact, this computational complexity is due to fitness function evaluation. Fitness approximation is one of the solutions to overcome this difficulty. However, seemingly simple EA can solve often complex problems;[3][4][5] therefore, there may be no direct link between algorithm complexity and problem complexity. 
Implementation

The following is an example of a generic single-objective genetic algorithm.

Step One: Generate the initial population of individuals randomly (first generation).

Step Two: Repeat the following regenerational steps until termination (time limit, sufficient fitness achieved, etc.):

1. Evaluate the fitness of each individual in the population.
2. Select the fittest individuals for reproduction (the parents).
3. Breed new individuals through crossover and mutation operations to give birth to offspring.
4. Replace the least-fit individuals of the population with the new individuals.

A minimal code sketch of this loop is given after the list of EA variants below.

Similar techniques differ in genetic representation and other implementation details, as well as in the nature of the particular applied problem.

Genetic algorithm – This is the most popular type of EA. One seeks the solution of a problem in the form of strings of numbers (traditionally binary, although the best representations are usually those that reflect something about the problem being solved),[2] by applying operators such as recombination and mutation (sometimes one, sometimes both). This type of EA is often used in optimization problems.

Genetic programming – Here the solutions are in the form of computer programs, and their fitness is determined by their ability to solve a computational problem. There are many variants of genetic programming, including Cartesian genetic programming, gene expression programming, grammatical evolution, linear genetic programming, and multi expression programming.

Evolutionary programming – Similar to genetic programming, but the structure of the program is fixed and its numerical parameters are allowed to evolve.

Evolution strategy – Works with vectors of real numbers as representations of solutions, and typically uses self-adaptive mutation rates. The method is mainly used for numerical optimization, although there are also variants for combinatorial tasks.[6][7]

Differential evolution – Based on vector differences and therefore primarily suited for numerical optimization problems.

Neuroevolution – Similar to genetic programming, but the genomes represent artificial neural networks by describing structure and connection weights. The genome encoding can be direct or indirect.

Learning classifier system – Here the solution is a set of classifiers (rules or conditions). A Michigan-LCS evolves at the level of individual classifiers, whereas a Pittsburgh-LCS uses populations of classifier sets. Initially, classifiers were only binary, but now include real, neural net, or S-expression types. Fitness is typically determined with either a strength-based or accuracy-based reinforcement learning or supervised learning approach.
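The generational loop described in the steps above can be sketched in a few lines of Python. This is a minimal illustration only: the OneMax objective (count of 1-bits), tournament selection, one-point crossover, bit-flip mutation and all parameter values are illustrative assumptions rather than choices prescribed by the article.

```python
import random

# Minimal generational genetic algorithm following the steps above:
# evaluate -> select parents -> crossover & mutate -> replace least fit.
# OneMax (count of 1-bits) is a toy fitness; all parameter values are arbitrary choices.

GENOME_LEN, POP_SIZE, GENERATIONS = 40, 30, 100
CROSSOVER_RATE, MUTATION_RATE = 0.9, 1.0 / GENOME_LEN

def fitness(genome):                      # Step 1: evaluate
    return sum(genome)

def tournament(pop, k=3):                 # Step 2: select the fitter individuals
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):                      # Step 3a: breed via one-point crossover
    if random.random() < CROSSOVER_RATE:
        cut = random.randrange(1, GENOME_LEN)
        return a[:cut] + b[cut:]
    return a[:]

def mutate(genome):                       # Step 3b: breed via bit-flip mutation
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    offspring = [mutate(crossover(tournament(population), tournament(population)))
                 for _ in range(POP_SIZE)]
    # Step 4: replace the least-fit individuals by keeping the best of parents + offspring.
    population = sorted(population + offspring, key=fitness, reverse=True)[:POP_SIZE]

print("best fitness:", fitness(max(population, key=fitness)))
```

The survivor selection used here keeps the best individuals of the combined parent and offspring populations, which is one simple way of realizing the "replace the least-fit individuals" step.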
Theoretical background

The following theoretical principles apply to all or almost all EAs.

No free lunch theorem

The no free lunch theorem of optimization states that all optimization strategies are equally effective when the set of all optimization problems is considered. Under this condition, no evolutionary algorithm is fundamentally better than another. One EA can therefore only outperform another if the set of problems considered is restricted, which is exactly what is inevitably done in practice. To improve an EA, it must exploit problem knowledge in some form (e.g. by choosing a certain mutation strength or a problem-adapted coding). Thus, if two EAs are compared, this constraint is implied. In addition, an EA can use problem-specific knowledge by, for example, not generating the entire start population randomly, but creating some individuals through heuristics or other procedures.[8][9] Another way to tailor an EA to a given problem domain is to involve suitable heuristics, local search procedures or other problem-related procedures in the process of generating the offspring. This form of extension of an EA is also known as a memetic algorithm. Both extensions play a major role in practical applications, as they can speed up the search process and make it more robust.[8][10]

Convergence

For EAs in which, in addition to the offspring, at least the best individual of the parent generation is used to form the subsequent generation (so-called elitist EAs), there is a general proof of convergence under the condition that an optimum exists. Without loss of generality, a maximum search is assumed for the proof.

From the property of elitist offspring acceptance and the existence of the optimum it follows that, per generation $k$, an improvement of the fitness $F$ of the respective best individual $x'$ will occur with a probability greater than zero. Thus:

$$F(x'_1) \leq F(x'_2) \leq F(x'_3) \leq \cdots \leq F(x'_k) \leq \cdots$$

That is, the fitness values of the respective best individuals form a monotonically non-decreasing sequence, which is bounded due to the existence of the optimum. From this follows the convergence of the sequence towards the optimum.

Since the proof makes no statement about the speed of convergence, it is of little help in practical applications of EAs. But it does justify the recommendation to use elitist EAs. However, when using the usual panmictic population model, elitist EAs tend to converge prematurely more often than non-elitist ones.[11] In a panmictic population model, mate selection (step 2 of the section on implementation) is such that every individual in the entire population is eligible as a mate. In non-panmictic populations, selection is suitably restricted, so that the dispersal speed of better individuals is reduced compared to panmictic populations. Thus, the general risk of premature convergence of elitist EAs can be significantly reduced by suitable population models that restrict mate selection.[12][13]

Virtual alphabets

With the theory of virtual alphabets, David E. Goldberg showed in 1990 that by using a representation with real numbers, an EA that uses classical recombination operators (e.g. uniform or n-point crossover) cannot reach certain areas of the search space, in contrast to a coding with binary numbers.[14] This results in the recommendation for EAs with real-valued representation to use arithmetic operators for recombination (e.g. arithmetic mean or intermediate recombination). With suitable operators, real-valued representations are more effective than binary ones, contrary to earlier opinion.[15][16]
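The recommendation above can be illustrated with a short sketch contrasting one-point crossover with intermediate (blend) recombination on real-valued genomes. The blend parameter and the helper names are illustrative assumptions; this is not the only form of arithmetic recombination in use.

```python
import random

# Two recombination operators for real-valued parents.
# One-point crossover only exchanges whole coordinates, so offspring genes are always
# copies of parent genes; intermediate recombination can produce values between (and
# slightly beyond) the parents, which motivates its use for real-valued EAs.

def one_point_crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def intermediate_recombination(a, b, d=0.25):
    # Each offspring gene is a random blend of the parent genes; d extends the
    # blend interval slightly beyond the parents (a common heuristic choice).
    return [ai + random.uniform(-d, 1 + d) * (bi - ai) for ai, bi in zip(a, b)]

p1, p2 = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
print(one_point_crossover(p1, p2))          # coordinates copied from one parent or the other
print(intermediate_recombination(p1, p2))   # coordinates blended between the parents
```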
Comparison to biological processes

A possible limitation of many evolutionary algorithms is their lack of a clear genotype–phenotype distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to become a mature phenotype. This indirect encoding is believed to make the genetic search more robust (i.e. to reduce the probability of fatal mutations) and may also improve the evolvability of the organism.[17][18] Such indirect (also known as generative or developmental) encodings enable evolution to exploit the regularity in the environment.[19] Recent work in the field of artificial embryogeny, or artificial developmental systems, seeks to address these concerns. Gene expression programming successfully explores a genotype–phenotype system, where the genotype consists of linear multigenic chromosomes of fixed length and the phenotype consists of multiple expression trees or computer programs of different sizes and shapes.[20]

Applications

The areas in which evolutionary algorithms are practically used are almost unlimited[5] and range from industry,[21][22] engineering,[2][3][23] complex scheduling,[4][24][25] agriculture,[26] robot movement planning[27] and finance[28][29] to research[30][31] and art. The application of an evolutionary algorithm requires some rethinking from the inexperienced user, as the approach to a task using an EA differs from that of conventional exact methods and is usually not part of the curriculum of engineers or of other disciplines. For example, the fitness calculation must not only formulate the goal but also support the evolutionary search process towards it, e.g. by rewarding improvements that do not yet lead to a better evaluation of the original quality criteria. If, for instance, peak utilisation of resources such as personnel deployment or energy consumption is to be avoided in a scheduling task, it is not sufficient to assess the maximum utilisation. Rather, the number and duration of exceedances of a still acceptable level should also be recorded in order to reward reductions below the actual maximum peak value[32] (a small code sketch of such a fitness term is given at the end of this section). There are therefore some publications aimed at beginners that help to avoid beginners' mistakes and to lead an application project to success.[32][33][34] This includes clarifying the fundamental question of when an EA should be used to solve a problem and when it is better not to.
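The scheduling example above can be made concrete with a small sketch of a fitness term that rewards reductions of resource usage above an acceptable level instead of scoring only the maximum peak. The acceptable level, the weights and the example load profiles are illustrative assumptions.

```python
# Illustrative fitness shaping for the scheduling example above.
# Scoring only the maximum peak gives the search no guidance as long as the peak
# itself is unchanged; additionally penalizing how often and how far the load
# exceeds an acceptable level rewards intermediate improvements.

ACCEPTABLE_LOAD = 10.0                       # assumed still-acceptable utilisation level
W_PEAK, W_EXCESS, W_COUNT = 1.0, 0.5, 0.2    # arbitrary illustrative weights

def utilisation_penalty(load_per_timestep):
    peak = max(load_per_timestep)
    exceedances = [x - ACCEPTABLE_LOAD for x in load_per_timestep if x > ACCEPTABLE_LOAD]
    return (W_PEAK * peak                    # the actual maximum peak
            + W_EXCESS * sum(exceedances)    # total amount above the acceptable level
            + W_COUNT * len(exceedances))    # number of time steps above the level

# Two schedules with the same peak: the second exceeds the level less often,
# so it receives the lower penalty and the search is guided towards it.
print(utilisation_penalty([12, 12, 12, 8]))  # 15.6
print(utilisation_penalty([12, 8, 8, 8]))    # 13.2
```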
Related techniques

Related techniques include swarm algorithms.

Other population-based metaheuristic methods

Hunting search – A method inspired by the group hunting of some animals, such as wolves, which organize their positions to surround the prey, each relative to the position of the others and especially that of their leader. It is a continuous optimization method[37] that has been adapted as a combinatorial optimization method.[38]

Adaptive dimensional search – Unlike nature-inspired metaheuristic techniques, an adaptive dimensional search algorithm does not implement any metaphor as an underlying principle. Rather, it uses a simple performance-oriented method based on the update of the search dimensionality ratio (SDR) parameter at each iteration.[39]

Firefly algorithm – Inspired by the behavior of fireflies, which attract each other by flashing light. This is especially useful for multimodal optimization.

Harmony search – Based on the ideas of musicians' behavior in searching for better harmonies. This algorithm is suitable for combinatorial optimization as well as parameter optimization.

Gaussian adaptation – Based on information theory. Used for maximization of manufacturing yield, mean fitness or average information. See for instance Entropy in thermodynamics and information theory.

Memetic algorithm – A hybrid method, inspired by Richard Dawkins's notion of a meme. It commonly takes the form of a population-based algorithm coupled with individual learning procedures capable of performing local refinements. It emphasizes the exploitation of problem-specific knowledge and tries to orchestrate local and global search in a synergistic way.

Examples

In 2020, Google stated that their AutoML-Zero can successfully rediscover classic algorithms such as the concept of neural networks.[40]

The computer simulations Tierra and Avida attempt to model macroevolutionary dynamics.

Gallery

[41][42][43]

References

1. Vikhar, P. A. (2016). "Evolutionary algorithms: A critical review and its future prospects". Proceedings of the 2016 International Conference on Global Trends in Signal Processing, Information Computing and Communication (ICGTSPICC). Jalgaon: 261–265. doi:10.1109/ICGTSPICC.2016.7955308. ISBN 978-1-5090-0467-6. S2CID 22100336.
2. Cohoon, J.; et al. (2003). "Evolutionary algorithms for the physical design of VLSI circuits". In Advances in Evolutionary Computing: Theory and Applications. Springer. pp. 683–712. ISBN 978-3-540-43330-9.
3. Slowik, Adam; Kwasnicka, Halina (2020). "Evolutionary algorithms and their applications to engineering problems". Neural Computing and Applications. 32 (16): 12363–12379. doi:10.1007/s00521-020-04832-8. ISSN 0941-0643. S2CID 212732659.
4. Mika, Marek; Waligóra, Grzegorz; Węglarz, Jan (2011). "Modelling and solving grid resource allocation problem with network resources for workflow applications". Journal of Scheduling. 14 (3): 291–306. doi:10.1007/s10951-009-0158-0. ISSN 1094-6136. S2CID 31859338.
5. "International Conference on the Applications of Evolutionary Computation". The conference is part of the Evo* series. The conference proceedings are published by Springer. Retrieved 2022-12-23.
6. Nissen, Volker; Krause, Matthias (1994). "Constrained Combinatorial Optimization with an Evolution Strategy". In Reusch, Bernd (ed.), Fuzzy Logik. Berlin, Heidelberg: Springer. pp. 33–40. doi:10.1007/978-3-642-79386-8_5. ISBN 978-3-642-79386-8.
7. Coelho, V. N.; Coelho, I. M.; Souza, M. J. F.; Oliveira, T. A.; Cota, L. P.; Haddad, M. N.; Mladenovic, N.; Silva, R. C. P.; Guimarães, F. G. (2016). "Hybrid Self-Adaptive Evolution Strategies Guided by Neighborhood Structures for Combinatorial Optimization Problems". Evolutionary Computation. 24 (4): 637–666. doi:10.1162/EVCO_a_00187. PMID 27258842. S2CID 13582781.
8. Davis, Lawrence (1991). Handbook of Genetic Algorithms. New York: Van Nostrand Reinhold. ISBN 0-442-00173-8. OCLC 23081440.
9. Lienig, Jens; Brandt, Holger (1994). "An evolutionary algorithm for the routing of multi-chip modules". In Davidor, Yuval; Schwefel, Hans-Paul; Männer, Reinhard (eds.), Parallel Problem Solving from Nature – PPSN III. Berlin, Heidelberg: Springer. Vol. 866. pp. 588–597. doi:10.1007/3-540-58484-6_301. ISBN 978-3-540-58484-1.
10. Neri, Ferrante; Cotta, Carlos; Moscato, Pablo, eds. (2012). Handbook of Memetic Algorithms. Studies in Computational Intelligence. Vol. 379. Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-23247-3. ISBN 978-3-642-23246-6.
11. Leung, Yee; Gao, Yong; Xu, Zong-Ben (1997). "Degree of population diversity – a perspective on premature convergence in genetic algorithms and its Markov chain analysis". IEEE Transactions on Neural Networks. 8 (5): 1165–1176. doi:10.1109/72.623217. ISSN 1045-9227. PMID 18255718.
12. Gorges-Schleuter, Martina (1998). "A comparative study of global and local selection in evolution strategies". In Eiben, Agoston E.; Bäck, Thomas; Schoenauer, Marc; Schwefel, Hans-Paul (eds.), Parallel Problem Solving from Nature – PPSN V. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer. Vol. 1498. pp. 367–377. doi:10.1007/bfb0056879. ISBN 978-3-540-65078-2.
13. Dorronsoro, Bernabe; Alba, Enrique (2008). Cellular Genetic Algorithms. Operations Research/Computer Science Interfaces Series. Vol. 42. Boston, MA: Springer US. doi:10.1007/978-0-387-77610-1. ISBN 978-0-387-77609-5.
14. Goldberg, David E. (1990). "The theory of virtual alphabets". In Schwefel, Hans-Paul; Männer, Reinhard (eds.), Parallel Problem Solving from Nature. Lecture Notes in Computer Science. Berlin/Heidelberg: Springer-Verlag (published 1991). Vol. 496. pp. 13–22. doi:10.1007/bfb0029726. ISBN 978-3-540-54148-6.
15. Stender, J.; Hillebrand, E.; Kingdon, J. (1994). Genetic Algorithms in Optimisation, Simulation, and Modelling. Amsterdam: IOS Press. ISBN 90-5199-180-0. OCLC 47216370.
16. Michalewicz, Zbigniew (1996). Genetic Algorithms + Data Structures = Evolution Programs (3rd ed.). Berlin, Heidelberg: Springer. ISBN 978-3-662-03315-9. OCLC 851375253.
17. Hornby, G. S.; Pollack, J. B. (2002). "Creating high-level components with a generative representation for body-brain evolution". Artificial Life. 8 (3): 223–246.
18. Clune, Jeff; Beckmann, Benjamin; Ofria, Charles; Pennock, Robert (2009). "Evolving Coordinated Quadruped Gaits with the HyperNEAT Generative Encoding". Proceedings of the IEEE Congress on Evolutionary Computation, Special Section on Evolutionary Robotics. Trondheim, Norway. Archived 2016-06-03 at the Wayback Machine.
19. Clune, J.; Ofria, C.; Pennock, R. T. (2008). "How a generative encoding fares as problem-regularity decreases". In Rudolph, G.; Jansen, T.; Lucas, S. M.; Poloni, C.; Beume, N. (eds.), PPSN. Lecture Notes in Computer Science. Vol. 5199. Springer. pp. 358–367.
20. Ferreira, C. (2001). "Gene Expression Programming: A New Adaptive Algorithm for Solving Problems". Complex Systems. 13 (2): 87–129.
21. Sanchez, Ernesto; Squillero, Giovanni; Tonda, Alberto (2012). Industrial Applications of Evolutionary Algorithms. Intelligent Systems Reference Library. Vol. 34. Berlin, Heidelberg: Springer. doi:10.1007/978-3-642-27467-1. ISBN 978-3-642-27466-4.
22. Miettinen, Kaisa (1999). Evolutionary Algorithms in Engineering and Computer Science: Recent Advances in Genetic Algorithms, Evolution Strategies, Evolutionary Programming, Genetic Programming, and Industrial Applications. Chichester: Wiley and Sons. ISBN 0-585-29445-3. OCLC 45728460.
23. Gen, Mitsuo; Cheng, Runwei (1999). Genetic Algorithms and Engineering Optimization. Wiley Series in Engineering Design and Automation. Hoboken, NJ: John Wiley & Sons. doi:10.1002/9780470172261. ISBN 978-0-470-17226-1.
24. Dahal, Keshav P.; Tan, Kay Chen; Cowling, Peter I. (2007). Evolutionary Scheduling. Berlin: Springer. doi:10.1007/978-3-540-48584-1. ISBN 978-3-540-48584-1. OCLC 184984689.
25. Jakob, Wilfried; Strack, Sylvia; Quinte, Alexander; Bengel, Günther; Stucky, Karl-Uwe; Süß, Wolfgang (2013). "Fast Rescheduling of Multiple Workflows to Constrained Heterogeneous Resources Using Multi-Criteria Memetic Computing". Algorithms. 6 (2): 245–277. doi:10.3390/a6020245. ISSN 1999-4893.
26. Mayer, David G. (2002). Evolutionary Algorithms and Agricultural Systems. Boston, MA: Springer US. doi:10.1007/978-1-4615-1717-7. ISBN 978-1-4613-5693-6.
27. Blume, Christian (2000). "Optimized Collision Free Robot Move Statement Generation by the Evolutionary Software GLEAM". In Cagnoni, Stefano (ed.), Real-World Applications of Evolutionary Computing. LNCS 1803. Berlin, Heidelberg: Springer. pp. 330–341. doi:10.1007/3-540-45561-2_32. ISBN 978-3-540-67353-8.
28. Aranha, Claus; Iba, Hitoshi (2008). "Application of a Memetic Algorithm to the Portfolio Optimization Problem". In Wobcke, Wayne; Zhang, Mengjie (eds.), AI 2008: Advances in Artificial Intelligence. Berlin, Heidelberg: Springer. Vol. 5360. pp. 512–521. doi:10.1007/978-3-540-89378-3_52. ISBN 978-3-540-89377-6.
29. Chen, Shu-Heng, ed. (2002). Evolutionary Computation in Economics and Finance. Studies in Fuzziness and Soft Computing. Vol. 100. Heidelberg: Physica-Verlag. doi:10.1007/978-3-7908-1784-3. ISBN 978-3-7908-2512-1.
30. Lohn, J. D.; Linden, D. S.; Hornby, G. S.; Kraus, W. F. (June 2004). "Evolutionary design of an X-band antenna for NASA's Space Technology 5 mission". IEEE Antennas and Propagation Society Symposium, 2004. 3: 2313–2316. doi:10.1109/APS.2004.1331834. hdl:2060/20030067398. ISBN 0-7803-8302-8.
31. Fogel, Gary; Corne, David (2003). Evolutionary Computation in Bioinformatics. Elsevier. doi:10.1016/b978-1-55860-797-2.x5000-8. ISBN 978-1-55860-797-2.
32. Jakob, Wilfried (2021). Applying Evolutionary Algorithms Successfully – A Guide Gained from Real-world Applications. KIT Scientific Working Papers. Vol. 170. Karlsruhe, FRG: KIT Scientific Publishing. arXiv:2107.11300. doi:10.5445/IR/1000135763. S2CID 236318422.
33. Whitley, Darrell (2001). "An overview of evolutionary algorithms: practical issues and common pitfalls". Information and Software Technology. 43 (14): 817–831. doi:10.1016/S0950-5849(01)00188-4. S2CID 18637958.
34. Eiben, A. E.; Smith, J. E. (2015). "Working with Evolutionary Algorithms". Introduction to Evolutionary Computing. Natural Computing Series (2nd ed.). Berlin, Heidelberg: Springer. pp. 147–163. doi:10.1007/978-3-662-44874-8. ISBN 978-3-662-44873-1. S2CID 20912932.
35. Slowik, Adam; Kwasnicka, Halina (2018). "Nature Inspired Methods and Their Industry Applications—Swarm Intelligence Algorithms". IEEE Transactions on Industrial Informatics. 14 (3): 1004–1015. doi:10.1109/tii.2017.2786782. ISSN 1551-3203. S2CID 3707290.
36. Merrikh-Bayat, F. (2015). "The runner-root algorithm: A metaheuristic for solving unimodal and multimodal optimization problems inspired by runners and roots of plants in nature". Applied Soft Computing. 33: 292–303.
37. Oftadeh, R.; Mahjoob, M. J.; Shariatpanahi, M. (October 2010). "A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search". Computers & Mathematics with Applications. 60 (7): 2087–2098. doi:10.1016/j.camwa.2010.07.049.
38. Agharghor, Amine; Riffi, Mohammed Essaid (2017). "First Adaptation of Hunting Search Algorithm for the Quadratic Assignment Problem". Europe and MENA Cooperation Advances in Information and Communication Technologies. Advances in Intelligent Systems and Computing. 520: 263–267. doi:10.1007/978-3-319-46568-5_27. ISBN 978-3-319-46567-8.
39. Hasançebi, O.; Kazemzadeh Azad, S. (2015). "Adaptive Dimensional Search: A New Metaheuristic Algorithm for Discrete Truss Sizing Optimization". Computers and Structures. 154: 1–16.
40. Gent, Edd (13 April 2020). "Artificial intelligence is evolving all by itself". Science | AAAS. Archived from the original on 16 April 2020. Retrieved 16 April 2020.
41. Simionescu, P. A.; Dozier, G. V.; Wainwright, R. L. (2006). "A Two-Population Evolutionary Algorithm for Constrained Optimization Problems". 2006 IEEE International Conference on Evolutionary Computation. Vancouver, BC, Canada: IEEE. pp. 1647–1653. doi:10.1109/CEC.2006.1688506. ISBN 978-0-7803-9487-2. S2CID 1717817.
42. Simionescu, P. A.; Dozier, G. V.; Wainwright, R. L. (2006). "A Two-Population Evolutionary Algorithm for Constrained Optimization Problems" (PDF). Proc. 2006 IEEE International Conference on Evolutionary Computation. Vancouver, Canada. pp. 1647–1653. doi:10.1109/CEC.2006.1688506. ISBN 0-7803-9487-9. S2CID 1717817. Retrieved 7 January 2017.
43. Simionescu, P. A. (2014). Computer Aided Graphing and Simulation Tools for AutoCAD Users (1st ed.). Boca Raton, FL: CRC Press. ISBN 978-1-4822-5290-3.