SOFT COMPUTING Evolutionary Computing 1



What is a GA?
• GAs are adaptive heuristic search algorithms based on the evolutionary ideas of natural selection and genetics.
• As such, they represent an intelligent exploitation of a random search used to solve optimization problems.
• Although randomized, GAs are by no means a blind random walk; they exploit historical information to direct the search into regions of better performance within the search space. 2


What is a GA?
• The basic techniques of GAs are designed to simulate processes in natural systems necessary for evolution, especially those that follow the principle first laid down by Charles Darwin: "survival of the fittest".
• In nature, competition among individuals for scarce resources results in the fittest individuals dominating over the weaker ones. 3


Evolutionary Algorithms: Evolution Strategies, Genetic Programming, Genetic Algorithms, Classifier Systems, Evolutionary Programming. Common ingredients:
• genetic representation of candidate solutions
• genetic operators
• selection scheme
• problem domain 4


History of GAs
• Genetic Algorithms were invented to mimic some of the processes observed in natural evolution. Many people, biologists included, are astonished that life at the level of complexity we observe could have evolved in the relatively short time suggested by the fossil record.
• The idea behind GAs is to harness this power of evolution to solve optimization problems. The father of the original Genetic Algorithm was John Holland, who invented it in the early 1970s. 5

Classes of Search Techniques (taxonomy diagram): DFS, BFS, Tabu Search, Hill Climbing, Genetic Programming 6



Early History of EAs
• 1954: Barricelli creates a computer simulation of life (Artificial Life)
• 1957: Box develops Evolutionary Operation (EVOP), a non-computerised evolutionary process
• 1957: Fraser develops the first Genetic Algorithm
• 1958: Friedberg creates a learning machine through evolving computer programs
• 1960s, Rechenberg: evolution strategies, a method used to optimize real-valued parameters for devices
• 1960s, Fogel, Owens, and Walsh: evolutionary programming, used to find finite-state machines
• 1960s, John Holland: Genetic Algorithms, to study the phenomenon of adaptation as it occurs in nature (not to solve specific problems)
• 1965: Rechenberg & Schwefel independently develop Evolution Strategies
• 1966: L. Fogel develops Evolutionary Programming as a means of creating artificial intelligence
• 1967: Holland and his students extend GA ideas further 7


The Genetic Algorithm
• Directed search algorithms based on the mechanics of biological evolution
• Developed by John Holland, University of Michigan (1970s)
  • to understand the adaptive processes of natural systems
  • to design artificial systems software that retains the robustness of natural systems
• Genetic algorithms, first proposed by Holland (1975), seek to mimic some aspects of natural evolution and selection.
• The first step of Holland's genetic algorithm is to represent a legal solution of a problem by a string of genes known as a chromosome. 8


Evolutionary Programming
• First developed by Lawrence Fogel in 1966 for use in pattern learning
• Early experiments dealt with a number of Finite State Automata
  • FSA were developed that could recognise recurring patterns and even primeness of numbers
• Later experiments dealt with gaming problems (coevolution)
• More recently, it has been applied to the training of neural networks, function optimisation, and path-planning problems 9


Biological Terminology
• gene
  • functional entity that codes for a specific feature, e.g. eye color
  • set of possible alleles
• allele
  • value of a gene, e.g. blue, green, brown
  • codes for a specific variation of the gene/feature
• locus
  • position of a gene on the chromosome
• genome
  • set of all genes that define a species
  • the genome of a specific individual is called its genotype
  • the genome of a living organism is composed of several chromosomes
• population
  • set of competing genomes/individuals 10


Genotype versus Phenotype
• genotype
  • blueprint that contains the information to construct an organism, e.g. human DNA
  • genetic operators such as mutation and recombination modify the genotype during reproduction
  • the genotype of an individual is immutable (no Lamarckian evolution)
• phenotype
  • physical make-up of an organism
  • selection operates on phenotypes (Darwin's principle: "survival of the fittest") 11


(Image) Courtesy of U.S. Department of Energy Human Genome Program, http://www.ornl.gov/hgmis 12


Genotype Operators
• recombination (crossover)
  • combines two parent genotypes into a new offspring
  • generates new variants by mixing existing genetic material
  • stochastic selection among parent genes
• mutation
  • random alteration of genes
  • maintains genetic diversity
• in genetic algorithms crossover is the major operator, whereas mutation plays only a minor role 13


Crossover
• crossover applied to parent strings with probability pc ∈ [0.6 .. 1.0]
• crossover site(s) chosen randomly
• one-point crossover (site after position 2):
  parent A 11|011, parent B 10|000 → offspring A 11000, offspring B 10011
• two-point crossover (sites after positions 1 and 3):
  parent A 1|10|11, parent B 1|00|00 → offspring A 10011, offspring B 11000 14
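The two operators can be sketched as follows (a minimal sketch: the function names, and representing chromosomes as Python strings such as the 11011/10000 bit strings visible on the slide, are my own choices):

```python
import random

def one_point_crossover(a, b, site=None):
    """Swap the tails of two equal-length bit strings at one crossover site."""
    assert len(a) == len(b)
    if site is None:
        site = random.randint(1, len(a) - 1)  # random site if none given
    return a[:site] + b[site:], b[:site] + a[site:]

def two_point_crossover(a, b, s1=None, s2=None):
    """Swap the middle segment between two crossover sites."""
    assert len(a) == len(b)
    if s1 is None:
        s1, s2 = sorted(random.sample(range(1, len(a)), 2))
    return a[:s1] + b[s1:s2] + a[s2:], b[:s1] + a[s1:s2] + b[s2:]

# with parents 11011 and 10000:
print(one_point_crossover("11011", "10000", site=2))  # ('11000', '10011')
print(two_point_crossover("11011", "10000", 1, 3))    # ('10011', '11000')
```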


Mutation
• mutation applied to each allele/gene with probability pm ∈ [0.001 .. 0.1]
• the role of mutation is to maintain genetic diversity
  offspring: 11000 → mutate fourth allele (bit flip) → mutated offspring: 11010 15
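A sketch of bit-flip mutation (function names `flip` and `mutate` are my own; the slide's example flips the fourth allele of 11000):

```python
import random

def flip(bits, i):
    """Flip the allele at (1-based) position i of a bit string."""
    j = i - 1
    return bits[:j] + ("1" if bits[j] == "0" else "0") + bits[j + 1:]

def mutate(bits, pm=0.01, rng=random):
    """Flip each allele independently with probability pm."""
    return "".join(b if rng.random() >= pm else "10"[int(b)] for b in bits)

print(flip("11000", 4))  # '11010', as on the slide
```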


Structure of an Evolutionary Algorithm (diagram): a population of genotypes (10111, 10001, 01001, 00111, 11001, 01011) is mapped by a coding scheme into phenotype space, where fitness f(x) is evaluated; selection picks fitter genotypes (e.g. 10011, 10001, 01011), which are modified by recombination and mutation to form the next population. 16


Pseudo Code of an Evolutionary Alg.
1. Create initial random population
2. Evaluate fitness of each individual
3. If the termination criteria are satisfied: stop
4. Select parents according to fitness
5. Recombine parents to generate offspring
6. Mutate offspring
7. Replace population by new offspring and go to step 2 17
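A minimal runnable sketch of this loop over bit strings (all parameter values are illustrative, and binary tournament selection is used here as one possible fitness-based selection scheme):

```python
import random

def evolve(fitness, genome_len=20, pop_size=30, pc=0.8, pm=0.02, generations=100):
    """Generational EA: select, recombine, mutate, replace (a sketch)."""
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]                      # initial random population
    for _ in range(generations):
        def pick():                                       # binary tournament selection
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            if random.random() < pc:                      # recombine with probability pc
                s = random.randint(1, genome_len - 1)
                p1, p2 = p1[:s] + p2[s:], p2[:s] + p1[s:]
            # mutate each gene with probability pm (bool XOR flips the bit)
            nxt += [[g ^ (random.random() < pm) for g in c] for c in (p1, p2)]
        pop = nxt[:pop_size]                              # replace population
    return max(pop, key=fitness)

# OneMax toy problem: maximise the number of 1-bits
best = evolve(sum)
```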


Roulette Wheel Selection
• selection is a stochastic process
• probability of reproduction: p_i = f_i / Σ_k f_k
• (wheel diagram: each individual in the population occupies a slice of the wheel proportional to its fitness)
• selected parents: 01011, 11010, 10001 18
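The rule p_i = f_i / Σ_k f_k can be sketched as cumulative-sum sampling (the function name `roulette_select` and the toy population are my own):

```python
import random

def roulette_select(pop, fitness, rng=random):
    """Sample one individual with probability f_i / sum_k f_k."""
    total = sum(fitness(ind) for ind in pop)
    r = rng.uniform(0, total)       # spin the wheel
    acc = 0.0
    for ind in pop:                 # walk slices until the pointer lands
        acc += fitness(ind)
        if r <= acc:
            return ind
    return pop[-1]                  # guard against floating-point round-off
```

Note that an individual with zero fitness gets a zero-width slice and is never selected.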


Genetic Programming
• automatic generation of computer programs by means of natural evolution (see Koza 1999)
• programs are represented by a parse tree (LISP expression)
• tree nodes correspond to functions:
  • arithmetic functions {+, -, *, /}
  • transcendental functions {sin, exp}
• leaf nodes correspond to terminals:
  • input variables {X1, X2, X3}
  • constants {0.1, 0.2, 0.5}
• the tree is parsed from left to right: (+ X1 (* X2 X3)) means X1 + (X2 * X3) 19
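Parse-tree evaluation can be sketched with nested Python tuples standing in for LISP expressions (the representation and names are my own choices):

```python
import operator

# function nodes supported by this sketch
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_tree(node, env):
    """Recursively evaluate a parse tree against variable bindings in env."""
    if isinstance(node, tuple):
        op, left, right = node
        return OPS[op](eval_tree(left, env), eval_tree(right, env))
    return env.get(node, node)  # variable lookup, or a numeric constant

# the slide's example tree: (+ X1 (* X2 X3)) == X1 + (X2 * X3)
tree = ("+", "X1", ("*", "X2", "X3"))
```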


Genetic Programming: Crossover (diagram): a randomly chosen subtree of parent A is exchanged with a randomly chosen subtree of parent B, producing two offspring trees over the terminals X1, X2, X3. 20


Areas EAs Have Been Used In
• Design of electronic circuits • Telecommunication network design • Artificial intelligence • Study of atomic clusters • Study of neuronal behaviour • Neural network training & design • Automatic control • Artificial life • Scheduling • Travelling Salesman Problem • General function optimisation • Bin Packing Problem • Pattern learning • Gaming • Self-adapting computer programs • Classification • Test-data generation • Medical image analysis • Study of earthquakes 21


Goldberg (1989): Goldberg, D. E. (1989), Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading. 22


Michalewicz (1996): Michalewicz, Z. (1996), Genetic Algorithms + Data Structures = Evolution Programs. Springer. 23


Vose (1999): Vose, M. D. (1999), The Simple Genetic Algorithm: Foundations and Theory (Complex Adaptive Systems). Bradford Books. 24

SOFT COMPUTING Fuzzy-Evolutionary Computing 25




Genetic Fuzzy Systems (GFSs)
• genetic design of fuzzy systems
  • automated tuning of the fuzzy knowledge base
  • automated learning of the fuzzy knowledge base
• objective of the tuning/learning process: optimizing the performance of the fuzzy system
  • e.g. fuzzy modeling: minimizing the quadratic error between a data set and the fuzzy system outputs
  • e.g. fuzzy control: optimizing the behavior of the plant + fuzzy controller 29


Genetic Fuzzy System for Data Modeling (diagram): an evolutionary algorithm evolves genotypes encoding the fuzzy system parameters; each genotype is decoded into a phenotype (a fuzzy system), which an evaluation scheme scores against the dataset (xi, yi) to produce the fitness fed back to the evolutionary algorithm. 30


Fuzzy Systems Knowledge Base
• Rule base: definition of the fuzzy rules, e.g. "If X1 is A1 and ... and Xn is An then Y is B"
• Database: definition of the fuzzy membership functions (e.g. a triangular function with parameters a, b, c) 31


Genetic Tuning Process
• tuning problems utilize an already existing rule base
• tuning aims to find a set of optimal parameters for the database:
  • points of the membership functions [a, b, c, d], or
  • scaling factors for input and output variables 32


Linear Scaling Functions
Chromosome for linear scaling:
• for each input xi: two parameters ai, bi (i = 1..n)
• for the output y: two parameters a0, b0
Genetic Algorithms:
• encode each parameter by k bits using Gray code; total length = 2*(n+1)*k bits
  e.g. a0 = 100101, b0 = 011111, a1 = 110101, ...
Evolution Strategies:
• each parameter ai or bi corresponds to one object variable xm with its own step size sm, m = 1 .. 2*(n+1) 33
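The Gray code mentioned above has the property that adjacent integers differ in exactly one bit, so a single-bit mutation changes the decoded parameter only slightly. A standard binary-reflected sketch (function names are mine):

```python
def to_gray(n):
    """Binary-reflected Gray code of a non-negative integer."""
    return n ^ (n >> 1)

def from_gray(g):
    """Invert the Gray code by a cumulative XOR of shifted values."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n
```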


Descriptive Knowledge Base
• (diagram: the input X is partitioned by global fuzzy sets sm, me, lg; the output Y by neg, ze, pos)
• all rules share the same global membership functions:
  R1: if X is sm then Y is neg
  R2: if X is me then Y is ze
  R3: if X is lg then Y is pos 34


Approximate Knowledge Base
• each rule employs its own local membership functions:
  R1: if X is (local fuzzy set) then Y is (local fuzzy set)
• tradeoff: more degrees of freedom and therefore better approximation, but the intuitive meaning of the fuzzy sets gets lost 35


Tuning Membership Functions
• encode each fuzzy set by its characteristic parameters:
  • Trapezoid <a, b, c, d>: rises from 0 at a to 1 at b, stays at 1 until c, falls back to 0 at d
  • Gaussian N(m, s): centered at m with width s
  • Triangular <a, b, c>: peak 1 at b, 0 outside [a, c] 36
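The triangular and trapezoidal shapes can be sketched directly from their parameters (function names are mine; the Gaussian case is omitted for brevity):

```python
def triangular(x, a, b, c):
    """Triangular membership <a, b, c>: peak 1 at b, 0 outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership <a, b, c, d>: plateau of 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    if x > c:
        return (d - x) / (d - c)   # falling edge
    return 1.0
```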


Approximate Genetic Tuning Process
• a chromosome encodes the entire knowledge base (database and rule base):
  Ri: if x1 is Ai1 and ... and xn is Ain then y is Bi
  is encoded by the i-th segment Ci of the chromosome; using triangular membership functions (a, b, c):
  Ci = (ai1, bi1, ci1, ..., ain, bin, cin, ai, bi, ci)
• each parameter may be binary- or real-coded
• the chromosome is the concatenation of the individual segments corresponding to rules: C1 C2 C3 C4 ... Ck 37


Descriptive Genetic Tuning Process
• the rule base already exists
• assume the i-th variable is composed of Ni linguistic terms (A1, A2, A3, ... over xi); then
  Ci = (ai1, bi1, ci1, ..., aiNi, biNi, ciNi)
• the chromosome is the concatenation of the individual segments corresponding to variables: C1 C2 C3 C4 ... Ck 38


Descriptive Genetic Tuning
• in the previous coding scheme fuzzy sets might change their order, so optimization is subject to the constraints aij < bij < cij
• alternatively, encode the distances among the center points of the triangular fuzzy sets and choose the border points such that the membership degrees sum to one (Σ mi = 1) 39


Fitness Function for Tuning
• minimize the quadratic error between the training data (xi, yi) and the fuzzy system output f(xi):
  E = Σi (yi − f(xi))², Fitness = 1 / E (maximize fitness)
• or minimize the maximal error:
  E = maxi (yi − f(xi))², Fitness = 1 / E (maximize fitness) 40


Genetic Learning Systems
• genetic learning aims to:
  • learn the fuzzy rule base, or
  • learn the entire knowledge base
• three different approaches:
  • Michigan approach: each chromosome represents a single rule
  • Pittsburgh approach: each chromosome represents an entire rule base / knowledge base
  • Iterative rule learning: each chromosome represents a single rule, but rules are injected one after the other into the knowledge base 41


Michigan Approach (diagram): each individual in the population (11001, 00101, 10111, 11100, 01000, 11101, ...) encodes one rule Ri: if x is Ai ... then Y is Bi; the population as a whole forms the rule base over the fuzzy sets A1..A6 on X and B1..B6 on Y. 42


Cooperation vs. Competition Problem
• we need a fitness function that measures the accuracy of an individual rule as well as the quality of its cooperation with the other rules
• e.g. Fitness = number of correct classifications minus number of incorrect classifications:
  R1: if x is small then Y is neg. (F = 2.5)
  R2: if x is med. then Y is zero (F = 2.7)
  R3: if x is large then Y is zero (F = −0.4)
  R4: if x is small then Y is pos. (F = −1.6) 43


Michigan Approach
• steady-state selection:
  • pick one individual at random
  • compare it with all individuals that cover the same input region
  • remove the "relatively" worst one from the population
  • pick two parents at random, independent of their fitness, and generate a new offspring
• example competitors covering the same input region: 11001 (R1: if x is A1 ... then Y is B1), 00101 (R2), 10111 (R3), 11100 (R4), 01000 (R5), 11101 (R6); the relatively worst of these is removed from the population 44

Thanks for your attention! That’s all. 45
