The document discusses evolutionary computation and its applications. Evolutionary computation uses techniques inspired by biological evolution, such as inheritance, mutation, selection, and crossover. It has been successfully applied to problems in various domains including aircraft design, routing, tracking, and game playing. Specific applications discussed include robotics, scheduling, machine learning, pattern recognition, and data mining. Association rule mining is presented as a data mining technique to find relationships between data items. Evolutionary algorithms like genetic algorithms and particle swarm optimization are introduced as metaheuristic optimization techniques.


Evolutionary Computation and Its Applications

Dr. K. Indira
Principal
E.S. Engg. College
Villupuram

INTRODUCTION
Evolutionary Computation is the field of study devoted to the design, development, and analysis of problem-solving methods based on natural selection (simulated evolution).
Evolution has proven to be a powerful search process.
Evolutionary Computation has been successfully applied to a wide range of problems, including:
Aircraft Design,
Routing in Communications Networks,
Tracking Windshear,
Game Playing (Checkers [Fogel])
2

APPLICATIONS AREAS

Robotics,
Air Traffic Control,
Design,
Scheduling,
Machine Learning,
Pattern Recognition,
Job Shop Scheduling,
VLSI Circuit Layout,
Strike Force Allocation,
3

APPLICATIONS AREAS

Theme Park Tours (Disney Land/World)


Market Forecasting,
Egg Price Forecasting,
Design of Filters and Barriers,
Data-Mining,
User-Mining,
Resource Allocation,
Path Planning,
Etc.
4

DATA MINING

Extraction of interesting information or patterns from data in large databases is known as data mining.

ASSOCIATION RULE MINING

Association rule mining finds interesting associations and/or correlation relationships among large sets of data items.

ASSOCIATION RULE MINING

Proposed by Agrawal et al. in 1993.
It is an important data mining model studied extensively by the database and data mining community.
Assume all data are categorical.
No good algorithm for numeric data.
Initially used for Market Basket Analysis to find how items purchased by customers are related.
Example: Bread -> Milk [sup = 5%, conf = 100%]


7

The model: data

I = {i1, i2, ..., im}: a set of items.
Transaction t: a set of items such that t ⊆ I.
Transaction Database T: a set of transactions T = {t1, t2, ..., tn}.

Transaction Data: Supermarket Data

Market basket transactions:
t1: {bread, cheese, milk}
t2: {apple, eggs, salt, yogurt}
...
tn: {biscuit, eggs, milk}

Concepts:
An item: an item/article in a basket
I: the set of all items sold in the store
A transaction: items purchased in a basket; it may have a TID (transaction ID)
A transactional dataset: a set of transactions
9

The Model: Rules

A transaction t contains X, a set of items (itemset) in I, if X ⊆ t.
An association rule is an implication of the form X -> Y, where X, Y ⊂ I and X ∩ Y = ∅.
An itemset is a set of items, e.g., X = {milk, bread, cereal} is an itemset.
10

Rule strength measures

Support: The rule holds with support sup in T (the transaction data set) if sup% of transactions contain X ∪ Y.
sup = Pr(X ∪ Y)

Confidence: The rule holds in T with confidence conf if conf% of transactions that contain X also contain Y.
conf = Pr(Y | X)

An association rule is a pattern that states that when X occurs, Y occurs with a certain probability.
11

Support and Confidence

Support count: The support count of an itemset X, denoted by X.count, in a data set T is the number of transactions in T that contain X. Assume T has n transactions. Then,

support = (X ∪ Y).count / n
confidence = (X ∪ Y).count / X.count
12
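
As a quick illustration of the two formulas above, the following sketch counts itemset occurrences over the toy supermarket transactions shown earlier. The function names are illustrative, and the resulting numbers are for this toy data only, not the 5%/100% figures quoted on the earlier slide.

```python
# Minimal sketch: support and confidence of a rule X -> Y over a list of
# transactions, following  support = (X U Y).count / n  and
# confidence = (X U Y).count / X.count.

def itemset_count(itemset, transactions):
    """Number of transactions that contain every item of `itemset`."""
    return sum(1 for t in transactions if itemset.issubset(t))

def support_confidence(X, Y, transactions):
    n = len(transactions)
    xy_count = itemset_count(X | Y, transactions)
    x_count = itemset_count(X, transactions)
    support = xy_count / n
    confidence = xy_count / x_count if x_count else 0.0
    return support, confidence

if __name__ == "__main__":
    # The market basket transactions from the earlier slide
    T = [
        {"bread", "cheese", "milk"},
        {"apple", "eggs", "salt", "yogurt"},
        {"biscuit", "eggs", "milk"},
    ]
    sup, conf = support_confidence({"bread"}, {"milk"}, T)
    print(f"bread -> milk  sup = {sup:.2f}, conf = {conf:.2f}")
```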

EVOLUTIONARY COMPUTING
Evolutionary computing techniques mostly involve metaheuristic optimization algorithms.
Evolutionary algorithms:
  Gene expression programming
  Genetic Algorithm
  Genetic programming
  Evolutionary programming
  Evolution strategy
  Differential evolution
  Differential search algorithm
  Eagle strategy
Swarm intelligence:
  Ant colony optimization
  Particle Swarm Optimization
  Bees algorithm
  Cuckoo search
13


GA AND PSO: AN INTRODUCTION

Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) are effective population-based stochastic search algorithms, which include heuristics and an element of nondeterminism in traversing the search space.

15

DATASETS
University of California Irvine (UCI) Repository
Lenses
Haberman's Survival
Car Evaluation
Post Operative Care
Zoo

16

DATASETS

Dataset Name           | No. of Instances | No. of Attributes | Attribute Characteristics
Lenses                 | 24               |                   | Categorical
Haberman's Survival    | 306              |                   | Integer
Car Evaluation         | 1728             |                   | Categorical
Post Operative Patient | 90               |                   | Categorical, Integer
Zoo                    | 101              |                   | Categorical, Integer

17

SYSTEM TEMPLATE

[Diagram: Input Template -> Proposed System -> Output Template]
18

GENETIC ALGORITHM

A Genetic Algorithm (GA) is a procedure used to find approximate solutions to search problems through the application of the principles of evolutionary biology.
19

GENETIC ALGORITHM

[Diagram: GA components - Population, Selection, Crossover, Mutation]

20

[Figure: Flowchart of ARM using GA]

21

GA COMPONENTS

A problem to solve, and ...
Encoding technique (gene, chromosome)
Initialization procedure (creation)
Evaluation function (environment)
Selection of parents (reproduction)
Genetic operators (mutation, recombination)
22

SIMPLE GA
{
    initialize population;
    evaluate population;
    while (termination criteria not satisfied)
    {
        select parents for reproduction;
        perform recombination and mutation;
        evaluate population;
    }
}

23
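
The pseudocode above can be turned into a small runnable sketch. Everything below (the OneMax fitness function, parameter values, and operator choices) is illustrative rather than the specific configuration used in this work.

```python
import random

# Illustrative fitness: count of 1-bits (OneMax); replace with a problem-specific measure.
def fitness(chromosome):
    return sum(chromosome)

def tournament_select(population, k=3):
    return max(random.sample(population, k), key=fitness)

def crossover(p1, p2):
    point = random.randint(1, len(p1) - 1)          # one-point crossover
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(chromosome, rate=0.01):
    return [1 - g if random.random() < rate else g for g in chromosome]

def simple_ga(length=20, pop_size=30, generations=50):
    population = [[random.randint(0, 1) for _ in range(length)]
                  for _ in range(pop_size)]                     # initialize population
    for _ in range(generations):                                # termination criterion
        next_population = []
        while len(next_population) < pop_size:
            p1, p2 = tournament_select(population), tournament_select(population)
            c1, c2 = crossover(p1, p2)                          # recombination
            next_population += [mutate(c1), mutate(c2)]         # mutation
        population = next_population[:pop_size]                 # new, evaluated population
    return max(population, key=fitness)

print(simple_ga())
```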

GA CYCLE OF REPRODUCTION

[Diagram: population -> (selection) parents -> (reproduction) children -> (modification) modified children -> (evaluation) evaluated children -> back into the population; deleted members -> discard]
24

POPULATION

Chromosomes could be:
Bit strings                (0101 ... 1100)
Real numbers               (43.2 -33.1 ... 0.0 89.2)
Permutations of elements   (E11 E3 E7 ... E1 E15)
Lists of rules             (R1 R2 R3 ... R22 R23)
Program elements           (genetic programming)
25

REPRODUCTION

[Diagram: population -> parents -> (reproduction) children]

Parents are selected at random, with selection chances biased in relation to chromosome evaluations.
26

SELECTION TYPES

Random Selection
Tournament Selection
Rank-based Selection
Roulette Wheel Selection

27
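
As a hedged illustration of two of these schemes, the sketch below implements tournament selection and roulette wheel (fitness-proportionate) selection. The population representation and fitness values are made up for the example.

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k random individuals and return the fittest of them."""
    contestants = random.sample(population, k)
    return max(contestants, key=fitness)

def roulette_select(population, fitness):
    """Fitness-proportionate selection: chance of being picked ~ fitness (non-negative)."""
    total = sum(fitness(ind) for ind in population)
    pick = random.uniform(0, total)
    running = 0.0
    for ind in population:
        running += fitness(ind)
        if running >= pick:
            return ind
    return population[-1]   # numerical safety net

# Toy usage: individuals are numbers, fitness is the value itself.
pop = [1, 4, 2, 9, 7, 3]
fit = lambda x: float(x)
print(tournament_select(pop, fit), roulette_select(pop, fit))
```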

CHROMOSOME MODIFICATION

[Diagram: children -> (modification) modified children]

Modifications are stochastically triggered.
Operator types are:
Mutation
Crossover (recombination)
28

MUTATION

Before: (1 0 1 1 0 1 1 0)
After:  (0 1 1 0 0 1 1 0)

Before: (1.38 -69.4 326.44 0.1)
After:  (1.38 -67.5 326.44 0.1)

Causes movement in the search space (local or global)
Restores lost information to the population

29
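
A minimal sketch of the two mutation styles shown above: bit-flip mutation for binary chromosomes and Gaussian perturbation for real-valued ones. The mutation rate and noise scale are illustrative choices.

```python
import random

def bitflip_mutation(chromosome, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in chromosome]

def gaussian_mutation(chromosome, rate=0.25, sigma=1.0):
    """Add Gaussian noise to each gene independently with probability `rate`."""
    return [g + random.gauss(0.0, sigma) if random.random() < rate else g
            for g in chromosome]

print(bitflip_mutation([1, 0, 1, 1, 0, 1, 1, 0]))
print(gaussian_mutation([1.38, -69.4, 326.44, 0.1]))
```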

CROSSOVER

Crossover is a critical feature of genetic algorithms:
It greatly accelerates search early in the evolution of a population.
30

CROSSOVER TYPES

One-Point Crossover
Two-Point Crossover
Uniform Crossover

31

1-point example:
Parent1: 1, 3, 4, 3, 6, 1, 3, 6, 7, 3, 1, 4
Parent2: 3, 5, 2, 6, 7, 1, 2, 5, 4, 2, 2, 8
Random choice of k = 6
Child:   1, 3, 4, 3, 6, 1, 2, 5, 4, 2, 2, 8

2-point example:
Parent1: 1, 3, 4, 3, 6, 1, 3, 6, 7, 3, 1, 4
Parent2: 3, 5, 2, 6, 7, 1, 2, 5, 4, 2, 2, 8
Random choices: j = 3, k = 10
Child (taking positions 4 to 10 from Parent2): 1, 3, 4, 6, 7, 1, 2, 5, 4, 2, 1, 4

32

Uniform crossover example:

Parent1: 1, 3, 4, 3, 6, 1, 3, 6, 7, 3, 1, 4
Parent2: 3, 5, 2, 6, 7, 1, 2, 5, 4, 2, 2, 8
If our random choices in order were: 1 2 1 2 2 1 1 2 1 2 2 1
the child would be:
Child:   1, 5, 4, 6, 7, 1, 3, 5, 7, 2, 2, 4
33
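
The three crossover types can be reproduced with the short sketch below. The cut points and mask are passed in explicitly so the worked examples above can be checked; the function names are illustrative.

```python
def one_point(p1, p2, k):
    """Child keeps p1 up to position k, then continues with p2."""
    return p1[:k] + p2[k:]

def two_point(p1, p2, j, k):
    """Child keeps p1 outside positions j+1..k and takes p2 inside them."""
    return p1[:j] + p2[j:k] + p1[k:]

def uniform(p1, p2, mask):
    """mask[i] == 1 takes gene i from p1, mask[i] == 2 takes it from p2."""
    return [a if m == 1 else b for a, b, m in zip(p1, p2, mask)]

P1 = [1, 3, 4, 3, 6, 1, 3, 6, 7, 3, 1, 4]
P2 = [3, 5, 2, 6, 7, 1, 2, 5, 4, 2, 2, 8]

print(one_point(P1, P2, 6))                              # 1-point example, k = 6
print(two_point(P1, P2, 3, 10))                          # 2-point example, j = 3, k = 10
print(uniform(P1, P2, [1, 2, 1, 2, 2, 1, 1, 2, 1, 2, 2, 1]))  # uniform example
```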

EVALUATION

[Diagram: modified children -> (evaluation) evaluated children]

The evaluator decodes a chromosome and assigns it a fitness measure.
The evaluator is the only link between a classical GA and the problem it is solving.
34

DELETION

[Diagram: population -> discarded members -> discard]

Generational GA: the entire population is replaced with each iteration.
Steady-state GA: a few members are replaced each generation.
35

RESEARCH DIRECTIONS

Parameter Tuning in GA
GA with Elitism
Adaptive GA
Local Search

36

Concept of Elitism

[Diagram: Population -> Elitism: the elite individuals are carried over directly, while the rest pass through Selection -> Mating Pool -> Crossover -> Mutation -> New Solutions]

37

Adaptive GA
Methodology

Selection             | Roulette Wheel
Crossover Probability | (equation not reproduced)
Mutation Probability  | (equation not reproduced)
Fitness Function      | Fixed
Population            | Fixed

38

PARTICLE SWARM OPTIMIZATION

PSO's mechanism is inspired by the social and cooperative behavior displayed by various species such as birds and fish, including human beings.

39

PARTICLE SWARM OPTIMIZATION

[Diagram: the velocity of each particle is updated in every iteration, moving the swarm from Generation 1 through Generation 2 ... Generation N towards the target (solution); legend: particle, best particle of the swarm]

40

Introduction to the PSO: Algorithm

1. Create a population of agents (particles) uniformly distributed over the search space X
2. Evaluate each particle's position according to the objective function
3. If a particle's current position is better than its previous best position, update it
4. Determine the best particle (according to the particles' previous best positions)

41

Introduction to the PSO: Algorithm

5. Update the particles' velocities (equation not reproduced; the standard update is sketched below)
6. Move the particles to their new positions
7. Go to step 2 until the stopping criteria are satisfied
42

Velocity Updation in PSO

[Figure: velocity update equation, not reproduced]

43
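
The slide's equation image did not survive extraction. For reference, the standard PSO velocity and position updates, which the later slides build on, are (in the usual notation):

```latex
\begin{aligned}
v_{id}(t+1) &= \omega\, v_{id}(t)
              + c_1 r_1 \bigl(pbest_{id} - x_{id}(t)\bigr)
              + c_2 r_2 \bigl(gbest_{d} - x_{id}(t)\bigr) \\
x_{id}(t+1) &= x_{id}(t) + v_{id}(t+1)
\end{aligned}
```

where ω is the inertia weight, c1 and c2 are the acceleration coefficients, and r1, r2 are uniform random numbers in [0, 1].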

PSO STATES

[Diagram: Exploration and Exploitation - a particle moving relative to the best particle of the swarm]
44

PSO STATES

[Diagram: Convergence and Jumping Out]
45

Introduction to the PSO: Algorithm - Example

[Slides 46 to 53: step-by-step figures illustrating the PSO algorithm on an example; figures not reproduced]

Introduction to the PSO: Algorithm - Characteristics

Advantages
Simple implementation
Easily parallelized for concurrent processing
Very few algorithm parameters
Very efficient global search algorithm

Disadvantages
Tendency towards fast, premature convergence at mid-optimum points
Slow convergence in the refined search stage

54

FLOWCHART OF PSO
Flow chart depicting the general PSO algorithm:

Start
Initialize particles with random position and velocity vectors.
Loop until max iterations:
  Loop until all particles are exhausted:
    For each particle's position (p), evaluate fitness
    If fitness(p) is better than fitness(pbest), then pbest = p
  Set the best of the pbests as gbest
  Update the particles' velocity and position
Stop: gbest gives the optimal solution.

55
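
A compact, hedged sketch of the flowchart above for a real-valued minimization problem. The objective function (sphere), swarm size, and coefficient values are illustrative choices, not the settings used in the deck's experiments.

```python
import random

def sphere(x):
    """Illustrative objective: minimize the sum of squares."""
    return sum(v * v for v in x)

def pso(obj, dim=2, swarm_size=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Initialize particles with random position and velocity vectors.
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=obj)

    for _ in range(iters):                       # loop until max iterations
        for i in range(swarm_size):              # loop over all particles
            if obj(pos[i]) < obj(pbest[i]):      # update personal best
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=obj)              # best of the pbests becomes gbest
        for i in range(swarm_size):              # velocity and position update
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
    return gbest, obj(gbest)

print(pso(sphere))
```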

RESEARCH DIRECTIONS

Introduction of chaotic Maps


Neighborhood selection in PSO
Adaptive PSO (non data dependent)
Data dependent adaptation in PSO
Memetic PSO with Shuffled Frog Leaping Algorithm
Quantum Behaved PSO for ARM
Hybridization of GA and PSO

56

CHAOTIC PSO
Methodology

The new chaotic map model is formulated as: (equation not reproduced)
The initial points u0 and v0 are set to 0.1.
The velocity of each particle is updated by: (equation not reproduced)

57

Neighborhood Selection
Methodology

The concept of a local best particle (lbest) replacing the particle best (pbest) is introduced.
The neighborhood best (lbest) selection is as follows (see the sketch below):
Calculate the distance of the current particle from the other particles.
Find the nearest m particles as the neighbors of the current particle, based on the calculated distances.
Choose the local optimum lbest among the neighborhood in terms of fitness values.

58
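
A minimal sketch of that lbest selection step, assuming particles are plain position vectors and a user-supplied fitness function. The Euclidean distance and the parameter m are the only ingredients named on the slide; everything else here is illustrative.

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def lbest(current, particles, fitness, m=3):
    """Neighborhood best: fittest of the m particles nearest to `current`."""
    others = [p for p in particles if p is not current]
    others.sort(key=lambda p: euclidean(current, p))   # step 1: distances
    neighbors = others[:m]                             # step 2: nearest m particles
    return max(neighbors, key=fitness)                 # step 3: best by fitness

# Toy usage: maximize negative sphere (i.e. prefer particles near the origin).
swarm = [[0.5, 0.2], [1.0, 1.1], [-0.1, 0.05], [2.0, -1.0], [0.4, 0.6]]
print(lbest(swarm[0], swarm, fitness=lambda p: -sum(v * v for v in p), m=3))
```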

SAPSO - Non Data Dependent

The inertia weight in the velocity update equation is made adaptive.
SAPSO1: (equation not reproduced)
SAPSO2: (equation not reproduced)
SACPSO: (equation not reproduced)

where g is the generation index and G is a predefined maximum number of generations. Here, the maximal and minimal weights ωmax and ωmin are set to 0.9 and 0.4, based on experimental study.

59
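
The three SAPSO variants' formulas were not captured. As a point of reference only, a commonly used generation-dependent inertia weight consistent with the symbols defined above (g, G, ωmax, ωmin) is the linearly decreasing schedule:

```latex
\omega(g) = \omega_{\max} - \bigl(\omega_{\max} - \omega_{\min}\bigr)\,\frac{g}{G},
\qquad \omega_{\max} = 0.9,\quad \omega_{\min} = 0.4
```

The deck's SAPSO1, SAPSO2, and SACPSO use their own adaptive expressions, which are not reproduced here.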

Adaptive PSO

Estimation of the evolutionary state is done using a distance measure di and an estimator e.
Classify the state to which a particle belongs and adapt the acceleration coefficients and inertia weight accordingly:
Exploration
Exploitation
Convergence
Jumping Out

60

APSO

Adapt the acceleration coefficients as given in the table:

State        | c1       | c2
Exploration  | Increase | Decrease
Exploitation | Increase | Decrease
Convergence  | Increase | Increase
Jumping out  | Decrease | Increase

61

MINING AR USING APSO

The inertia weight is adjusted as given in the equation (not reproduced).

62

HYBRID GA/PSO (GPSO) MODEL

Genetic Algorithm (global optimization)
  Advantages: GA works on a population of possible solutions; it does not tend to be easily trapped by local optima.
  Disadvantages: Cannot assure constant optimization response times; mutation and crossover at times create children far away from good solutions.

Particle Swarm Optimization (converges easily)
  Advantages: PSO has no overlapping and mutation calculation; it has memory.
  Disadvantages: The method easily suffers from partial optimism; weak local search ability.

63

HYBRID GA/PSO (GPSO) MODEL

[Diagram: Initial Population -> Evaluate Fitness -> Ranked Population; the upper part of the ranked population is passed to the Genetic Algorithm and the lower part to Particle Swarm Optimization, producing the Updated Population]

64

HYBRID GA/PSO (GPSO) MODEL

For 1 to Elite
    x <- copy(x_best)
For 1 to (pop_size - Elite) * Breed_Ratio
    x <- Select an Individual
    x <- Update Velocity
    x <- Update Position
For 1 to (pop_size - Elite) * (1 - Breed_Ratio)
    x1 <- Select an Individual
    x2 <- Select an Individual
    Crossover(x1, x2)
    Mutate(x1, x2)

65
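
A hedged Python sketch of that loop, showing how one generation might split the population into an elite copy, a PSO-updated fraction, and a GA-bred fraction. All helper functions, the objective, and the simplified social-only velocity update are assumptions for illustration; the actual GPSO model's operators and parameters may differ.

```python
import random

def fitness(pos):
    """Illustrative objective: maximize the negative sphere function."""
    return -sum(v * v for v in pos)

def select(pop):
    """Tournament selection over the whole population (assumed choice)."""
    return max(random.sample(pop, 3), key=lambda ind: fitness(ind["pos"]))

def crossover(a, b):
    """One-point crossover on position vectors."""
    k = random.randint(1, len(a) - 1)
    return a[:k] + b[k:], b[:k] + a[k:]

def mutate(pos, rate=0.2, sigma=0.3):
    return [v + random.gauss(0, sigma) if random.random() < rate else v for v in pos]

def gpso_generation(pop, elite=2, breed_ratio=0.5, w=0.7, c=1.5):
    pop = sorted(pop, key=lambda ind: fitness(ind["pos"]), reverse=True)
    gbest = pop[0]["pos"]
    new_pop = [{"pos": ind["pos"][:], "vel": ind["vel"][:]} for ind in pop[:elite]]

    n_pso = int((len(pop) - elite) * breed_ratio)       # fraction handled by PSO
    for _ in range(n_pso):
        ind = select(pop)
        # Simplified, social-only velocity update toward the global best;
        # a full GPSO would also track each particle's personal best.
        vel = [w * v + c * random.random() * (g - x)
               for v, x, g in zip(ind["vel"], ind["pos"], gbest)]
        pos = [x + v for x, v in zip(ind["pos"], vel)]
        new_pop.append({"pos": pos, "vel": vel})

    while len(new_pop) < len(pop):                      # remaining fraction bred by GA
        p1, p2 = select(pop), select(pop)
        c1_pos, c2_pos = crossover(p1["pos"], p2["pos"])
        for child in (mutate(c1_pos), mutate(c2_pos)):
            if len(new_pop) < len(pop):
                new_pop.append({"pos": child, "vel": [0.0] * len(child)})
    return new_pop

# Toy run: 10 individuals in 3 dimensions for 30 generations.
swarm = [{"pos": [random.uniform(-3, 3) for _ in range(3)],
          "vel": [0.0, 0.0, 0.0]} for _ in range(10)]
for _ in range(30):
    swarm = gpso_generation(swarm)
print(max(fitness(ind["pos"]) for ind in swarm))
```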

Mining AR using PSO + SFLA

The Shuffled Frog Leaping Algorithm (SFLA) is adopted to perform the local search.
Here the particles are allowed to gain some experience, through a local search, before being involved in the evolutionary process.
The shuffling process allows the particles to gain information about the global best.

66

FLOWCHART FOR PSO + SFLA

Generation of the initial population (P) and evaluation of the fitness of each particle
Velocity and position updation of the particles
Sorting the population in descending order of fitness value
Distribution of the frogs into M memeplexes (SFLA)
Iterative updating of the worst frog in each memeplex (SFLA)
Combining all frogs to form a new population
Termination criteria satisfied? If yes, determine the best solution; otherwise repeat.
67

SHUFFLED FROG LEAPING ALGORITHM (SFLA)

[Figure: SFLA flowchart, not reproduced]

68

Formation of Memeplexes

[Diagram: the sorted frogs (Frog 1 to Frog 8) are distributed among Memeplex 1, Memeplex 2, and Memeplex 3]

69

Updation of Worst Particles

The position of the particle with the worst fitness is modified using the update rule (equation not reproduced; the standard SFLA form is sketched below), where:
Xb - position of the group best / global best
Xw - position of the worst frog in the group
Di - calculated new position of the worst frog

70
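
The update equation itself was not captured. In the standard SFLA formulation, which matches the symbols defined above, the leap of the worst frog is computed as:

```latex
\begin{aligned}
D_i &= \operatorname{rand}() \cdot \bigl(X_b - X_w\bigr), \qquad -D_{\max} \le D_i \le D_{\max} \\
X_w^{\text{new}} &= X_w + D_i
\end{aligned}
```

where rand() is a uniform random number in [0, 1] and Dmax is the maximum allowed change in a frog's position. In the standard algorithm, if this does not improve the worst frog, Xb is replaced by the global best, and if that also fails, a random frog is generated.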

QUANTUM BEHAVED PSO

In PSO, the state of a particle is depicted by its position vector (xi) and velocity vector (vi).
In the quantum laws of mechanics, according to the uncertainty principle, xi and vi cannot be determined simultaneously.
In the quantum model of PSO, the state of a particle is depicted by the wave function ψ(x, t).

71

QUANTUM BEHAVED PSO: INFERENCES

[Figure not reproduced]

72

QPSO Methodology

The particle's movement is governed by the position update equation (not reproduced; the standard form is sketched below), where
p = c * pid + (1 - c) * pgd
c = (c1 * r1) / (c1 * r1 + c2 * r2)
and the contraction-expansion coefficient lies in [0, 1].

73
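
The missing movement equation most likely corresponds to the standard QPSO position update (Sun et al.), reproduced here for reference with β denoting the contraction-expansion coefficient mentioned above:

```latex
\begin{aligned}
mbest_d &= \frac{1}{M}\sum_{i=1}^{M} pbest_{id} \\
x_{id}(t+1) &= p_{id} \pm \beta \,\bigl|\, mbest_d - x_{id}(t) \,\bigr|\; \ln\!\Bigl(\tfrac{1}{u}\Bigr), \qquad u \sim U(0,1)
\end{aligned}
```

where M is the swarm size and the sign is chosen at random with equal probability.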

QPSO FLOWCHART

Start
Initialize the swarm
Quantum behaviour:
  Calculate the mean best (mbest)
  Update particle positions
  Update local best
  Update global best
Termination criteria reached? If no, repeat from the quantum-behaviour step; if yes, Stop.
74

References

Jing Li, Han Rui-feng, "A Self-Adaptive Genetic Algorithm Based on Real-Coded", International Conference on Biomedical Engineering and Computer Science, pp. 1-4, 2010.

Chuan-Kang Ting, Wei-Ming Zeng, Tzu-Chieh Lin, "Linkage Discovery through Data Mining", IEEE Computational Intelligence Magazine, Volume 5, February 2010.

Caises, Y., Leyva, E., Gonzalez, A., Perez, R., "An Extension of the Genetic Iterative Approach for Learning Rule Subsets", 4th International Workshop on Genetic and Evolutionary Fuzzy Systems, pp. 63-67, 2010.

Shangping Dai, Li Gao, Qiang Zhu, Changwu Zhu, "A Novel Genetic Algorithm Based on Image Databases for Mining Association Rules", 6th IEEE/ACIS International Conference on Computer and Information Science, pp. 977-980, 2007.

Peregrin, A., Rodriguez, M.A., "Efficient Distributed Genetic Algorithm for Rule Extraction", Eighth International Conference on Hybrid Intelligent Systems (HIS '08), pp. 531-536, 2008.

Mansoori, E.G., Zolghadri, M.J., Katebi, S.D., "SGERD: A Steady-State Genetic Algorithm for Extracting Fuzzy Classification Rules From Data", IEEE Transactions on Fuzzy Systems, Volume 16, Issue 4, pp. 1061-1071, 2008.

Xiaoyuan Zhu, Yongquan Yu, Xueyan Guo, "Genetic Algorithm Based on Evolution Strategy and the Application in Data Mining", First International Workshop on Education Technology and Computer Science (ETCS '09), Volume 1, pp. 848-852, 2009.

Hong Guo, Ya Zhou, "An Algorithm for Mining Association Rules Based on Improved Genetic Algorithm and its Application", 3rd International Conference on Genetic and Evolutionary Computing (WGEC '09), pp. 117-120, 2009.

Genxiang Zhang, Haishan Chen, "Immune ..."

Miguel Rodriguez, Diego M. Escalante, Antonio Peregrin, "Efficient Distributed Genetic Algorithm for Rule Extraction", Applied Soft Computing 11 (2011), pp. 733-743.

Hamid Reza Qodmanan, Mahdi Nasiri, Behrouz Minaei-Bidgoli, "Multi objective association rule mining with genetic algorithm without specifying minimum support ..."

Thank You
