Genetic Algorithms
2- Consider a medical cancer information system governed by the variables x1, x2 and x3, from which the decision D is to be inferred. The following information is provided:
x1 range 0..100 with fuzzy sets L, M, H.
x2 range 0..100 with fuzzy sets L, M, H.
x3 range 0..100 with fuzzy sets L, M, H.
and D takes the decisions Malignant (M) or Benign (B).
The following decision blocks apply:
DB1:
IF x1=L AND x2=L THEN y=L
IF x1=M AND x2=H THEN y=H
DB2:
IF x3=L AND y=L THEN D=B
IF x3=M AND y=H THEN D=M
The intermediate variable y has range 0..100 with fuzzy sets VL, L, M, H, VH.
Determine the decision D for x1 = 30, x2 = 70 and x3 = 30.
(6 points)
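A minimal sketch of how the chained (two-block) inference could be organised is shown below. The question does not specify the membership functions, so the triangular L/M/H sets centred at 0, 50 and 100, the min operator for AND, and the direct chaining of y's fuzzy value into DB2 are assumptions made only for illustration.

def tri(x, a, b, c):
    # Triangular membership with feet a, c and peak b (assumed shape).
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    # Assumed L/M/H sets over the range 0..100.
    return {"L": tri(x, -50, 0, 50), "M": tri(x, 0, 50, 100), "H": tri(x, 50, 100, 150)}

x1, x2, x3 = 30, 70, 30
m1, m2, m3 = fuzzify(x1), fuzzify(x2), fuzzify(x3)

# DB1: the firing strength of each rule becomes the degree of its consequent y set.
y = {"L": min(m1["L"], m2["L"]),   # IF x1=L AND x2=L THEN y=L
     "H": min(m1["M"], m2["H"])}   # IF x1=M AND x2=H THEN y=H

# DB2: chain the fuzzy value of y directly into the second block.
D = {"B": min(m3["L"], y["L"]),    # IF x3=L AND y=L THEN D=B
     "M": min(m3["M"], y["H"])}    # IF x3=M AND y=H THEN D=M

print(y, D, "decision:", max(D, key=D.get))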
Genetic Algorithms
3- Taking the reproductive schema growth equation of schema theory,
eta(S, t+1) = eta(S, t) * eval(S, t) / averagePopFitness(t) * [1 - Pc*d(S)/(m-1) - o(S)*Pm]
assume that o(S)*Pm = 0 and d(S) = m-1; the equation then becomes
eta(S, t+1) = eta(S, t) * eval(S, t) / averagePopFitness(t) * [1 - Pc]
Discuss the mechanics of the algorithm when Pc = 0 and Pc = 1 under the following conditions:
a- low population size
b- high population size
c- Elitism
(6 points)
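As a purely algebraic starting point for the discussion (writing $\bar{F}(t)$ for averagePopFitness(t)), the simplified equation at the two extremes reads:

$P_c = 0:\quad \eta(S,t+1) = \eta(S,t)\,\dfrac{eval(S,t)}{\bar{F}(t)}$

$P_c = 1:\quad \eta(S,t+1) = \eta(S,t)\,\dfrac{eval(S,t)}{\bar{F}(t)}\,[1-1] = 0$

How these extremes interact with population size and elitism is what parts a-c ask you to discuss.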
4- Consider a population of PopSize individuals, each a bit-string of length L. Let the frequency of allele 1 at position i be 0.3; that is, 30% of all individuals contain a 1 at that position and 70% contain a 0. How does this allele frequency change after performing k one-point crossover operations?
(6 points)
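As an illustration of the setting (population size, string length, position i and the number of crossovers below are arbitrary assumptions), the small simulation applies one-point crossover to randomly chosen pairs, with both offspring replacing both parents, and compares the allele frequency at position i before and after:

import random

random.seed(0)
POP, L, K, I = 100, 20, 1000, 7     # population size, string length, crossovers, position i
pop = [[1 if random.random() < 0.3 else 0 for _ in range(L)] for _ in range(POP)]

def freq(population, i):
    return sum(ind[i] for ind in population) / len(population)

before = freq(pop, I)
for _ in range(K):
    a, b = random.sample(range(POP), 2)   # two distinct parents
    c = random.randint(1, L - 1)          # one-point crossover position
    # Each crossover only exchanges the alleles the two parents already carry.
    pop[a], pop[b] = pop[a][:c] + pop[b][c:], pop[b][:c] + pop[a][c:]

print(before, freq(pop, I))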
5- Calculate the probability that a binary chromosome of length L will not be changed by applying the usual bit-flip mutation with Pm = 1/L.
(6 points)
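One way to set up the calculation, assuming the L bit positions are mutated independently, each flipped with probability $P_m$:

$P(\text{unchanged}) = (1 - P_m)^L = \left(1 - \dfrac{1}{L}\right)^{L} \approx e^{-1} \approx 0.368$ for large $L$.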
Neural Networks
6- a- Derive the Generalized Delta Rule for the output layer.
b- Discuss the effect of the learning rate and momentum terms on the training process.
(6 points)
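For reference, a sketch of the quantities that appear in one conventional derivation, assuming a sum-of-squares error $E$ and a differentiable activation $f$ (the derivation itself and the discussion in part b are left to the answer):

$E = \tfrac{1}{2}\sum_k (t_k - o_k)^2, \qquad o_k = f(net_k), \qquad net_k = \sum_j w_{jk}\, o_j$

$\delta_k = -\dfrac{\partial E}{\partial net_k} = (t_k - o_k)\, f'(net_k)$

$\Delta w_{jk}(n) = \eta\, \delta_k\, o_j + \alpha\, \Delta w_{jk}(n-1)$, where $\eta$ is the learning rate and $\alpha$ the momentum coefficient.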
7- [Diagram: a small feedforward network of threshold units with inputs X1 and X2 and output Y; the connection weights are the +1 and -1 values marked on the diagram, and each unit uses the activation f(x) = 1 if x >= 0, 0 if x < 0.]
Compute the output Y for inputs (X1, X2) equal to (0,0), (0,1), (1,0) and (1,1).
What function do you think this network emulates?
(6 points)
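The exact topology cannot be recovered here from the figure, so the sketch below only shows how the computation would be organised: a generic evaluator for a two-input network of step units with one hidden layer. The weight and bias values W_HIDDEN, B_HIDDEN, W_OUT and B_OUT are hypothetical placeholders, not the values from the original diagram.

def step(x):
    return 1 if x >= 0 else 0

# Hypothetical weights/biases for illustration only -- substitute the actual
# +1/-1 values and thresholds read off the figure.
W_HIDDEN = [[+1, +1], [-1, -1]]   # rows: hidden unit, columns: (X1, X2)
B_HIDDEN = [-1, +1]
W_OUT = [+1, +1]
B_OUT = -2

def forward(x1, x2):
    hidden = [step(w[0] * x1 + w[1] * x2 + b) for w, b in zip(W_HIDDEN, B_HIDDEN)]
    return step(sum(w * h for w, h in zip(W_OUT, hidden)) + B_OUT)

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print((x1, x2), "->", forward(x1, x2))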
8- A fragment of a NN comprising 4 neurons is shown below. N1, N2 and N3 are in the hidden layer and N4 is in the output layer. I(N1) = 0.9, o(N4) = 0.5, and the error e at N4 = 0.3. The weights are w14 = 0.6, w24 = 0.4 and w34 = 0.7, and the learning rate = 0.03. Update the value of w14 by the backpropagation algorithm. Also compute the back-propagated error at neuron N1. (6 points)
[Diagram: hidden neurons N1, N2 and N3 each connect to output neuron N4 through the weights w14, w24 and w34 respectively.]
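A sketch of the standard update under commonly assumed conventions (sigmoid activation at N4, e taken as target minus output, and I(N1) treated as the signal N1 feeds to N4); if the course uses different conventions the numbers change accordingly:

# Assumed conventions: delta = e * o * (1 - o) for a sigmoid output unit,
# weight update w += lr * delta * input, back-propagated error = w * delta.
o4, e, lr = 0.5, 0.3, 0.03
o1 = 0.9                           # I(N1), taken here as N1's contribution to N4
w14 = 0.6

delta4 = e * o4 * (1 - o4)         # 0.3 * 0.5 * 0.5 = 0.075
w14_new = w14 + lr * delta4 * o1   # 0.6 + 0.03 * 0.075 * 0.9 = 0.602025
err_n1 = w14 * delta4              # error propagated back through w14 = 0.045

print(w14_new, err_n1)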
9- Differentiate between linear and nonlinear activation functions with respect to their effect on the training of feedforward neural networks.
(6 points)
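A compact illustration of the central point, sketched with NumPy and randomly chosen weights: a stack of layers with linear activations collapses to a single linear map, whereas inserting a nonlinearity (tanh here) does not.

import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x = rng.normal(size=3)

linear_deep = W2 @ (W1 @ x)            # two layers with linear activations
linear_collapsed = (W2 @ W1) @ x       # the single equivalent linear layer
nonlinear_deep = W2 @ np.tanh(W1 @ x)  # a nonlinearity breaks the collapse

print(np.allclose(linear_deep, linear_collapsed))     # True
print(np.allclose(nonlinear_deep, linear_collapsed))  # False in general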
Hybrid Systems
10- Show how fuzzy rules that model a particular system can be evolved using genetic algorithms.
(6 points)
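One possible scheme, sketched as an outline rather than a full implementation: encode a candidate rule base as a chromosome (here, simply the consequent fuzzy set of every rule over a fixed grid of antecedent pairs), score it by how well the resulting fuzzy system reproduces observed behaviour, and apply the usual GA operators. The encoding, the toy fitness function and all names below are illustrative assumptions.

import random

SETS = ["L", "M", "H"]
N_RULES = 9                        # 3 antecedent sets for x1 times 3 for x2

def fitness(chrom, data):
    # In practice: build the rule base from chrom, run fuzzy inference on training
    # data and return, e.g., the negative sum of squared output errors.  Here a toy
    # stand-in rewards agreement with observed (rule_index, output_set) samples.
    return sum(1 for idx, out in data if chrom[idx] == out)

def evolve(data, pop_size=30, gens=50, pc=0.8, pm=0.1):
    pop = [[random.choice(SETS) for _ in range(N_RULES)] for _ in range(pop_size)]
    for _ in range(gens):
        ranked = sorted(pop, key=lambda c: fitness(c, data), reverse=True)
        pop = [r[:] for r in ranked[:2]]                  # elitism
        while len(pop) < pop_size:
            p1, p2 = random.sample(ranked[:10], 2)        # truncation selection
            cut = random.randint(1, N_RULES - 1)
            child = p1[:cut] + p2[cut:] if random.random() < pc else p1[:]
            child = [random.choice(SETS) if random.random() < pm else g for g in child]
            pop.append(child)
    return max(pop, key=lambda c: fitness(c, data))

# Toy "observed behaviour" of the system being modelled.
observations = [(0, "L"), (4, "M"), (8, "H"), (2, "M")]
print(evolve(observations))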
11- Show how genetic algorithms can be used to train neural networks.
(6 points)
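A minimal sketch of the usual neuro-evolution idea (network size, GA parameters and the XOR task below are illustrative assumptions): flatten the network's weights into a real-valued chromosome, use the network's error on training data as the fitness, and apply selection, crossover and mutation to weight vectors in place of gradient descent.

import math, random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_W = 9                            # 2-2-1 network: 4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def forward(w, x):
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):                    # negative sum of squared errors on the data set
    return -sum((forward(w, x) - t) ** 2 for x, t in XOR)

def train(pop_size=50, gens=200, pm=0.2):
    pop = [[random.uniform(-1, 1) for _ in range(N_W)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        new = [w[:] for w in pop[:2]]                     # elitism
        while len(new) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)
            cut = random.randint(1, N_W - 1)
            child = p1[:cut] + p2[cut:]                   # one-point crossover on weights
            child = [g + random.gauss(0, 0.3) if random.random() < pm else g for g in child]
            new.append(child)
        pop = new
    return max(pop, key=fitness)

best = train()
print([round(forward(best, x), 2) for x, _ in XOR])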
AMR BADR