Soft Computing Techniques

Week-08 Assignment-08 (Solution)

1. (d). In non-dominated rank comparison, for two given solutions i and j, solution i is preferred
over solution j if R_i < R_j.

2. (a). The crowding distance of a solution in NSGA-II is calculated by measuring the Euclidean distance
between consecutive solutions in the objective space.
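
As an illustration, below is a minimal sketch of how the crowding distance is computed in Deb et al.'s NSGA-II formulation: the front is sorted along each objective, boundary solutions receive an infinite distance, and each interior solution accumulates the normalized gap between its two sorted neighbours. The function name and array layout are assumptions for illustration only.

```python
import numpy as np

def crowding_distance(front_objs):
    """Crowding distance for one non-dominated front.

    front_objs: (n_solutions, n_objectives) array of objective values.
    Returns an array of crowding distances (boundary solutions -> inf).
    """
    n, m = front_objs.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(front_objs[:, k])        # sort the front along objective k
        f_min, f_max = front_objs[order[0], k], front_objs[order[-1], k]
        dist[order[0]] = dist[order[-1]] = np.inf   # boundary solutions are always kept
        if f_max == f_min:
            continue                                # no spread along this objective
        for i in range(1, n - 1):
            gap = front_objs[order[i + 1], k] - front_objs[order[i - 1], k]
            dist[order[i]] += gap / (f_max - f_min) # normalized neighbour gap
    return dist
```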

3. (b). During selection in NSGA-II, solutions with lower crowding distance are less likely to be selected.
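
Answers 1 and 3 together describe NSGA-II's crowded-comparison operator: the lower non-dominated rank wins, and within the same rank the larger crowding distance wins. A minimal sketch (names are illustrative):

```python
def crowded_compare(rank_i, dist_i, rank_j, dist_j):
    """Return True if solution i is preferred over solution j in NSGA-II."""
    if rank_i != rank_j:
        return rank_i < rank_j   # lower non-dominated rank is preferred
    return dist_i > dist_j       # same rank: larger crowding distance is preferred
```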

4. (a). NSGA follows stochastic remainder selection to create the mating pool.
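
A minimal sketch of stochastic remainder selection, assuming fitness-proportionate expected counts (higher fitness is better); the function name and the fallback for all-zero fractional parts are assumptions:

```python
import random

def stochastic_remainder_selection(fitness, pool_size):
    """Build a mating pool of individual indices by stochastic remainder selection."""
    avg = sum(fitness) / len(fitness)
    expected = [f / avg for f in fitness]        # expected copies per individual
    pool = []
    for i, e in enumerate(expected):
        pool.extend([i] * int(e))                # deterministic: integer parts
    fractions = [e - int(e) for e in expected]   # stochastic: fractional parts
    if sum(fractions) == 0:
        fractions = [1.0] * len(fitness)         # fallback: fill uniformly
    while len(pool) < pool_size:
        i = random.choices(range(len(fitness)), weights=fractions, k=1)[0]
        pool.append(i)
    return pool[:pool_size]
```

For example, with fitness values [1.0, 2.0, 3.0] the average is 2.0, so each individual is first copied floor(f_i / 2.0) times and the remaining slots are filled in proportion to the fractional remainders.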

5. (d). NSGA-II follows the crowding-distance strategy to keep the solutions diverse.

6. (a) and (c). The advantages of the Vector Optimized Evolution Strategy (VOES) are improved convergence,
i.e., VOES converges faster than traditional ES algorithms, and multi-objective optimization, i.e., VOES
can handle multi-objective optimization problems, which is very useful in many real-world applications.

7. 0.3, range is 0.299 to 0.301. We know that
$$\sum_i w_i = 1 \;\Longrightarrow\; w_1 + w_2 + w_3 = 1 \;\Longrightarrow\; 0.2 + 0.5 + w_3 = 1 \;\Longrightarrow\; w_3 = 0.3$$
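
A quick check of the arithmetic in Python (variable names assumed):

```python
w1, w2 = 0.2, 0.5
w3 = 1.0 - (w1 + w2)    # the weights must sum to 1
print(round(w3, 3))     # 0.3 (raw floating point gives ~0.30000000000000004)
```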

8. (c). Using WBGA, F(x) is given by
$$F(x) = w_1 f_1(x) + w_2 f_2(x) = 0.5\,x^2 + 0.5\left(-(x-2)^2\right) = 0.5\,x^2 - 0.5\,(x-2)^2$$
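
As a sketch of the same weighted-sum construction, assuming f1(x) = x^2, f2(x) = -(x - 2)^2 and w1 = w2 = 0.5 as above (the function name is illustrative):

```python
def weighted_fitness(x, w1=0.5, w2=0.5):
    """WBGA-style weighted-sum fitness F(x) = w1*f1(x) + w2*f2(x)."""
    f1 = x ** 2               # first objective
    f2 = -(x - 2) ** 2        # second objective
    return w1 * f1 + w2 * f2  # = 0.5*x^2 - 0.5*(x - 2)^2
```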

9. (a). We find the functional values of the given individuals x = 1, 2, 3, 4:

Individual x    Functional value F(x)
1               F(1) = 0.5(1)^2 - 0.5(1 - 2)^2 = 0
2               F(2) = 0.5(2)^2 - 0.5(2 - 2)^2 = 2
3               F(3) = 0.5(3)^2 - 0.5(3 - 2)^2 = 4
4               F(4) = 0.5(4)^2 - 0.5(4 - 2)^2 = 6

Since we have to minimize the given function, individual x = 1 is selected.
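
Reusing the weighted_fitness sketch from question 8, the table above can be reproduced and the minimizing individual selected as follows:

```python
individuals = [1, 2, 3, 4]
scores = {x: weighted_fitness(x) for x in individuals}
print(scores)                        # {1: 0.0, 2: 2.0, 3: 4.0, 4: 6.0}
best = min(scores, key=scores.get)   # minimization: smallest F(x) wins
print(best)                          # 1
```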

10. (b). The Niched-Pareto Genetic Algorithm uses a tournament selection scheme based on Pareto dominance.
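
For the Niched-Pareto GA, a hedged sketch of such a Pareto-domination tournament: two candidates are compared against a randomly drawn comparison set, a candidate dominated by the set loses, and ties are broken by the smaller niche (sharing) count. Minimization of all objectives, the function names, and the tie-break shown here are assumptions.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization assumed)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def npga_tournament(population, objs, niche_counts, t_dom=10):
    """Pick one parent by an NPGA-style Pareto-domination tournament.

    population   : list of individuals
    objs         : list of objective vectors, aligned with population
    niche_counts : niche (sharing) count of each individual
    t_dom        : size of the comparison set
    """
    i, j = random.sample(range(len(population)), 2)
    comparison = random.sample(range(len(population)), min(t_dom, len(population)))
    i_dominated = any(dominates(objs[c], objs[i]) for c in comparison)
    j_dominated = any(dominates(objs[c], objs[j]) for c in comparison)
    if i_dominated and not j_dominated:
        return population[j]
    if j_dominated and not i_dominated:
        return population[i]
    # Tie (both or neither dominated): prefer the less crowded niche.
    return population[i] if niche_counts[i] <= niche_counts[j] else population[j]
```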
