
Transform and Conquer

Objectives
To learn and appreciate the following concepts:

Presorting
Balanced Search Trees
Heaps and Heapsort
Problem Reduction



Session outcome
• At the end of the session, the student will be able to understand:

• What is presorting

• Understanding balanced search trees

• Importance of heaps and heapsort

• Problem reduction



Transform and Conquer

• Two stages:
1. The transformation stage: the problem's instance is modified to be, for one reason or another, more amenable to solution.
2. The conquering stage: the transformed instance is solved.



The three major variations of the transform & conquer differ by the way a given instance is transformed:

1. INSTANCE SIMPLIFICATION:
Transformation of an instance of a problem to an instance of the same
problem with some special property that makes the problem easier to
solve.
Example: list pre-sorting, AVL trees



The three major variations of the transform & conquer differ by the way a given instance is transformed:

2. REPRESENTATION CHANGE:
Changing one representation of a problem’s instance into another
representation of the same instance.
Example: 2-3 trees, heaps and heapsort



The three major variations of the transform & conquer differ by the way a given instance is transformed:

3. PROBLEM REDUCTION:
Transforming a given problem to another problem that can be solved by a
known algorithm.
Example: reduction to graph problems, reduction to linear programming



Presorting
• Presorting is an example of instance simplification.
• Presorting is an old idea in computer science. The time efficiency of algorithms that involve sorting may depend on the efficiency of the sorting algorithm being used.
• Some simple presorting examples follow.



Presorting- Checking element uniqueness
• EXAMPLE 1: Checking element uniqueness in an array. If this element uniqueness problem looks familiar to you, it should; we considered a brute-force algorithm for it earlier.
• The brute-force algorithm compared pairs of the array's elements until either two equal elements were found or no more pairs were left. Its worst-case efficiency was in Θ(n^2).
• Alternatively, we can sort the array first and then check only its consecutive elements: if the array has equal elements, a pair of them must be next to each other, and vice versa.



Presorting- Checking element uniqueness
• The presorting-based algorithm PresortElementUniqueness sorts the array and then checks its consecutive elements.
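A minimal Python sketch of this idea (assuming an O(n log n) comparison sort, here Python's built-in sorted; this is an illustration, not the textbook's pseudocode):

```python
def presort_element_uniqueness(a):
    """Return True if all elements of list a are distinct."""
    s = sorted(a)                    # transformation stage: sort a copy of the input
    for i in range(len(s) - 1):      # conquering stage: scan consecutive elements
        if s[i] == s[i + 1]:         # after sorting, any duplicates are adjacent
            return False
    return True

print(presort_element_uniqueness([7, 3, 5, 1]))   # True
print(presort_element_uniqueness([7, 3, 5, 3]))   # False
```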





Presorting- Checking element uniqueness

• The running time of this algorithm is the sum of the time spent on sorting
and the time spent on checking consecutive elements.
• Since the former requires at least n log n comparisons and the latter
needs no more than n − 1 comparisons, it is the sorting part that will
determine the overall efficiency of the algorithm.
• So, if we use a quadratic sorting algorithm here, the entire algorithm will not be more efficient than the brute-force one. But if we use a good sorting algorithm, such as mergesort, with worst-case efficiency in Θ(n log n), the worst-case efficiency of the entire presorting-based algorithm will also be in Θ(n log n):
T(n) = Tsort(n) + Tscan(n) ∈ Θ(n log n) + Θ(n) = Θ(n log n).



Presorting- Computing a mode

• EXAMPLE 2: Computing a mode. A mode is a value that occurs most often in a given list of numbers. For example, for 5, 1, 5, 7, 6, 5, 7, the mode is 5. (If several different values occur most often, any of them can be considered a mode.)
• The brute-force approach to computing a mode would scan the list and compute the frequencies of all its distinct values, then find the value with the largest frequency.



Presorting- Computing a mode
• We can store the values already encountered, along with their frequencies, in a separate list.
• On each iteration, the ith element of the original list is compared with
the values already encountered by traversing this auxiliary list.
• If a matching value is found, its frequency is incremented; otherwise, the
current element is added to the list of distinct values seen so far with a
frequency of 1.



Presorting- Computing a mode
• It is not difficult to see that the worst-case input for this algorithm is a list with no
equal elements.
• For such a list, its ith element is compared with i − 1 elements of the auxiliary list
of distinct values seen so far before being added to the list with a frequency of 1.
• As a result, the worst-case number of comparisons made by this algorithm in creating the frequency list is:
C(n) = 0 + 1 + ... + (n − 1) = (n − 1)n/2 ∈ Θ(n^2).
• The additional n − 1 comparisons needed to find the largest frequency in the auxiliary list do not change the quadratic worst-case efficiency class of the algorithm.
Presorting- Computing a mode
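A hedged Python sketch of the presorting-based mode computation: after sorting, equal values are adjacent, so a single scan that tracks the longest run of equal values finds a mode; the overall time is dominated by the Θ(n log n) sort.

```python
def presort_mode(a):
    """Return a mode of a non-empty list: sort, then find the longest run of equal values."""
    s = sorted(a)                     # equal values become adjacent
    i = 0
    mode_value, mode_frequency = s[0], 0
    while i < len(s):
        run_value, run_length = s[i], 1
        while i + run_length < len(s) and s[i + run_length] == run_value:
            run_length += 1           # extend the current run of equal values
        if run_length > mode_frequency:
            mode_value, mode_frequency = run_value, run_length
        i += run_length               # jump to the start of the next run
    return mode_value

print(presort_mode([5, 1, 5, 7, 6, 5, 7]))   # 5
```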


Presorting-Searching problem
• EXAMPLE 3: Searching problem. Consider the problem of searching for a given value v in a given array of n sortable items. The brute-force solution here is sequential search, which needs n comparisons in the worst case. If the array is sorted first, we can then apply binary search, which requires only ⌊log2 n⌋ + 1 comparisons in the worst case. Assuming the most efficient n log n sort, the total running time of such a searching algorithm in the worst case will be:
T(n) = Tsort(n) + Tsearch(n) = Θ(n log n) + Θ(log n) = Θ(n log n),
• which is inferior to sequential search. The same will also be true for the average-case efficiency. Of course, if we are to search in the same list more than once, the time spent on sorting might well be justified.
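A small sketch of the sort-then-binary-search idea using Python's standard bisect module; as noted above, the one-time sorting cost pays off only when the same list is searched repeatedly.

```python
from bisect import bisect_left

def presort_search(a, queries):
    """Sort once, then answer each membership query with binary search."""
    s = sorted(a)                                  # one-time O(n log n) cost
    results = []
    for v in queries:
        i = bisect_left(s, v)                      # O(log n) per query
        results.append(i < len(s) and s[i] == v)
    return results

print(presort_search([9, 4, 7, 1, 6], [7, 5, 1]))  # [True, False, True]
```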



Presorting - TRY
1. Consider the problem of finding the distance between the two closest
numbers in an array of n numbers. (The distance between two numbers x
and y is computed as |x − y|.)
• a. Design a presorting-based algorithm for solving this problem and
determine its efficiency class.
• b. Compare the efficiency of this algorithm with that of the brute-force
algorithm.



Presorting - TRY
2. Given a set of n ≥ 3 points in the Cartesian plane, connect them in a simple
polygon, i.e., a closed path through all the points so that its line segments (the
polygon’s edges) do not intersect (except for neighboring edges at their common
vertex). For example:

• a. Does the problem always have a solution? Does it always have a unique
solution?
• b. Design a reasonably efficient algorithm for solving this problem and indicate
its efficiency class.
Balanced Search Trees
• Computer scientists have expended a lot of effort in trying to find a structure that preserves the
good properties of the classical binary search tree—principally, the logarithmic efficiency of the
dictionary operations and having the set’s elements sorted—but avoids its worst-case degeneracy.
They have come up with two approaches:
(a) The first approach is of the instance-simplification variety:
• An unbalanced binary search tree is transformed into a balanced one. Because of this, such trees are
called self-balancing.
• Specific implementations of this idea differ by their definition of balance.
• An AVL tree requires that the difference between the heights of the left and right subtrees of every node never exceed 1.
• A red-black tree tolerates the height of one subtree being twice as large as the other subtree of the
same node.
• If an insertion or deletion of a new node creates a tree with a violated balance requirement, the tree
is restructured by one of a family of special transformations called rotations that restore the balance
required.
Balanced Search Trees
(b) The second approach is of the representation-change variety:
• Allow more than one element in a node of a search tree.
• Specific cases of such trees are 2-3 trees, 2-3-4 trees, and more general
and important B-trees.
• They differ in the number of elements admissible in a single node of a
search tree, but all are perfectly balanced.



Balanced Search Trees – AVL Trees
• AVL trees were invented in 1962 by two Russian scientists, G. M. Adelson-Velsky and E. M.
Landis [Ade62], after whom this data structure is named.

• DEFINITION: An AVL tree is a binary search tree in which the balance factor of every node,
which is defined as the difference between the heights of the node’s left and right
subtrees, is either 0 or +1 or −1. (The height of the empty tree is defined as −1. Of course,
the balance factor can also be computed as the difference between the numbers of levels
rather than the height difference of the node’s left and right subtrees.)
Balanced Search Trees – AVL Trees
• If an insertion of a new node makes an AVL tree unbalanced, we
transform the tree by a rotation.
• A rotation in an AVL tree is a local transformation of its subtree rooted at
a node whose balance has become either +2 or −2.
• If there are several such nodes, we rotate the tree rooted at the
unbalanced node that is the closest to the newly inserted leaf.
• There are only four types of rotations; in fact, two of them are mirror
images of the other two.



Balanced Search Trees – AVL Trees
• There are four rotation types, described below:

The first rotation type is called the single right rotation, or R-rotation. Note
that this rotation is performed after a new key is inserted into the left
subtree of the left child of a tree whose root had the balance of +1 before the
insertion.
The symmetric single left rotation, or L-rotation, is the mirror image of the
single R-rotation. It is performed after a new key is inserted into the right
subtree of the right child of a tree whose root had the balance of −1 before
the insertion
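A hedged sketch (not the textbook's implementation) of the single R-rotation on a minimal node class; the Node class and the on-demand height computation are illustrative assumptions.

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def height(node):
    """Height of a subtree; the empty tree has height -1."""
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    """Height of the left subtree minus height of the right subtree."""
    return height(node.left) - height(node.right)

def rotate_right(r):
    """Single R-rotation: promote the left child c of r; r becomes c's right child.

    The subtree that was c's right child stays between c and r, preserving
    the binary-search-tree order. Returns the new subtree root.
    """
    c = r.left
    r.left = c.right
    c.right = r
    return c

# Keys 3, 2, 1 inserted in that order make the root's balance factor +2.
root = Node(3, left=Node(2, left=Node(1)))
print(balance_factor(root))                      # 2 -> unbalanced
root = rotate_right(root)
print(root.key, root.left.key, root.right.key)   # 2 1 3
```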





Balanced Search Trees – AVL Trees
• The second rotation type is called the double left-right rotation (LR-rotation). It is, in fact, a combination of two rotations: we perform the L-rotation of the left subtree of root r followed by the R-rotation of the new tree rooted at r.
• It is performed after a new key is inserted into the right subtree of the
left child of a tree whose root had the balance of +1 before the insertion.
• The double right-left rotation (RL-rotation) is the mirror image of the
double LR-rotation and is left for the exercises.



Balanced Search Trees – AVL Trees

The height h of any AVL tree with n nodes satisfies the inequalities

⌊log2 n⌋ ≤ h < 1.4405 log2(n + 2) − 1.3277.



Balanced Search Trees – 2 – 3 Trees
• A 2-3 tree is a tree that can have nodes of two kinds: 2-nodes and 3-
nodes.
• A 2-node contains a single key K and has two children: the left child
serves as the root of a subtree whose keys are less than K, and the right
child serves as the root of a subtree whose keys are greater than K. (In
other words, a 2-node is the same kind of node we have in the classical
binary search tree.)
• A 3-node contains two ordered keys K1 and K2 (K1 < K2) and has three
children. The leftmost child serves as the root of a subtree with keys less
than K1, the middle child serves as the root of a subtree with keys
between K1 and K2, and the rightmost child serves as the root of a
subtree with keys greater than K2
Balanced Search Trees – 2 – 3 Trees

Two kinds of nodes of a 2-3 tree



Properties of a 2-3 tree:
• Data is stored in sorted order.
• It is a balanced tree.
• All the leaf nodes are at the same level.
• Each node can be a leaf, a 2-node, or a 3-node.
• Insertion is always done at a leaf.



Search
To search for a key K in a given 2-3 tree T, we use the following procedure (a code sketch follows the list):
• Base cases:
1. If T is empty, return False (the key cannot be found in the tree).
2. If the current node contains a data value equal to K, return True.
3. If we reach a leaf node and it does not contain the required key value K, return False.
• Recursive step: otherwise, compare K with the key(s) in the current node and continue the search in the child subtree whose key range can contain K.
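A minimal sketch of this search procedure on an assumed node representation (one or two sorted keys per node plus a list of children); it is an illustration, not a complete 2-3 tree implementation.

```python
class Node23:
    def __init__(self, keys, children=None):
        self.keys = keys                  # one key (2-node) or two sorted keys (3-node)
        self.children = children or []    # empty list for a leaf

def search_23(node, k):
    """Return True if key k occurs in the 2-3 tree rooted at node."""
    if node is None:                      # empty tree
        return False
    if k in node.keys:                    # found in the current node
        return True
    if not node.children:                 # reached a leaf without finding k
        return False
    # Recursive step: pick the child whose key range can contain k.
    if k < node.keys[0]:
        return search_23(node.children[0], k)
    if len(node.keys) == 1 or k < node.keys[1]:
        return search_23(node.children[1], k)
    return search_23(node.children[2], k)

# 2-3 tree with a 3-node root (5, 9) and three leaf children.
root = Node23([5, 9], [Node23([2, 3]), Node23([7]), Node23([12])])
print(search_23(root, 7), search_23(root, 8))    # True False
```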



Insertion: There are three possible cases, discussed below:

• Case 1: Insert in a node with only one data element



Case 2: Insert in a node with two data elements
whose parent contains only one data element.



Case 3: Insert in a node with two data elements whose parent also contains two data elements



Balanced Search Trees – 2 – 3 Trees

Construction of a 2-3 tree for the list 9, 5, 8, 3, 2, 4, 7.

The time efficiencies of searching, insertion, and deletion are all in Θ(log n) in both the worst and average case.



Balanced Search Trees - TRY
1. Which of the following binary trees are AVL trees?

2.a. For n = 1, 2, 3, 4, and 5, draw all the binary trees with n nodes that satisfy the balance requirement of
AVL tree.
b. Draw a binary tree of height 4 that can be an AVL tree and has the smallest number of nodes among all
such trees.
3. a. Construct a 2-3 tree for the list C, O, M, P, U, T, I, N, G. Use the alphabetical order of the letters and
insert them successively starting with the empty tree.
b. Assuming that the probabilities of searching for each of the keys (i.e., the letters) are the same, find the
largest number and the average number of key comparisons for successful searches in this tree.
What is a heap?
• A heap is a complete binary tree with keys assigned to its nodes; a binary tree is a tree in which each node can have at most two children.
• A complete binary tree is a binary tree in which all levels, except possibly the last, are completely filled, and all nodes of the last level are as far left as possible.



Heaps and Heapsort
• A priority queue is a multiset of items with an orderable characteristic called an item's priority, with the following operations:
• finding an item with the highest (i.e., largest) priority;
• deleting an item with the highest priority;
• adding a new item to the multiset.



Notion of the Heap
• DEFINITION: A heap can be defined as a binary tree with keys assigned to
its nodes, one key per node, provided the following two conditions are
met:
• 1. The shape property—the binary tree is essentially complete (or simply
complete), i.e., all its levels are full except possibly the last level, where
only some rightmost leaves may be missing.
• 2. The parental dominance or heap property—the key in each node is greater than or equal to the keys in its children. (This condition is considered automatically satisfied for all leaves.)



Heaps and Heapsort

Heap and its array representation



Heaps and Heapsort
Here is a list of important properties of heaps, which are not difficult to prove:
1. There exists exactly one essentially complete binary tree with n nodes; its height is equal to ⌊log2 n⌋.
2. The root of a heap always contains its largest element.
3. A node of a heap considered with all its descendants is also a heap.
4. A heap can be implemented as an array by recording its elements in the top-down, left-to-right fashion. In such an array H[1..n], the parental node keys are in the first ⌊n/2⌋ positions and the leaf keys occupy the last ⌈n/2⌉ positions; the children of a key in position i (1 ≤ i ≤ ⌊n/2⌋) are in positions 2i and 2i + 1, and, correspondingly, the parent of a key in position i (2 ≤ i ≤ n) is in position ⌊i/2⌋.


Heaps and Heapsort
Bottom-up construction of a heap for the list 2, 9, 7, 6, 5, 8. The double-headed arrows show key comparisons
verifying the parental dominance:

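A hedged sketch of the bottom-up construction on a 0-indexed Python list (the textbook's pseudocode is 1-indexed); sift_down is the step that restores parental dominance in the subtree rooted at index i.

```python
def sift_down(h, i, n):
    """Restore parental dominance for the subtree rooted at index i in h[0:n]."""
    while 2 * i + 1 < n:                    # while i has at least a left child
        j = 2 * i + 1                       # left child
        if j + 1 < n and h[j + 1] > h[j]:   # pick the larger of the two children
            j += 1
        if h[i] >= h[j]:                    # parental dominance already holds
            break
        h[i], h[j] = h[j], h[i]             # swap the parent with the larger child
        i = j

def heap_bottom_up(h):
    """Turn list h into a max-heap by sifting down every parental node."""
    n = len(h)
    for i in range(n // 2 - 1, -1, -1):     # last parent down to the root
        sift_down(h, i, n)
    return h

print(heap_bottom_up([2, 9, 7, 6, 5, 8]))   # [9, 6, 8, 2, 5, 7]
```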


Heaps and Heapsort

Deleting the root’s key from a heap: the key to be deleted is swapped with the last key, after which the smaller tree is “heapified” by exchanging the new key in its root with the larger key in its children until the parental dominance requirement is satisfied.
Insertion and Deletion in Heaps

• Process of deletion:
• Since deleting an element at an arbitrary position in the heap can be costly, we simply replace the element to be deleted with the last element and then delete the last element of the heap:
• Replace the root (or the element to be deleted) with the last element.
• Delete the last element from the heap.
• The last element is now placed at the position of the root node, so it may violate the heap property; therefore, heapify the element now at the root (see the sketch after this list).
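A self-contained sketch of these deletion steps for the maximum (root) key; the final loop is the sift-down ("heapify") step.

```python
def delete_max(h):
    """Delete and return the root (largest key) of a max-heap stored in list h."""
    last = len(h) - 1
    h[0], h[last] = h[last], h[0]     # step 1: swap the root with the last element
    maximum = h.pop()                 # step 2: delete the last element (the old root)
    n, i = len(h), 0
    while 2 * i + 1 < n:              # step 3: sift the new root down (heapify)
        j = 2 * i + 1
        if j + 1 < n and h[j + 1] > h[j]:
            j += 1                    # larger of the two children
        if h[i] >= h[j]:
            break
        h[i], h[j] = h[j], h[i]
        i = j
    return maximum

heap = [9, 6, 8, 2, 5, 7]
print(delete_max(heap), heap)         # 9 [8, 6, 7, 2, 5]
```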
Insertion in Heaps:

• Process of insertion: Elements can be inserted into the heap following an approach similar to the one discussed above for deletion. The idea is to (see the sketch after this list):
• First increase the heap size by 1, so that it can store the new element.
• Insert the new element at the end of the heap.
• This newly inserted element may violate the heap property with respect to its ancestors. So, in order to restore the heap property, heapify this newly inserted element following a bottom-up approach.
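A sketch of these insertion steps: append the new key, then sift it up while it is larger than its parent.

```python
def heap_insert(h, key):
    """Insert key into a max-heap stored in a 0-indexed list h."""
    h.append(key)                     # steps 1-2: grow the heap and place key at the end
    i = len(h) - 1
    while i > 0:                      # step 3: bottom-up heapify (sift up)
        parent = (i - 1) // 2
        if h[parent] >= h[i]:         # parental dominance restored
            break
        h[i], h[parent] = h[parent], h[i]
        i = parent

heap = [9, 6, 8, 2, 5, 7]
heap_insert(heap, 10)
print(heap)                           # [10, 6, 9, 2, 5, 7, 8]
```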



Heaps and Heapsort
• Heapsort is an interesting sorting algorithm discovered by J. W. J. Williams [Wil64]. It is a two-stage algorithm that works as follows:
• Stage 1 (heap construction): Construct a heap for a given array.
• Stage 2 (maximum deletions): Apply the root-deletion operation n − 1
times to the remaining heap.



Heaps and Heapsort
Since we already know that the heap construction stage of the algorithm is in O(n), we have to investigate just the time efficiency of the second stage. For the number of key comparisons, C(n), needed for eliminating the root keys from the heaps of diminishing sizes from n to 2, we get the following inequality:

C(n) ≤ 2⌊log2(n − 1)⌋ + 2⌊log2(n − 2)⌋ + ... + 2⌊log2 1⌋ ≤ 2(n − 1) log2(n − 1) ≤ 2n log2 n.

This means that C(n) ∈ O(n log n) for the second stage of heapsort. For both stages, we get O(n) + O(n log n) = O(n log n). A more detailed analysis shows that the time efficiency of heapsort is, in fact, in Θ(n log n) in both the worst and average cases.
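A self-contained sketch of the two-stage heapsort described above; after each of the n − 1 maximum deletions, the current largest key is parked at the end of the array, so the array ends up sorted in increasing order.

```python
def heapsort(a):
    """In-place heapsort: build a max-heap, then repeatedly move the root to the end."""
    def sift_down(i, n):
        while 2 * i + 1 < n:
            j = 2 * i + 1
            if j + 1 < n and a[j + 1] > a[j]:
                j += 1                          # larger of the two children
            if a[i] >= a[j]:
                break
            a[i], a[j] = a[j], a[i]
            i = j

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):         # stage 1: heap construction, O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):             # stage 2: n - 1 maximum deletions
        a[0], a[end] = a[end], a[0]             # move the current maximum to its final place
        sift_down(0, end)                       # re-heapify the remaining prefix
    return a

print(heapsort([2, 9, 7, 6, 5, 8]))             # [2, 5, 6, 7, 8, 9]
```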
Heaps and Heapsort - TRY
1. a. Construct a heap for the list 1, 8, 6, 5, 3, 7, 4 by the bottom-up algorithm.
• b. Construct a heap for the list 1, 8, 6, 5, 3, 7, 4 by successive key insertions
(top-down algorithm).
• c. Is it always true that the bottom-up and top-down algorithms yield the
same heap for the same input?

2. Sort the following lists by heapsort, using the array representation of heaps.
a. 1, 2, 3, 4, 5 (in increasing order)
b. 5, 4, 3, 2, 1 (in increasing order)
c. S, O, R, T, I, N, G (in alphabetical order)
Problem Reduction
• If you need to solve a problem, reduce it to another problem that you know how to solve.



Problem Reduction
• Examples:
• Computing the least common multiple
• Counting paths in a graph
• Reduction between maximization and minimization problems.
• Reduction to graph problems.
• Linear programming.



Computing the Least Common Multiple
• Recall that the least common multiple of two positive integers m and n, denoted lcm(m, n), is defined as the smallest integer that is divisible by both m and n.
• lcm(24, 60) = 120, and lcm(11, 5) = 55.



Calculate the least common multiple (LCM) of two numbers using the formula:

lcm(m, n) = (m × n) / gcd(m, n)


Example: LCM(24, 60) = ?

1. First, we need to find the greatest common divisor (GCD) of m and n. We can use Euclid's algorithm for this.
2. The GCD of 24 and 60 is 12.
3. Now, plug the values into the formula for LCM:
• LCM(24, 60) = (24 × 60) / 12
• Calculate the numerator: 24 × 60 = 1440
4. Divide the product by the GCD:
• LCM(24, 60) = 1440 / 12 = 120
• So, the LCM of 24 and 60 is 120. A short code sketch follows.
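A short sketch of the reduction: compute the GCD with Euclid's algorithm and then apply lcm(m, n) = (m × n) / gcd(m, n).

```python
def gcd(m, n):
    """Euclid's algorithm: gcd(m, n) = gcd(n, m mod n), and gcd(m, 0) = m."""
    while n != 0:
        m, n = n, m % n
    return m

def lcm(m, n):
    """Reduce lcm to gcd via lcm(m, n) = m * n // gcd(m, n)."""
    return m * n // gcd(m, n)

print(gcd(24, 60))      # 12
print(lcm(24, 60))      # 120
print(lcm(11, 5))       # 55
```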
Counting paths in a graph

• As our next example, we consider the problem of counting paths between two vertices in a graph.
• It is not difficult to prove by mathematical induction that the number of
different paths of length k > 0 from the ith vertex to the jth vertex of a
graph (undirected or directed) equals the (i, j)th element of A^k where A
is the adjacency matrix of the graph. (Incidentally, the exponentiation
algorithms we discussed before for computing powers of numbers are
applicable to matrices as well.)
• Thus, the problem of counting a graph's paths can be solved with an
algorithm for computing an appropriate power of its adjacency matrix.



Counting paths in a graph

• Consider, for example, a graph with four vertices a, b, c, d and edges a-b, a-c, a-d, and c-d. Its adjacency matrix A and its square A^2 indicate the number of paths of length 1 and 2, respectively, between the corresponding vertices of the graph.
• In particular, there are three paths of length 2 that start and end at vertex a: a-b-a, a-c-a, and a-d-a, but only one path of length 2 from a to c: a-d-c.
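A sketch that squares the adjacency matrix of this four-vertex example; entry A^2[i][j] counts the paths of length 2 from vertex i to vertex j.

```python
def matrix_multiply(x, y):
    """Multiply two square integer matrices given as lists of rows."""
    n = len(x)
    return [[sum(x[i][k] * y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Adjacency matrix for vertices a, b, c, d (edges a-b, a-c, a-d, c-d).
A = [[0, 1, 1, 1],
     [1, 0, 0, 0],
     [1, 0, 0, 1],
     [1, 0, 1, 0]]

A2 = matrix_multiply(A, A)
print(A2[0][0])   # 3 paths of length 2 from a to a: a-b-a, a-c-a, a-d-a
print(A2[0][2])   # 1 path of length 2 from a to c: a-d-c
```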
Reduction between maximization and minimization problems

• Our next example deals with solving optimization problems.


• If a problem asks to find a maximum of some function, it is said to be a
maximization problem;
• if it asks to find a function's minimum, it is called a minimization problem.



Reduction between maximization and minimization problems

• Suppose now that you have to find a minimum of some function f(x) and you know an algorithm for maximizing the function.
• The answer lies in the simple formula
min f(x) = −max[−f(x)].
• This formula suggests that to find the minimum of a function f(x), you can maximize its negative −f(x) instead. Then, to obtain the correct minimum value of the original function, you simply change the sign of the maximum value found.
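A tiny illustration of min f(x) = −max[−f(x)] over a finite set of candidate points, assuming only a maximization routine is available; the function and candidate range are arbitrary examples.

```python
def maximize(f, candidates):
    """Return the largest value of f over a finite set of candidate points."""
    return max(f(x) for x in candidates)

def minimize(f, candidates):
    """Reduce minimization to maximization: min f(x) = -max[-f(x)]."""
    return -maximize(lambda x: -f(x), candidates)

f = lambda x: (x - 3) ** 2 + 1
print(minimize(f, range(-10, 11)))   # 1, attained at x = 3
```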



Reduction between maximization and minimization problems

• Similarly, max f(x) = −min[−f(x)] is valid as well; it shows how a maximization problem can be reduced to an equivalent minimization problem.
• This property simplifies optimization problem-solving by allowing
algorithms designed for one type of optimization (maximization or
minimization) to be adapted easily to solve the other type.



Linear Programming.

• A linear programming problem is a problem of optimizing a linear function of several variables subject to constraints in the form of linear equations and linear inequalities.



Linear Programming.

• Problem statement: The university endowment needs to invest $100 million in stocks, bonds, and cash to maximize its return. Each investment type offers a different annual return: 10% for stocks, 7% for bonds, and 3% for cash.
• Constraints:
• The amount invested in stocks cannot exceed one-third of the amount
invested in bonds.
• At least 25% of the total amount invested in stocks and bonds must be
invested in cash.
• The total amount invested in all three types of investments must sum up to
$100 million.
• How should the managers invest the money to maximize the return?



• Objective Function: The goal is to maximize the return
on investment, which is represented by the sum of returns
from each investment type multiplied by the amount
invested in each type.
• Maximize: 0.10x + 0.07y + 0.03z
• where x, y, and z represent the amounts (in millions of dollars) invested in stocks, bonds, and cash, respectively.
• Constraints (a SciPy-based sketch follows):
x + y + z = 100 (total investment constraint)
x ≤ (1/3)y (limit on stock investment relative to bonds)
z ≥ 0.25(x + y) (at least 25% of the stock and bond total invested in cash)
x ≥ 0, y ≥ 0, z ≥ 0 (non-negativity constraints)
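A sketch of solving this instance with SciPy's linprog (assuming SciPy is available). Since linprog minimizes, the objective coefficients are negated, and the two inequality constraints are rewritten in the A_ub · x ≤ b_ub form.

```python
from scipy.optimize import linprog

# Variables: x (stocks), y (bonds), z (cash), in millions of dollars.
c = [-0.10, -0.07, -0.03]            # negate to maximize 0.10x + 0.07y + 0.03z
A_ub = [[1, -1/3, 0],                # x - y/3 <= 0        (x <= y/3)
        [0.25, 0.25, -1]]            # 0.25x + 0.25y - z <= 0   (z >= 0.25(x + y))
b_ub = [0, 0]
A_eq = [[1, 1, 1]]                   # x + y + z = 100
b_eq = [100]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                 bounds=[(0, None)] * 3)
print(result.x)                      # roughly [20, 60, 20]
print(-result.fun)                   # maximum return, about 6.8 (million dollars)
```

The reported optimum should be roughly x = 20, y = 60, z = 20 (in millions), matching the hand calculation: 0.10 × 20 + 0.07 × 60 + 0.03 × 20 = 6.8 million dollars.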
EXAMPLE 2: The knapsack problem. Selecting the most valuable subset of n items that fits into a knapsack of capacity W can be posed as an integer linear programming problem: maximize v1x1 + ... + vnxn subject to w1x1 + ... + wnxn ≤ W, where each xj is 1 if item j is selected and 0 otherwise.


Reduction to Graph Problems
• Using a state-space graph.
• States: Nodes or vertices in the graph represent individual states or
configurations of a system. Each state captures relevant information about
the problem being modeled.
• Transitions: Directed edges or arcs between states represent possible
transitions or actions that can be taken to move from one state to another.
These transitions typically correspond to allowable moves, decisions, or
events in the problem domain.
• Initial State: One of the states in the graph is designated as the initial state,
representing the starting point of the system or problem.
• Goal State(s): One or more states in the graph are designated as goal states,
representing the desired outcomes or solutions of the problem.
