
Divide & conquer

Multiplication
Counting
Greedy
Graph
Shortest paths
Wk4/5/6/7

1
Divide and Conquer
1) Divide your problem into subproblems
2) Solve the subproblems recursively, that is, run
the same algorithm on the subproblems (when
the subproblems are very small, solve them
from scratch)
3) Combine the solutions to the subproblems
into a solution of the original problem

2
Divide and Conquer
Traditional: x^n = x · x · … · x (n times)?

In divide and conquer we assume n = 2^i.

A) Incremental approach, // traditional


power(x, n)
{
  f = 1;
  for i = 1 to n
    f = f * x;
  return f;
}

3
b) Divide and conquer:
x^n = x^(n/2) · x^(n/2)
x^(n/2) = x^(n/4) · x^(n/4)
…
x^2 = x^1 · x^1

DividePower(x, n)
{
  if (n == 1)
    return x;
  else {
    h = DividePower(x, n/2);
    return h * h;
  }
}
Which is better: returning h*h (one recursive call), or returning DividePower(x, n/2) * DividePower(x, n/2) (two recursive calls, which brings the running time back up to Θ(n))?

4
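To make the comparison concrete, here is a minimal Python sketch of both versions (assuming n is a power of two for the divide-and-conquer variant, as on the slide); the names power_iterative and power_dc are illustrative, not from the notes.

def power_iterative(x, n):
    """Incremental approach: n multiplications, Theta(n)."""
    f = 1
    for _ in range(n):
        f = f * x
    return f

def power_dc(x, n):
    """Divide and conquer: one recursive call per halving, Theta(lg n)."""
    if n == 1:
        return x
    h = power_dc(x, n // 2)   # compute x^(n/2) once
    return h * h              # reuse it; calling power_dc twice here would give Theta(n)

# quick check (n is a power of two)
assert power_iterative(3, 8) == power_dc(3, 8) == 3 ** 8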
Divide power
T(n) = a·T(n/b) + D(n) + C(n)
     = 1·T(n/2) + c1 + c2

T(n) = T(n/2) + c

Let n = 2^k, so k = lg n. Then
t_k = t_{k−1} + c
t_k − t_{k−1} = c · 1^k · k^0 → characteristic equation (x − 1)(x − 1) = 0 → r1 = r2 = 1
t_k = c1·1^k + c2·k·1^k
t_k ≈ c2·k
t_n = c2·lg n = Θ(lg n)

5
Example
T(n) = 7T(n/2) + c·n²
a = 7, b = 2, i = 2; a > b^i → Θ(n^(lg_2 a)) = Θ(n^(3−ε)) ≈ Θ(n^2.81)

With n = 2^k:
t_k = 7t_{k−1} + c·(2^k)² = 7t_{k−1} + c·4^k
t_k − 7t_{k−1} = c·4^k, so (x − 7)(x − 4) = 0
t_k = c1·7^k + c2·4^k
t_k = c1·2^(k(3−ε)) + c2·2^(2k)        (since 7 = 2^(lg 7) = 2^(3−ε))
t_n = c1·n^(3−ε) + c2·n² = Θ(n^2.81)

6
Strassen’s algorithm:
Matrix multiplication:
Cn*n = An*n * Bn*n

Each entry of C is a row·column dot product that requires n multiplications.
n² = n·n = the number of multiplications needed for one whole row of C (n entries, n multiplications each).
n³ = n·n² = the number of multiplications for all n rows:
[n² + n² + …… + n²], n times = n³
Or: we have n·n results, where each element requires n multiplications.

7
Matrix multiplication

→ Θ(n³).
8
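For reference, a minimal Python sketch of the standard triple-loop multiplication behind the Θ(n³) bound (plain lists, no libraries; the name matmul_naive is illustrative):

def matmul_naive(A, B):
    """Standard matrix multiplication: n*n entries, n multiplications each -> Theta(n^3)."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

print(matmul_naive([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]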
A simple divide and conquer algorithm
C = A·B, n = 2^i. In each divide step, we divide the n×n matrices into four n/2 × n/2 matrices:

A = [ A11  A12 ]      B = [ B11  B12 ]      C = [ C11  C12 ]
    [ A21  A22 ]          [ B21  B22 ]          [ C21  C22 ]

C11 = A11·B11 + A12·B21      C12 = A11·B12 + A12·B22
C21 = A21·B11 + A22·B21      C22 = A21·B12 + A22·B22

Suppose each Aij (e.g. A11) is an n/2 × n/2 submatrix.

9
[Slides 10-11: worked 2×2 example with A = [a b; c d] and B = [e g; f h] (figures)]
T(n)
[figure: deriving the recurrence T(n) for the simple divide-and-conquer algorithm]

12
T(n)
Suppose each Aij is an n/2 × n/2 submatrix.
T(n) = 8T(n/2) + c·n²
a = 8, b = 2, i = 2; a > b^i → Θ(n^(lg_2 a)) = Θ(n³)

With n = 2^k:
t_k = 8t_{k−1} + c·(2^k)² = 8t_{k−1} + c·4^k
t_k − 8t_{k−1} = c·4^k, so (x − 8)(x − 4) = 0
t_k = c1·8^k + c2·4^k
t_k = c1·2^(3k) + c2·2^(2k)
t_n = c1·n³ + c2·n² = Θ(n³)

13
SQR – MM
SQR-MM(A, B)
{
  n = A.rows; let C be a new n×n matrix
  if n == 1
    c11 = a11 · b11
  else {
    partition A, B, C as in equation 4.9 / page 76
    C11 = SQR-MM(A11, B11) + SQR-MM(A12, B21)
    C12 = SQR-MM(A11, B12) + SQR-MM(A12, B22)    // as addressed previously
    C21 = SQR-MM(A21, B11) + SQR-MM(A22, B21)
    C22 = SQR-MM(A21, B12) + SQR-MM(A22, B22)
  }
  return C;
}
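A hedged Python sketch of the same idea as SQR-MM, assuming n is a power of two; the helper names (split, add, join) are illustrative and not from the textbook:

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(len(X))] for i in range(len(X))]

def split(M):
    """Split an n x n matrix (n even) into four n/2 x n/2 blocks."""
    n = len(M) // 2
    return ([row[:n] for row in M[:n]], [row[n:] for row in M[:n]],
            [row[:n] for row in M[n:]], [row[n:] for row in M[n:]])

def join(C11, C12, C21, C22):
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot

def sqr_mm(A, B):
    """Divide-and-conquer multiply: 8 recursive products -> T(n) = 8T(n/2) + Theta(n^2) = Theta(n^3)."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    C11 = add(sqr_mm(A11, B11), sqr_mm(A12, B21))
    C12 = add(sqr_mm(A11, B12), sqr_mm(A12, B22))
    C21 = add(sqr_mm(A21, B11), sqr_mm(A22, B21))
    C22 = add(sqr_mm(A21, B12), sqr_mm(A22, B22))
    return join(C11, C12, C21, C22)

print(sqr_mm([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]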

14
Strassen's matrix multiplication
(method A)
P= (A11+A22).(B11+B22).
Q= (A21+A22).B11
R= A11.(B12-B22).
S= A22.(B21-B11).
T= (A11+A12).B22.
U= (A21- A11).(B11+B12).
V= (A12- A22).(B21+B22).

C11 = P+S-T+V.
C12=R+T
C21=Q+S
C22=P+R-Q+U
T(n) = 7T(n/2) + d·n²
= Θ(n^(lg 7)) ≈ Θ(n^2.81)
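A small Python check of the seven products P…V and the four C formulas above, on a 2×2 numeric example (a verification sketch, not an optimized implementation; the function name is illustrative):

def strassen_2x2(A, B):
    """Strassen's method A on 2x2 matrices: 7 multiplications instead of 8."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    P = (a11 + a22) * (b11 + b22)
    Q = (a21 + a22) * b11
    R = a11 * (b12 - b22)
    S = a22 * (b21 - b11)
    T = (a11 + a12) * b22
    U = (a21 - a11) * (b11 + b12)
    V = (a12 - a22) * (b21 + b22)
    return [[P + S - T + V, R + T],
            [Q + S,         P + R - Q + U]]

# agrees with the standard product
print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]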

15
Based on algorithm/ page 79:
Strassen's matrix multiplication
(method B)

Divide A, B, C into n/2 × n/2 submatrices:

A = [ A11  A12 ]      B = [ B11  B12 ]      C = [ C11  C12 ]
    [ A21  A22 ]          [ B21  B22 ]          [ C21  C22 ]
• Create 10 matrices s1, s2 ,…..s10 each is n/2 * n/2
• Compute 7 matrices p1, p2, …,p7 each pi = n/2 * n/2.
• Compute C11, C21, C12, C22.

16
10 matrices

Since we must add or subtract n/2 × n/2 matrices 10 times, this step takes Θ(n²).
17
Recursive p1 to p7
In step 3, recursively multiply n/2 * n/2 matrices seven times

Compute Cij:
C11 = P5 + P4 − P2 + P6    … (C1)
C12 = P1 + P2              … (C2)
C21 = P3 + P4              … (C3)
C22 = P5 + P1 − P3 − P7    … (C4)
= Θ(n^(lg 7)) ≈ Θ(n^2.81)

18
Quiz:
solve the recurrence T(n) = 7T(n/2) + d n2 .

19
Trees (revisited)
• Some tree properties: full binary tree
• In a full tree, every node must have either zero or two children.
• In a complete tree, every level is completely filled except possibly the last, and the nodes in the last level are filled from left to right (so the last internal node may have one or two children).

20
Trees (revisited)
Complete binary tree: the tree contains no gaps; every level is completely filled, left to right, except possibly the last level.

21
For full binary tree:

height   2   3   4
nodes    3   7  15

The height of the tree is the number of vertices on the path from the root to the deepest node; a full binary tree of height h has 2^h − 1 nodes.
22
Sorting in linear time (Θ(n) worst case)
Any comparison sort must make Ω(n lg n) comparisons.

Sorting algorithm   Worst case    Average       Best case
Insertion           Θ(n²)         Θ(n²)         Θ(n)
Quick               Θ(n²)         Θ(n lg n)     Θ(n lg n)
Merge               Θ(n lg n)     Θ(n lg n)     Θ(n lg n)
23
Sorting algorithm:

• Comparison based: sorting → Ω(n lgn)


• Non-comparison based: sorting that uses additional structure/assumptions about the data.

24
Decision Tree Model:
e.g. a1, a2, a3: comparison based.
# of possible outputs = n! = 3! (the possible orderings of 3 elements):
a1, a2, a3
a1, a3, a2
a3, a1, a2
a2, a1, a3
a2, a3, a1
a3, a2, a1
25
Decision tree model (n lg n)
Number of leaves: min = 6 = 3!, max = 8 = 2³ = 2^h

26
lower bound for the worst case , page 193
Theorem: 8.1
Any comparison sort algorithm requires Ω(n lg n) comparisons in the worst case. Each of the n! permutations of the input appears as some leaf, so n! ≤ l; since a binary tree of height h has no more than 2^h leaves,
n! ≤ l ≤ 2^h
h ≥ lg(n!) = n lg n − n lg e + O(lg n) = Ω(n lg n)

27
Theorem: 8.1
# of comparisons = lg n! → # of leaves at the 3rd level = 2³ = 8
By Stirling's approximation:
n! ≈ √(2πn) · (n/e)^n · (1 + Θ(1/n))
h ≥ lg n!
h ≈ lg(√(2πn) · (n/e)^n)
h ≥ c·n lg n, so h = Ω(n lg n)

28
Counting sort/ page 194:
Assume that each of the n inputs is an integer in the range 0 to k, for some integer k.
The sort runs in Θ(n + k), i.e. Θ(n) when k = O(n).
A: input    B: sorted output    C: temporary count array
CountingSort(A, B, k)
{
  let C[0…k] be a new array
  for i = 0 to k
    C[i] = 0
  for j = 1 to A.length
    C[A[j]] = C[A[j]] + 1        // C[i] now holds the number of elements equal to i
  for i = 1 to k
    C[i] = C[i] + C[i−1]         // C[i] now holds the number of elements ≤ i
  for j = A.length down to 1
    B[C[A[j]]] = A[j]            // place A[j] at its final position
    C[A[j]] = C[A[j]] − 1
}

Total work ≈ 5n + 4k → Θ(n + k)
29
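A minimal Python version of the counting-sort pseudocode above (0-indexed instead of the slide's 1-indexing; the function name is illustrative):

def counting_sort(A, k):
    """Stable counting sort of integers in 0..k, Theta(n + k) time and space."""
    n = len(A)
    C = [0] * (k + 1)
    B = [0] * n
    for a in A:                 # count occurrences of each value
        C[a] += 1
    for i in range(1, k + 1):   # prefix sums: C[i] = number of elements <= i
        C[i] += C[i - 1]
    for a in reversed(A):       # place elements right-to-left to keep stability
        C[a] -= 1
        B[C[a]] = a
    return B

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]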
Counting sort
• It is stable: elements with equal values appear in the output in the same order as in the input.
• T(n, k) = Θ(n + k), where n is the number of elements in the array and k is the range of the values.
• Assumption: k ≤ c·n
• Θ(n + c·n) = Θ(n)

30
Counting Sort

A:  4  1  3  5  1            (input)

C:  0  2  0  1  1  1         (counts for values 0…5)

B:  1  1  3  4  5            (sorted output)

for i=0 to k
C[i] = 0,
for j=1 to A.length
C[A[ j ]] = C[A[ j ]] +1
for i=1 to k
C[ i ] = C[ i ] + C[ i-1 ]
for j = A.length down to 1
B[C[A[ j ]]] = A[ j ]
C[A[ j ]] = C[A[ j ]] -1
31
Counting sort
index:  1  2  3  4  5  6  7  8
• A:    2  5  3  0  2  3  0  3

value:  0  1  2  3  4  5
• C:    2  0  2  3  0  1          (counts)
  C:    2  2  4  7  7  8          (after prefix sums)

index:  1  2  3  4  5  6  7  8
• B:    0  0  2  2  3  3  3  5

32
Counting sort, cont…..
• B[C[A[j]]] = A[j]
• C[A[j]] = C[A[j]] − 1
• For j = 8: A[8] = 3
• C[3] = 7, so B[7] = 3
• Then C[3] = C[3] − 1 = 6

33
Lower bound
• Comparison sort: Ω(n lgn)
• Linear: Ɵ(n)
• Do not use counting sort for large ranges
• Stable.

34
Radix sort: linear sort

• Radix sort sorts the keys digit by digit, starting from the least significant digit, using a stable sort on each digit, and then combines the piles back into a single deck after each pass.

35
Radix

input    pass 1 (ones)   pass 2 (tens)   pass 3 (hundreds)
253      721             721             189
942      942             324             253
189      253             942             324
565      324             253             565
721      565             565             678
324      678             678             721
678      189             189             942

36
Radix sort

37
RadixSort(A, n, d)        // d: number of digits
{
  for i = 1 to d
    use a stable counting sort to sort A on digit i
}
→ Θ(d·n) = Θ(n) for constant d
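A hedged Python sketch of LSD radix sort using a stable counting sort on each decimal digit, matching the three-pass example above (names are illustrative):

def counting_sort_by_digit(A, exp):
    """Stable counting sort of non-negative integers by the decimal digit at 10^exp."""
    n, base = len(A), 10
    C = [0] * base
    B = [0] * n
    for a in A:
        C[(a // exp) % base] += 1
    for i in range(1, base):
        C[i] += C[i - 1]
    for a in reversed(A):
        d = (a // exp) % base
        C[d] -= 1
        B[C[d]] = a
    return B

def radix_sort(A, d):
    """d passes of a stable sort, least significant digit first: Theta(d * (n + 10))."""
    exp = 1
    for _ in range(d):
        A = counting_sort_by_digit(A, exp)
        exp *= 10
    return A

print(radix_sort([253, 942, 189, 565, 721, 324, 678], 3))
# [189, 253, 324, 565, 678, 721, 942]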

38
Prove/disprove:
Is n³ + 3n² = Θ(n²)?

We would need constants c1, c2 > 0 such that
c1·n² ≤ n³ + 3n² ≤ c2·n² for all sufficiently large n,
i.e. c1 ≤ n + 3 ≤ c2 as n → ∞.
Choosing constants at one point (n = 5 gives c1 ≤ 8 ≤ c2, e.g. c1 = 7, c2 = 9, so 7n² ≤ n³ + 3n² ≤ 9n²) fails for larger n, because n + 3 grows without bound: n³ eventually dominates any constant multiple of n². So the claim is false.

39
Recursive insertion sort
Rec. insertion sort
T(n) = T(n−1) + Θ(n)        (sort the first n−1 elements, then insert the nth)
T(n) = T(n−1) + n ……….. t_n − t_{n−1} = n
T(n) = Θ(n²)
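A minimal Python sketch of the recursive formulation, matching T(n) = T(n−1) + Θ(n) (the function name is illustrative):

def rec_insertion_sort(a, n=None):
    """Sort a[0..n-1] in place: recursively sort the first n-1 elements, then insert a[n-1]."""
    if n is None:
        n = len(a)
    if n <= 1:                      # base case
        return a
    rec_insertion_sort(a, n - 1)    # T(n-1)
    key, i = a[n - 1], n - 2
    while i >= 0 and a[i] > key:    # insertion step: up to n-1 comparisons -> Theta(n)
        a[i + 1] = a[i]
        i -= 1
    a[i + 1] = key
    return a

print(rec_insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]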

Can insertion sort work as a divide-and-conquer algorithm?


No, insertion sort does not use the Divide and Conquer approach.
Divide and Conquer is an algorithmic paradigm that involves
breaking a problem into sub-problems, solving them
independently, and then combining their solutions to solve the
original problem

40
Question: Is linear search divide and conquer?
Recursive linear search:
T(n) = T(n−1) + 1
t_n − t_{n−1} = 1

LinearSearch(a, n, x)    // iterative
{
  for i = 1 to n
    if (a[i] == x)
      return i;
  return −1;
}   // O(n)

41
Recursive: linear search(a, n, x)
R_Search(a, n, x)    // recursive
{
  if (n == 0)
    return −1;              // base case: not found
  if (a[n−1] == x)
    return n−1;
  else
    return R_Search(a, n−1, x);
}
T(n) = 1 + T(n−1)
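A Python sketch of the recursive search above, with an explicit base case for the not-found situation (0-indexed; names are illustrative):

def r_search(a, n, x):
    """Return the index of x in a[0..n-1], or -1 if absent. T(n) = T(n-1) + 1 = Theta(n)."""
    if n == 0:
        return -1                    # exhausted the array: not found
    if a[n - 1] == x:
        return n - 1
    return r_search(a, n - 1, x)     # shrink the problem by one element

a = [7, 3, 9, 4]
print(r_search(a, len(a), 9))   # 2
print(r_search(a, len(a), 5))   # -1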

42
Greedy
Greedy design Techniques (for optimization problems)
A greedy algorithm is an algorithmic strategy that makes the locally optimal choice at each stage, with the goal that this eventually leads to a globally optimal solution.
This means that the algorithm picks the best choice available at the moment without regard for consequences: it picks the best immediate output but does not consider the big picture, hence it is called greedy.

43
Different Types of Greedy Algorithm
• Selection Sort.
• Knapsack Problem.
• Minimum Spanning Tree.
• Single-Source Shortest Path Problem.
• Job Scheduling Problem.
• Prim's Minimal Spanning Tree Algorithm.
• Kruskal's Minimal Spanning Tree Algorithm.
• Dijkstra's Shortest Path Algorithm
• etc
44
Idea:
Make each choice in a locally optimal manner: we make a sequence of choices, each of which looks best at the moment, hoping to arrive at an optimal solution.
• Objective function: max profit/ min cost
constraints
• feasible solution: solutions that comply with
constraints.
• Optimal solution: one of the solutions that
maximize the objective function.
• Greedy: improves the objective function as
much as possible at each stage.
• Greedy does not always lead to an optimal solution.
45
Is 0/1 knapsack problem greedy?
• The fractional knapsack problem is best solved by a greedy approach. The 0/1 problem is best solved by dynamic programming. For the 0/1 problem, for example, we look for the most valuable load whose total weight is at most W kg.

• A simple (brute-force) solution is to consider all subsets of items and calculate the total weight and profit of each subset.

• The problem is to maximize the sum of the values of the items in the knapsack so that the sum of the weights is less than or equal to the knapsack's capacity.

46
Fractional knapsack problem
Knapsack (fractional knapsack problem) <FRKP>
Given n objects and a knapsack; wi is the weight of object i.
M: the knapsack capacity.
If a fraction xi of object i (0 ≤ xi ≤ 1) is placed into the knapsack, a profit pi·xi is earned.
Objective: obtain a filling of the knapsack that maximizes the total profit, with total weight at most M.

So, the problem is:
maximize    p = Σ_{i=1..n} pi·xi          …… (1)   (profit)
subject to  Σ_{i=1..n} wi·xi ≤ M          …… (2)
where 0 ≤ xi ≤ 1,  1 ≤ i ≤ n              …… (3)

47
Example
N = 3, M = 20,
p1 = 25, p2 = 24, p3 = 15
w1 = 18, w2 = 15, w3 = 10
Find the maximum profit. Some feasible solutions (x1, x2, x3):

       x1     x2     x3     Σ wi·xi   Σ pi·xi
i      1/2    1/3    1/4    16.5      24.25
ii     1      2/15   0      20        28.2
iii    0      2/3    1      20        31
iv     0      1      1/2    20        31.5

Sort the objects in non-increasing order of pi/wi; the greedy solution is the last row of the table, which is optimal.
Objects in order [x2, x3, x1] → fractions [1, 1/2, 0], i.e. (x1, x2, x3) = (0, 1, 1/2).
48
Example
Given 3 objects
p1 = 30, p2 = 20, p3 = 2,
w1 = 10, w2 = 5, w3 = 1, M=10,
Find p with given constraints.
Feasible solutions:
(1,0,0): p = Σ_{i=1..n} pi·xi = 1·30 + 0·20 + 0·2 → p = 30
(0,1,0) → p = 20
(0,0,1) → p = 2

49
Combinations: n!
All possible orderings (3! = 6) and the resulting greedy (feasible) fills, fractions listed in pick order:

order    1,2,3     1,3,2     2,1,3        2,3,1         3,1,2        3,2,1
fill     1, 0, 0   1, 0, 0   1, 5/10, 0   1, 1, 4/10    1, 9/10, 0   1, 1, 4/10
profit   30        30        35           34            29           34

Given 3 objects:
p1 = 30, p2 = 20, p3 = 2
w1 = 10, w2 = 5, w3 = 1, M = 10

50
density
Greedy: p1 = 30, p2 = 20, p3 = 2
w1 = 10, w2 = 5, w3 = 1

w1 = 10, p1 = 30 → 1 unit of weight = 3 profit
w2 = 5,  p2 = 20 → 1 unit of weight = 4 profit
So compute density = pi/wi = profit per unit of weight.

51
Cont…
object   pi   wi   density (pi/wi)
 1       30   10   3
 2       20    5   4
 3        2    1   2

Take objects in non-increasing order of density:
object   density
 2       4
 1       3
 3       2

52
Cont…
Vector (x1, x2, x3) = (5/10, 1, 0)
Σ_{i=1..n} wi·xi = 5 + 5 = 10
Σ_{i=1..n} pi·xi = (5/10)·30 + 1·20 + 0·2 = 35

Knapsack: Θ(n·W) (the 0/1 version, solved by dynamic programming)
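A hedged Python sketch of the greedy fractional-knapsack procedure just described: sort by density pi/wi, then fill greedily (the function name is illustrative):

def fractional_knapsack(p, w, M):
    """Greedy by density p[i]/w[i]; returns (total profit, fractions x).
    Theta(n lg n) for the sort plus Theta(n) for the fill."""
    n = len(p)
    order = sorted(range(n), key=lambda i: p[i] / w[i], reverse=True)
    x = [0.0] * n
    profit, remaining = 0.0, M
    for i in order:
        if remaining <= 0:
            break
        take = min(1.0, remaining / w[i])   # full object if it fits, otherwise a fraction
        x[i] = take
        profit += take * p[i]
        remaining -= take * w[i]
    return profit, x

# the slide's example: p = (30, 20, 2), w = (10, 5, 1), M = 10
print(fractional_knapsack([30, 20, 2], [10, 5, 1], 10))  # (35.0, [0.5, 1.0, 0.0])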

53
Other greedy algorithms:

Counting coins, Huffman, MST, Dijkstra's, Prim's, Kruskal's.

Dijkstra's: finding shortest paths.
Kruskal's: finding a minimum-cost spanning tree (MCST).
Prim's: finding a minimum-cost spanning tree (MCST).

54
GRAPHS
Well-known graph traversals: DFS, BFS.
[Some graph properties]
Articulation point: a vertex v whose deletion (together with its incident edges) produces a graph G' that has at least two connected components.
Biconnected graph: a connected graph with no articulation points.
Biconnected component: a maximal biconnected subgraph.
An Euler tour of a strongly connected, directed graph G is a cycle that traverses each edge of G exactly once, although it may visit a vertex more than once.

55
Representation
• Adjacency Matrix
• Adjacency List

56
Cut points?

57
Bi-connected

58
Spanning Trees
Spanning tree of a graph G: a free tree that connects all vertices of G.
So it is a subgraph of G in which all the vertices are connected using the minimum possible number of edges.

MST of G(V, E): the spanning tree with minimum (least) cost, i.e. it connects all nodes at minimal total cost; edges are weighted.

A minimum spanning tree (MST) is a subset of the edges of a connected, edge-weighted graph that connects all the vertices together, without any cycles, and with the minimum possible total edge weight.

V = {a, b, c, d, e}; E is the set of connections between nodes, e.g. {<a,e>, <a,b>, …}.

No cycles.
Edges = |V| − 1.
Might not be unique.
The input graph is undirected and weighted, not necessarily complete.

A spanning tree is a tree in which all vertices are included.

59
Spanning Trees
• Complete graph: edge between every two vertices.
• Spanning tree such that each vertex has one
parent

[figures: two spanning trees of the same weighted graph, with total costs 30 and 23, etc.]

Number of edges in MST=n-1

60
A minimum spanning tree (MST)
Examples
• Kruskal's algorithm and Prim's algorithm for finding
minimum spanning trees
• Finding shortest path
• Huffman coding

61
Prim's and Kruskal's
Prim's and Kruskal's algorithms are greedy
algorithms used for finding the minimum
spanning tree of a given weighted graph.

Prim's algorithm grows the tree by adding vertices one at a time, while Kruskal's algorithm builds the tree by adding edges in increasing order of weight; both compute a minimum spanning tree.

62
Kruskal’s algorithm
• Sort the edges of G in ascending order of cost …. Θ(E lg E)
• Let T = { }. …. c
• While T has fewer than n − 1 edges: ………. (|V|)
  - Let e denote the next edge of G in order of cost. (|V| − 1)
  - If T ∪ {e} does not create a cycle, then T = T ∪ {e}. → cycle test: up to |V|² naively, or about V lg V with union-find
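A hedged Python sketch of Kruskal's algorithm, using a union-find structure for the cycle test and the edge list of the example on the next slide (the representation and names are illustrative):

def kruskal(n, edges):
    """edges: list of (weight, u, v) with vertices 0..n-1.
    Returns (total cost, chosen edges). Sorting dominates: Theta(E lg E)."""
    parent = list(range(n))

    def find(v):                         # union-find with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, cost = [], 0
    for w, u, v in sorted(edges):        # ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                     # adding the edge does not create a cycle
            parent[ru] = rv
            mst.append((u, v, w))
            cost += w
            if len(mst) == n - 1:
                break
    return cost, mst

edges = [(10, 0, 5), (12, 2, 3), (14, 1, 6), (16, 1, 2),
         (18, 3, 6), (22, 3, 4), (24, 4, 6), (25, 4, 5), (28, 0, 1)]
print(kruskal(7, edges))
# cost 99 = 10+12+14+16+22+25; same edges added/discarded as in the trace on the next slide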

63
Example
edge    weight   action/result
(0,5)   10       initial (add)
(2,3)   12       add
(1,6)   14       add
(1,2)   16       add
(3,6)   18       discard
(3,4)   22       add
(4,6)   24       discard
(4,5)   25       add
(0,1)   28       not considered

T(V, E) = c1·E lg E + c2 + c3·|V| + c4·|V − 1| + c5·|V|²
        = Θ(max(E lg E, V²))

64
Dense and sparse graph/tree
• In a dense graph, T(V, E) = Θ(max(E lg E, V²)).
• In a sparse graph, E ≈ V, so T = Θ(max(V lg V, V²)).
• Most commonly used bound: E lg E.

• Sparse: |E| = O(|V|)
• Dense: |E| = O(|V|²)

65
Prim
1. Select a starting vertex.
2. Repeat:
   1. Select an edge e of minimum weight connecting a tree vertex to the nearest non-tree vertex. (V lg V)
   2. Add the selected edge and the new vertex to the minimum spanning tree T. (E lg V)
3. Exit when all vertices have been added.

• O(V lg V + E lg V) = O(E lg V)
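A hedged Python sketch of Prim's algorithm with a binary heap, run on the same example graph as the Kruskal sketch above (adjacency-list input; names are illustrative):

import heapq

def prim(adj, start=0):
    """adj: dict vertex -> list of (weight, neighbor). Returns (cost, MST edges).
    Each edge enters/leaves the heap at most once -> O(E lg V)."""
    visited = {start}
    heap = [(w, start, v) for w, v in adj[start]]
    heapq.heapify(heap)
    cost, mst = 0, []
    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)        # cheapest edge leaving the tree
        if v in visited:
            continue
        visited.add(v)
        cost += w
        mst.append((u, v, w))
        for w2, x in adj[v]:                 # new candidate edges
            if x not in visited:
                heapq.heappush(heap, (w2, v, x))
    return cost, mst

adj = {0: [(10, 5), (28, 1)], 1: [(28, 0), (14, 6), (16, 2)],
       2: [(16, 1), (12, 3)], 3: [(12, 2), (18, 6), (22, 4)],
       4: [(22, 3), (24, 6), (25, 5)], 5: [(10, 0), (25, 4)],
       6: [(14, 1), (18, 3), (24, 4)]}
print(prim(adj))   # also finds total cost 99 on this graph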

66
prim

67
Other algorithms
• As you have learned in Data Structures
– DFS
– BFS

To be discussed in class briefly


DFS
Stack manner (select the next vertex alphabetically)

69
DFS (weight order)
Weight-order
BFS
• Queue manner (select the next vertex alphabetically)

71
bfs
Shortest paths, cont.
• SSSD: single source, single destination
• SSMD: single source, multiple destinations
• MSSD: multiple sources, single destination
• MSMD: multiple sources, multiple destinations (all-pairs shortest paths, Floyd's algorithm)
Dijkstra's
Complexity: O(|E| lg |V| + |V|)

Algorithm
1. The very first step is to mark all nodes as unvisited,
2. Mark the picked starting node with a current distance of 0 and the rest nodes with
infinity,
3. Now, fix the starting node as the current node,
4. For the current node, analyze all of its unvisited neighbors and measure their
distances by adding the current distance of the current node to the weight of the
edge that connects the neighbor node and current node,
5. Compare the newly measured distance with the current distance assigned to the neighboring node; if the new distance is smaller, make it the new current distance of the neighboring node,
6. After considering all of the unvisited neighbors of the current node, mark the current node as visited,
7. If the destination node has been marked visited then stop, an algorithm has ended,
8. Else, choose the unvisited node that is marked with the least distance, fix it as the
new current node, and repeat the process again from step 4.
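A hedged Python sketch of Dijkstra's algorithm following the steps above, using a priority queue; the small example graph is made up for illustration (non-negative weights assumed; names are illustrative):

import heapq

def dijkstra(adj, source):
    """adj: dict vertex -> list of (weight, neighbor). Returns dict of shortest distances.
    With a binary heap this runs in O(|E| lg |V|)."""
    dist = {v: float('inf') for v in adj}   # steps 1-2: every distance is infinity ...
    dist[source] = 0                        # ... except the source, which is 0
    heap = [(0, source)]
    visited = set()
    while heap:
        d, u = heapq.heappop(heap)          # unvisited node with least current distance
        if u in visited:
            continue
        visited.add(u)                      # step 6: mark the current node visited
        for w, v in adj[u]:                 # steps 4-5: relax all unvisited neighbors
            if v not in visited and d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

adj = {'A': [(4, 'B'), (2, 'C')], 'B': [(5, 'D')],
       'C': [(1, 'B'), (8, 'D')], 'D': []}
print(dijkstra(adj, 'A'))   # {'A': 0, 'B': 3, 'C': 2, 'D': 8}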
[figure: example run from source A over vertices A through H]

75
Shortest path
• next

76
