DAA Unit 1
• Correctness:
An algorithm must produce the correct result.
Approximation algorithms: when an exact solution cannot be found, a near-optimal solution may be found instead (applied to optimization problems).
• Less resource usage:
An algorithm should use as few resources (time and space) as possible.
To analyse an algorithm
• Code and execute it, then measure the actual running time.
• What does the total time depend upon?
§ The algorithm itself
§ The number of inputs
§ The number of primitive operations: assignments, function calls, control transfers, arithmetic, etc.
• The solution to all of these issues is asymptotic analysis of algorithms.
Analyzing pseudo-code (by counting)
1. For each line of pseudo-code, count the number of primitive operations in it.
Pay attention to the word "primitive" here; sorting an array is not a primitive operation.
2. Multiply this count by the number of times the line is executed.
3. Sum up over all lines.
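As a concrete illustration of steps 1-3, the following Python sketch tallies primitive operations for a simple summation loop. The exact counts are an assumption, since they depend on what one chooses to call "primitive":

```python
def sum_array(A):
    """Sum an array while tallying primitive operations (illustrative counts)."""
    ops = 0
    total = 0          # 1 assignment
    ops += 1
    for x in A:        # 1 loop test per iteration
        ops += 1
        total += x     # 1 addition + 1 assignment per iteration
        ops += 2
    ops += 1           # final failed loop test
    return total, ops

total, ops = sum_array([1, 2, 3, 4])
# For n = 4: 1 + 4*(1 + 2) + 1 = 14 operations
```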
Analysis of Algorithms 6
Proving Loop Invariants
• Proving a loop invariant works like induction.
• Initialization (base case):
– The invariant is true prior to the first iteration of the loop.
• Maintenance (inductive step):
– If it is true before an iteration of the loop, it remains true before the next iteration.
• Termination:
– When the loop terminates, the invariant gives us a useful property that helps show that the algorithm is correct.
– The induction stops when the loop terminates.
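The three properties can be checked mechanically for insertion sort. The sketch below uses Python's 0-based indexing (an assumption; the pseudo-code later in these notes is 1-based), so the invariant reads "A[0..j-1] is sorted":

```python
def insertion_sort_checked(A):
    """Insertion sort that asserts the loop invariant:
    before each iteration j, the prefix A[0..j-1] is sorted."""
    for j in range(1, len(A)):
        # Initialization/maintenance: the prefix is already sorted
        assert all(A[k] <= A[k + 1] for k in range(j - 1)), "invariant violated"
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    # Termination: j has run past the end, so the whole array is sorted
    assert all(A[k] <= A[k + 1] for k in range(len(A) - 1))
    return A

insertion_sort_checked([5, 2, 4, 6, 1, 3])  # → [1, 2, 3, 4, 5, 6]
```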
Analysis of Insertion Sort

INSERTION-SORT(A)                                      cost   times
for j ← 2 to n                                         c1     n
    key ← A[j]                                         c2     n-1
    ▷ Insert A[j] into the sorted sequence A[1..j-1]   0      n-1
    i ← j-1                                            c4     n-1
    while i > 0 and A[i] > key                         c5     Σ_{j=2..n} t_j
        A[i+1] ← A[i]                                  c6     Σ_{j=2..n} (t_j - 1)
        i ← i-1                                        c7     Σ_{j=2..n} (t_j - 1)
    A[i+1] ← key                                       c8     n-1

t_j: number of times the while-loop test is executed at iteration j

T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·Σ_{j=2..n} t_j + c6·Σ_{j=2..n} (t_j - 1) + c7·Σ_{j=2..n} (t_j - 1) + c8·(n-1)
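A runnable Python version (0-based indexing, an adaptation of the 1-based pseudo-code above) that records t_j, the number of while-test executions per iteration, confirms the best- and worst-case patterns analysed next:

```python
def insertion_sort_counts(A):
    """Sort A in place and return the list of t_j values,
    one per outer-loop iteration (j = 1 .. len(A)-1, 0-based)."""
    t = []
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        tests = 0
        while True:
            tests += 1                 # one execution of the while test
            if i >= 0 and A[i] > key:
                A[i + 1] = A[i]        # shift larger element right
                i -= 1
            else:
                break
        t.append(tests)
        A[i + 1] = key
    return t

# Already-sorted input: every t_j is 1 (best case)
print(insertion_sort_counts([1, 2, 3, 4]))   # [1, 1, 1]
# Reverse-sorted input: t_j = j in 1-based terms (worst case)
print(insertion_sort_counts([4, 3, 2, 1]))   # [2, 3, 4]
```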
Best Case Analysis
• The array is already sorted ("while i > 0 and A[i] > key")
– A[i] ≤ key the first time the while-loop test is run (when i = j-1)
– t_j = 1
• T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n-1) + c8·(n-1)
       = (c1 + c2 + c4 + c5 + c8)·n - (c2 + c4 + c5 + c8)
       = an + b = Θ(n)
Worst Case Analysis
• The array is in reverse sorted order ("while i > 0 and A[i] > key")
– A[i] > key always holds in the while-loop test
– key must be compared with all elements to the left of the j-th position ⇒ compare with j-1 elements ⇒ t_j = j

Using Σ_{j=1..n} j = n(n+1)/2, we have Σ_{j=2..n} j = n(n+1)/2 - 1 and Σ_{j=2..n} (j-1) = n(n-1)/2, so:

T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n(n+1)/2 - 1) + c6·n(n-1)/2 + c7·n(n-1)/2 + c8·(n-1)
     = an² + bn + c, a quadratic function of n ⇒ T(n) = Θ(n²)
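The summation identities used above can be spot-checked numerically:

```python
def check_sums(n):
    """Verify sum_{j=2}^{n} j = n(n+1)/2 - 1 and
    sum_{j=2}^{n} (j-1) = n(n-1)/2 for a given n."""
    s1 = sum(j for j in range(2, n + 1))        # sum of j from 2 to n
    s2 = sum(j - 1 for j in range(2, n + 1))    # sum of (j-1) from 2 to n
    assert s1 == n * (n + 1) // 2 - 1
    assert s2 == n * (n - 1) // 2
    return s1, s2

print(check_sums(10))  # (54, 45)
```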
Comparisons and Exchanges in Insertion Sort

INSERTION-SORT(A)                                      cost   times
for j ← 2 to n                                         c1     n
    key ← A[j]                                         c2     n-1
    ▷ Insert A[j] into the sorted sequence A[1..j-1]   0      n-1
    i ← j-1                                            c4     n-1
    while i > 0 and A[i] > key                         c5     Σ_{j=2..n} t_j        ← comparisons ≈ n²/2
        A[i+1] ← A[i]                                  c6     Σ_{j=2..n} (t_j - 1)  ← exchanges ≈ n²/2
        i ← i-1                                        c7     Σ_{j=2..n} (t_j - 1)
    A[i+1] ← key                                       c8     n-1
Time complexity analysis: some general rules
Example: merge sort on n elements divides the array in half (Θ(1) work), recursively sorts the two halves (cost T(n/2) each), and merges the sorted halves (Θ(n) work):

T(n) = 2T(n/2) + Θ(n)

Its solution is T(n) = cn(lg n + 1)
                     = cn·lg n + cn
so T(n) is Θ(n lg n).
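A minimal merge-sort sketch in Python, annotated with the recurrence terms:

```python
def merge_sort(A):
    """Merge sort: T(n) = 2T(n/2) + Theta(n), so T(n) is Theta(n lg n)."""
    if len(A) <= 1:                     # Theta(1) base case
        return A
    mid = len(A) // 2                   # Theta(1) divide
    left = merge_sort(A[:mid])          # T(n/2)
    right = merge_sort(A[mid:])         # T(n/2)
    # Theta(n) merge of two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```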
Asymptotic Notation
a) Big-Oh
b) Omega
c) Theta
d) Little-Oh, and
e) Little-Omega Notation

Example (Theta): 2n² = Θ(n²)
• c1 = 1, c2 = 3 (or c1 = c2 = 2) and n0 = 1
O-notation
For a function g(n), we define O(g(n)), big-O of g of n, as the set:
O(g(n)) = {f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n)}
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
Comp 122
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that f(n) ≤ c·g(n) for n ≥ n0.
Example: 2n + 10 is O(n)
  2n + 10 ≤ cn
  (c - 2)·n ≥ 10
  n ≥ 10/(c - 2)
  Pick c = 3 and n0 = 10
More Big-Oh Examples
• 7n - 2 is O(n)
  need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c·n for n ≥ n0
  this is true for c = 7 and n0 = 1
• 3n³ + 20n² + 5 is O(n³)
  need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
  this is true for c = 4 and n0 = 21
• 3 log n + 5 is O(log n)
  need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c·log n for n ≥ n0
  this is true for c = 8 and n0 = 2
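The witness constants (c, n0) in these examples can be spot-checked numerically over a finite range. This is only evidence, not a proof, since big-O is a claim about all n ≥ n0; the helper name `holds` is an illustrative choice:

```python
import math

def holds(f, g, c, n0, upto=1000):
    """Check f(n) <= c*g(n) for every integer n in [n0, upto]."""
    return all(f(n) <= c * g(n) for n in range(n0, upto + 1))

print(holds(lambda n: 7 * n - 2, lambda n: n, 7, 1))                         # True
print(holds(lambda n: 3 * n**3 + 20 * n**2 + 5, lambda n: n**3, 4, 21))     # True
print(holds(lambda n: 3 * math.log2(n) + 5, lambda n: math.log2(n), 8, 2))  # True
```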
Big-Oh Rules (shortcuts)
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors
Use the smallest possible class of functions
– Say "2n is O(n)" instead of "2n is O(n²)"
Use the simplest expression of the class
– Say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)"
Examples
O(g(n)) = {f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n)}
• 2n² = O(n³), with c = 1 and n0 = 2
• 2n² = O(n²), with c = 2 and n0 = 1
Ω-notation
For a function g(n), we define Ω(g(n)), big-Omega of g of n, as the set:
Ω(g(n)) = {f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
Example
Ω(g(n)) = {f(n) : ∃ positive constants c and n0 such that ∀n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}

• Solution: T(n) = O(n)

T(n) = { 1 if n = 1; 2T(n/2) + n if n > 1 }
Solution: T(n) = O(n lg n)

T(n) = { 1 if n = 1; T(n/3) + T(2n/3) + n if n > 1 }
Solution: T(n) = O(n lg n)
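A quick numeric check of the second recurrence (a sketch, assuming n is a power of 2 so that n/2 is always an integer): its exact solution is T(n) = n·lg n + n, which is indeed O(n lg n).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """T(n) = 1 if n == 1, else 2*T(n/2) + n, for n a power of 2."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# For n = 2^k: T(n) = n*k + n = n*lg(n) + n
for k in range(1, 6):
    n = 2 ** k
    assert T(n) == n * k + n
print(T(8))  # 32, i.e. 8*3 + 8
```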
Solving recurrence equations

T(n) = 3T(n-2)

The first step is to iteratively substitute terms to arrive at a general form:
T(n-2) = 3T(n-2-2) = 3T(n-4)
T(n) = 3·3T(n-4) = 3²T(n-4)
Leading to the general form:
T(n) = 3^k · T(n - 2k)
The recurrence stops when n - 2k = 1, i.e. when we reach T(1); solving for k gives k = (n-1)/2.
Inserting that value into the general form:
T(n) = 3^((n-1)/2) · T(1)
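The closed form can be sanity-checked numerically for odd n. The base value T(1) = 1 is an assumption (it is not stated above); with it, T(n) = 3^((n-1)/2):

```python
def T(n):
    """T(n) = 3*T(n-2), with assumed base case T(1) = 1 (odd n only)."""
    return 1 if n == 1 else 3 * T(n - 2)

# Compare against the closed form derived by substitution
for n in range(1, 16, 2):
    assert T(n) == 3 ** ((n - 1) // 2)
print(T(7))  # 27, i.e. 3^3
```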
Recursion Tree Method to Solve Recurrence Relations
The recursion tree is another method for solving recurrence relations.
A recursion tree is a tree where each node represents the cost of a certain recursive sub-problem.
We sum up the values in all nodes to get the cost of the entire algorithm.

Steps in the Recursion Tree Method
Step 1:
Draw a recursion tree based on the given recurrence relation.
Step 2:
Determine:
– the cost of each level
– the total number of levels in the recursion tree
– the number of nodes in the last level
– the cost of the last level
Step 3:
Add the costs of all the levels of the recursion tree and simplify the resulting expression in terms of asymptotic notation.
The Recursion Tree
Draw the recursion tree for the recurrence relation and look for a pattern:

T(n) = { b if n < 2; 2T(n/2) + bn if n ≥ 2 }

depth   # of T's   size     time
0       1          n        bn
1       2          n/2      bn
i       2^i        n/2^i    bn
...     ...        ...      ...

Total time = bn + bn log n
Guess-and-Test Method, Part 1
For the recurrence T(n) = 2T(n/2) + bn log n, guess T(n) < cn log n:
T(n) = 2T(n/2) + bn log n
     = 2(c(n/2) log(n/2)) + bn log n
     = cn(log n - log 2) + bn log n
     = cn log n - cn + bn log n
This is not bounded by cn log n, so the guess fails; we need a better guess.
Divide-and-Conquer 53
Guess-and-Test Method, Part 2
Recall the recurrence equation:

T(n) = { b if n < 2; 2T(n/2) + bn log n if n ≥ 2 }

Guess #2: T(n) < cn log² n.
T(n) = 2T(n/2) + bn log n
     = 2(c(n/2) log²(n/2)) + bn log n
     = cn(log n - log 2)² + bn log n
     = cn log² n - 2cn log n + cn + bn log n
     ≤ cn log² n,  if c > b.
So, T(n) is O(n log² n).
In general, to use this method, you need to have a good guess and you need to be good at induction proofs.
Master Method
Many divide-and-conquer recurrence equations have the form:

T(n) = { c if n < d; aT(n/b) + f(n) if n ≥ d }

The Master Theorem:
1. If f(n) is O(n^(log_b a - ε)), then T(n) is Θ(n^(log_b a)).
2. If f(n) is Θ(n^(log_b a) · log^k n), then T(n) is Θ(n^(log_b a) · log^(k+1) n).
3. If f(n) is Ω(n^(log_b a + ε)), then T(n) is Θ(f(n)), provided a·f(n/b) ≤ δ·f(n) for some δ < 1.
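The three cases can be sketched in Python for the common special case f(n) = n^p. This is a simplification, not the full theorem (it takes k = 0 in case 2 and assumes the regularity condition, which holds for polynomial f); the function name `master` and the output strings are illustrative choices:

```python
import math

def master(a, b, p):
    """Classify T(n) = a*T(n/b) + n**p by the Master Theorem
    (polynomial f(n) = n^p only; a sketch, not the full theorem)."""
    crit = math.log(a, b)                    # critical exponent log_b a
    if abs(p - crit) < 1e-9:
        return f"Theta(n^{p:g} log n)"       # case 2 with k = 0
    if p < crit:
        return f"Theta(n^{crit:g})"          # case 1
    return f"Theta(n^{p:g})"                 # case 3 (regularity holds for n^p)

print(master(4, 2, 1))   # T(n) = 4T(n/2) + n    -> Theta(n^2)
print(master(9, 3, 3))   # T(n) = 9T(n/3) + n^3  -> Theta(n^3)
print(master(1, 2, 0))   # binary search         -> Theta(n^0 log n) = Theta(log n)
```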
Master Method, Example 1
Example:
T(n) = 4T(n/2) + n
Solution: log_b a = log_2 4 = 2, and f(n) = n is O(n^(2-ε)), so case 1 says T(n) is Θ(n²).
Master Method, Example 2
Example:
T(n) = 2T(n/2) + n log n
Solution: log_b a = log_2 2 = 1, and f(n) = n log n = Θ(n·log¹ n), so case 2 (with k = 1) says T(n) is Θ(n log² n).
Master Method, Example 3
Example:
T(n) = T(n/3) + n log n
Solution: log_b a = log_3 1 = 0, and f(n) = n log n is Ω(n^(0+ε)), so case 3 says T(n) is Θ(n log n).
Master Method, Example 4
Example:
T(n) = 8T(n/2) + n²
Solution: log_b a = log_2 8 = 3, and f(n) = n² is O(n^(3-ε)), so case 1 says T(n) is Θ(n³).
Master Method, Example 5
Example:
T(n) = 9T(n/3) + n³
Solution: log_b a = log_3 9 = 2, and f(n) = n³ is Ω(n^(2+ε)), so case 3 says T(n) is Θ(n³).
Master Method, Example 6
Example:
T(n) = T(n/2) + 1 (binary search)
Solution: log_b a = log_2 1 = 0, and f(n) = 1 = Θ(n⁰ · log⁰ n), so case 2 (with k = 0) says T(n) is Θ(log n).
Master Method, Example 7
Example:
Solve:
T(n) = 9T(n/3) + n.
Here a = 9, b = 3, f(n) = n, and n^(log_b a) = n^(log_3 9) = Θ(n²). Since f(n) = n = O(n^(log_3 9 - ε)) with ε = 1, case 1 of the Master Theorem applies.
Solution: T(n) = Θ(n²).
Changing variables
• Example: Consider the recurrence