DSA2 - Chap2 - Algorithm Analysis
Chapter 2
Algorithm Analysis
❖ Algorithm efficiency
- Machine-dependent vs machine-independent
❖ Function ordering
- Order of growth
- Weak order
- Landau symbols: Big-Oh, Big-Omega, Big-Theta, and
Little-oh.
n! ≅ (n/e)^n
➔ It will be impossible to run the algorithm for n = 30
Algorithm efficiency
Subroutine 2: V ← a, W ← b
While W > 1
    V ← V + a; W ← W − 1
Output V
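Subroutine 2 computes a × b by repeated addition. A direct transcription in C++ (the function name is ours, purely illustrative):

```cpp
#include <cassert>

// Direct transcription of Subroutine 2: computes a * b (for b >= 1)
// using b - 1 additions, so its cost grows linearly with b.
int multiplyByAddition(int a, int b) {
    int v = a, w = b;      // V = a, W = b
    while (w > 1) {        // While W > 1
        v = v + a;         // V <- V + a
        w = w - 1;         // W <- W - 1
    }
    return v;              // Output V
}
```

Note that the loop body runs b − 1 times, so the number of basic operations depends on the value of b.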
Algorithm efficiency
Ideally, we would like to implement all the candidate algorithms, run
them all on the machine we are going to use, and measure which is most
efficient!
Non-basic Operations:
Sorting, Searching
In fact, we will not worry about the exact values, but will
look at “broad classes” of values.
Let there be n inputs.
If an algorithm needs n basic operations and another
needs 2n basic operations, we will consider them to be
in the same efficiency category.
However, we do distinguish between exp(n), n, and log(n).
Function Ordering
Quadratic Growth
Consider the two functions
f(n) = n^2 and g(n) = n^2 − 3n + 2
Around n = 0, they look very different
Quadratic Growth
The absolute difference is large; for example,
f(1000) = 1 000 000
g(1000) = 997 002
but the relative difference is very small.
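A quick numeric check of these two functions (a small sketch; the one-letter function names are ours):

```cpp
#include <cassert>

// f(n) = n^2 versus g(n) = n^2 - 3n + 2: the absolute gap 3n - 2
// keeps growing, but the ratio g(n)/f(n) tends to 1.
long long f(long long n) { return n * n; }
long long g(long long n) { return n * n - 3 * n + 2; }
```

At n = 1000 the gap is 2998, which is under 0.3% of f(1000).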
Polynomial Growth
To demonstrate with another example,
f(n) = n^6 and g(n) = n^6 − 23n^5 + 193n^4 − 729n^3 + 1206n^2 − 648n
Polynomial Growth
○ If f(n) < g(n) for all large n, then you can never purchase a computer
fast enough so that the second function always runs in less time than
the first.
Definition 1
f(n) = O(g(n)) if there are a number n0 and
a nonnegative c such that
for all n ≥ n0 , f(n) ≤ cg(n).
Intuitively (not exactly), f(n) = O(g(n)) means f(n) ≤ g(n) for all n
beyond some value n0; i.e. g(n) is an upper bound for f(n).
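Definition 1 can be spot-checked numerically for a candidate witness pair (c, n0). The helper below is our own illustration; it tests the inequality on a finite range only, so it can refute a witness but never prove one:

```cpp
#include <cassert>
#include <cmath>
#include <functional>

// Checks f(n) <= c * g(n) for n = n0, n0+1, ..., nMax.
// A finite spot check of Definition 1, not a proof.
bool holdsOnRange(std::function<double(double)> f,
                  std::function<double(double)> g,
                  double c, double n0, double nMax) {
    for (double n = n0; n <= nMax; ++n)
        if (f(n) > c * g(n))
            return false;
    return true;
}
```

For example, (c, n0) = (2, 1) witnesses 2n = O(n), while any fixed c eventually fails for n^2 against n.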
Example Functions
lim n→∞ 2n/n = 2, so 2n is O(n)
lim n→∞ ln(n)/n = 0, so ln(n) is O(n)
Ω “Omega” Notation
Now a lower bound notation, Ω
Definition 2
f(n) = Ω(g(n)) if there are a number n0 and a nonnegative
c such that
for all n ≥ n0 , f(n) ≥ cg(n).
Limit test: f(n) = Ω(g(n)) if lim n→∞ f(n)/g(n) > 0, when the limit exists.
Definition 3
f(n) = Θ(g(n)) if and only if f(n) is O(g(n)) and f(n) is Ω(g(n)).
Equivalently, f(n) = Θ(g(n)) if there exist positive n0, c1, and c2 such that
c1 g(n) ≤ f(n) ≤ c2 g(n) whenever n ≥ n0.
Definition 4
f(n) = o(g(n)) if for every positive constant c there exists
an n0 such that
f(n) < cg(n) whenever n > n0.
Limit test: f(n) = o(g(n)) if lim n→∞ f(n)/g(n) = 0, when the limit exists.
Function orders “Landau Symbols”
Repeat as necessary…
Note: the kth derivative will always be shown as f^(k)(n).
Big-Θ as an Equivalence Relation
2. f(n) = Θ(f(n))
lim n→∞ 2n/n = 2, so 2n is Θ(n)
lim n→∞ ln(n)/n = 0, so ln(n) is o(n)
n is Θ(n + sqrt(n)), since
lim n→∞ n/(sqrt(n) + n) = 1
Algorithms Analysis
Rule 1
If T1(n) = O(f(n)) and T2(n) = O(g(n)), then
T1(n) + T2(n) = O(max(f(n), g(n))) and
T1(n) · T2(n) = O(f(n) · g(n)).
Rule 2
If T(n) is a polynomial of degree k, then T(n) = Θ(n^k).
Rule 3
• log^k n = O(n) for any constant k.
This tells us that logarithms grow very slowly.
Rules for arithmetic with big-O symbols
Rule 4
If f(n) = O(g(n)), then
c * f(n) = O(g(n)) for any constant c.
Rule 5
If f1(n) = O(g(n)) but f2(n) = o(g(n)), then
f1(n) + f2(n) = O(g(n)).
Rule 6
If f(n) = O(g(n)), and g(n) = o(h(n)), then
f(n) = o(h(n)). (transitivity)
These are not all of the rules, but they’re enough for most purposes.
Algorithm Complexity Analysis
Example
This is O(N)
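The example code itself did not survive on this slide; a minimal loop of the kind meant here (our own illustration) is:

```cpp
#include <cassert>

// One loop, constant work per iteration: N constant-time steps
// plus O(1) setup, hence O(N) overall.
long long sumOfSquares(int n) {
    long long partialSum = 0;            // executed once
    for (int i = 1; i <= n; ++i)         // N iterations
        partialSum += 1LL * i * i;       // constant work each time
    return partialSum;                   // executed once
}
```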
Algorithm Complexity Analysis
General Rules
Rule 1 - Consecutive Statements:
These just add, which means that the maximum term is the one that counts.
Example
1. sum = 0;
2. for (i = 0; i < N; i++)      Outer loop: N iterations
3.   for (j = 0; j < N; j++)    Inner loop: O(N)
4.     sum = sum + 1;           Overall: O(N^2)

If (yes)
    print(1, 2, …, 1000N)
else
    print(1, 2, …, N^2)         overall O(N^2)
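The nested-loop count above can be confirmed by instrumenting the code: the innermost statement runs exactly N · N times.

```cpp
#include <cassert>

// The doubly nested loop from the example: sum counts how many
// times the innermost statement executes, which is N * N -> O(N^2).
int nestedCount(int n) {
    int sum = 0;
    for (int i = 0; i < n; ++i)      // outer: N iterations
        for (int j = 0; j < n; ++j)  // inner: N iterations each
            sum = sum + 1;           // runs N * N times total
    return sum;
}
```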
The basic strategy is analyzing from the inside (or deepest part) out. If
there are function calls, these must be analyzed first.
Algorithm Complexity Analysis
Analysis of recursion
• If the recursion is really just a for loop, the analysis is usually trivial.
If n ≥ 2:
T(n) = cost of the constant operation at line 1 + cost of the work at line 3
(for convenience, the maximum subsequence sum is 0 if all integers are negative)
Example
For the input −2, 11, −4, 13, −5, −2 the answer is 20 (elements A2 through A4).
Algorithm 1
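The code for Algorithm 1 did not survive on this slide; a sketch of the brute-force cubic approach it refers to, trying every pair (i, j) and summing A[i..j] from scratch (the function name is ours):

```cpp
#include <cassert>
#include <vector>
using std::vector;

// Cubic algorithm: for each pair (i, j), recompute the sum of
// a[i..j] from scratch, keeping the best sum seen so far.
int maxSubSum1(const vector<int>& a) {
    int maxSum = 0;
    for (int i = 0; i < (int)a.size(); ++i)
        for (int j = i; j < (int)a.size(); ++j) {
            int thisSum = 0;
            for (int k = i; k <= j; ++k)   // innermost loop: O(N)
                thisSum += a[k];
            if (thisSum > maxSum)
                maxSum = thisSum;
        }
    return maxSum;
}
```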
Complexity of Algorithm 1
We have
inner loop: O(N) per (i, j) pair
outer loops: O(N^2) pairs
Overall: O(N^3)
Analysis of Algorithm 1
The computation in Algorithm 1 can be made more efficient, leading to O(N^2).
Thus, the cubic running time can be avoided by removing the innermost
for loop, because the sum of A[i..j] is simply the sum of A[i..j−1]
plus A[j], so each new subsequence sum can be updated in O(1).
Maximum Subsequence Problem
Algorithm 2
/**
* Quadratic maximum contiguous subsequence sum algorithm.
*/
int maxSubSum2( const vector<int> & a )
{
    int maxSum = 0;
    for( int i = 0; i < a.size( ); ++i )
    {
        int thisSum = 0;
        for( int j = i; j < a.size( ); ++j )
        {
            thisSum += a[ j ];
            if( thisSum > maxSum )
                maxSum = thisSum;
        }
    }
    return maxSum;
}
Complexity of Algorithm 2
O(N^2)
Algorithm 3
Divide and Conquer
❖ Divide the array into two parts: a left part and a right part, each
to be solved recursively.
❖ The best subsequence may also span the middle; that case is handled
separately.
Example
First half Second half
4 –3 5 –2 -1 2 6 -2
Max subsequence sum for first half = 6 (elements A1–A3)
second half = 8 (elements A6–A7)
Max subsequence sum for the first half ending at its last
element (4th element included) is 4 (elements A1–A4)
Max subsequence sum for second half starting at the first
element (5th element included) is 7 (elements A5–A7)
Max subsequence sum spanning the middle is 4 + 7 = 11
➔ Here the maximum subsequence spans the middle: the answer is 11.
Maximum Subsequence Problem
Algorithm 3 : divide and conquer
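The slide's code for Algorithm 3 is not reproduced here; the strategy described above can be sketched as follows (function names are ours):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>
using std::max; using std::vector;

// Divide and conquer: the best subsequence lies entirely in the left
// half, entirely in the right half, or spans the middle.
int maxSumRec(const vector<int>& a, int left, int right) {
    if (left == right)                       // base case: one element
        return max(a[left], 0);

    int center = (left + right) / 2;
    int maxLeftSum  = maxSumRec(a, left, center);       // left half
    int maxRightSum = maxSumRec(a, center + 1, right);  // right half

    // Best sum ending at the last element of the left half.
    int maxLeftBorderSum = 0, leftBorderSum = 0;
    for (int i = center; i >= left; --i) {
        leftBorderSum += a[i];
        maxLeftBorderSum = max(maxLeftBorderSum, leftBorderSum);
    }
    // Best sum starting at the first element of the right half.
    int maxRightBorderSum = 0, rightBorderSum = 0;
    for (int j = center + 1; j <= right; ++j) {
        rightBorderSum += a[j];
        maxRightBorderSum = max(maxRightBorderSum, rightBorderSum);
    }
    // The spanning case combines the two border sums.
    return max({maxLeftSum, maxRightSum,
                maxLeftBorderSum + maxRightBorderSum});
}

int maxSubSum3(const vector<int>& a) {
    return a.empty() ? 0 : maxSumRec(a, 0, (int)a.size() - 1);
}
```

On the example array 4, −3, 5, −2, −1, 2, 6, −2 this returns max(6, 8, 11) = 11.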
Complexity analysis
Algorithm 3
• N>1: 2 recursive calls, 2 for loops, some bookkeeping ops (e.g. lines
14, 34)
– Lines 8, 14, 18, 26, 34: constant time; ignored compared to O(N)
Recursive calls: 2 · T(N/2); total: T(N) = 2T(N/2) + O(N)
Algorithm 3
T(1) = 1
T(n) = 2T(n/2) + cn
     = 2(2T(n/4) + cn/2) + cn
     = 4T(n/4) + 2cn
     = 8T(n/8) + 3cn
     = …
     = 2^i T(n/2^i) + icn
     = …  (we reach a point where n = 2^i, i.e. i = log n)
     = n·T(1) + cn log n
     = n + cn log n = O(n log n)
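The closed form can be spot-checked by evaluating the recurrence directly (taking c = 1 and n a power of 2):

```cpp
#include <cassert>

// T(1) = 1, T(n) = 2T(n/2) + n. The derivation above gives
// T(n) = n + n*log2(n) for n a power of 2.
long long T(long long n) {
    return n == 1 ? 1 : 2 * T(n / 2) + n;
}
```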
Complexity analysis
Algorithm 4
Binary Search
• Given an integer X and integers A0, A1, A2, …, AN−1, which
are presorted,
• find i such that Ai = X, or
• return i = −1 if X is not in the input.
Solution 1
➔ Scan through the list from left to right. Runs in linear time.
➔ This algorithm does not take advantage of the fact that the
list is sorted.
Solution 2 (better)
➔ Check whether X is the middle element. If so, the answer is found.
➔ If X < the middle element, apply the same strategy to
the sorted subarray to the left;
➔ likewise, if X > the middle element, look in the right half.
Complexity analysis
Binary Search
Algorithm 1
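The slide's code for the iterative version (Algorithm 1) is not reproduced here; a conventional sketch of binary search over a sorted array (names are ours):

```cpp
#include <cassert>
#include <vector>

// Iterative binary search: returns an index i with a[i] == x,
// or -1 if x is not present. Each iteration halves the range.
int binarySearch(const std::vector<int>& a, int x) {
    int low = 0, high = (int)a.size() - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  // avoids overflow
        if (a[mid] < x)
            low = mid + 1;                 // look in right half
        else if (a[mid] > x)
            high = mid - 1;                // look in left half
        else
            return mid;                    // found
    }
    return -1;                             // not present
}
```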
Algorithm 2
Search(num, A[], left, right)
{
    if (left = right)
    {
        if (A[left] = num) return(left) and exit;
        else conclude NOT PRESENT and exit;
    }
    center = ⎣(left + right)/2⎦;
    if (A[center] < num)
        Search(num, A[], center + 1, right);
    if (A[center] > num)
        Search(num, A[], left, center);
    if (A[center] = num) return(center) and exit;
}
Complexity analysis
Binary Search
Algorithm 1
Work done inside the loop takes O(1) per iteration.
Number of iterations?
The loop continues until the search space is reduced to a single
element (or the target is found). The sizes of the search space form
the sequence
n, n/2, n/4, …, 1
so the number of iterations needed to reduce n to 1 is log2 n.
Algorithm 2
T(n) = T(n/2) + c ⇒ T(n) = O(log n)
Examples
Complexity analysis
Recursion
There are three methods for solving recurrences, that is, for obtaining
asymptotic "Θ" or "O" bounds on the solution: the substitution method,
the recursion-tree method, and the master method.