Algorithms:
analysis, complexity
Based on the MIT lecture notes "Algorithms: analysis, complexity"
Plan
Introduction
Analysis of algorithms
Recursive algorithm analysis
Algorithms
Finite set of instructions that solves a given
problem
Characteristics:
Input. Zero or more quantities are supplied
Output. At least one quantity is computed
Definiteness. Each instruction is computable
Finiteness. The algorithm terminates with the
answer or by telling us no answer exists
Algorithms: forms of analysis
How to devise an algorithm
How to validate that the algorithm is correct
(correctness proofs)
How to analyze running time and space of algorithm
Complexity analysis: asymptotic, empirical, others
How to choose or modify an algorithm to solve a
problem
How to implement and test an algorithm in a
program
Keep the program code short and corresponding closely
to the algorithm's steps
Plan
Introduction
Analysis of algorithms
Recursive algorithm analysis
Analysis of algorithms
Time complexity of a given algorithm
How does time depend on problem size?
Does time depend on problem instance or details?
Is this the fastest algorithm?
How much does speed matter for this problem?
Space complexity
How much memory is required for a given problem size?
Assumptions on computer word size, processor
Fixed word/register size
Single or multi (grid, hypercube) processor
Solution quality
Exact or approximate/bounded
Guaranteed optimal or heuristic
Methods of complexity analysis
Asymptotic analysis
Create recurrence relation and solve
This relates the size of the original problem to the number and size of
sub-problems solved
Different performance measures are of interest
Worst case (often easiest to analyze; need one ‘bad’ example)
Best case (often easy for same reason)
Data-specific case (usually difficult, but most useful)
Write implementation of algorithm (on paper)
Create table (on paper) of frequency and cost of steps
Sum up the steps; relate them to problem size
Implement algorithm
Count steps executed with counter variables, or use timer
Vary problem size and analyze the performance
Asymptotic notation: upper bound O(..)
f(n) = O(g(n)) if and only if
f(n) ≤ c · g(n) for all n > n0,
for some constants c > 0 and n0 ≥ 0
Example
f(n) = 6n + 4√n
g(n) = n
c = 10 (not unique)
f(n) = c · g(n) when n = 1
f(n) < c · g(n) when n > 1
Thus, f(n) = O(n)
O(..) is worst-case (upper-bound) notation for an algorithm's
complexity (running time)
Asymptotic notation: lower bound Ω(..)
f(n) = Ω(g(n)) if and only if
f(n) ≥ c · g(n) for all n > n0,
for some constants c > 0 and n0 ≥ 0
Example
f(n) = 6n + 4√n
g(n) = n
c = 6 (again, not unique)
f(n) = c · g(n) when n = 0
f(n) > c · g(n) when n > 0
Thus, f(n) = Ω(n)
Ω(..) is best-case (lower-bound) notation for an algorithm's
complexity (running time)
Asymptotic notation
Worst case or upper bound: O(..)
f(n) = O(g(n)) if f(n) ≤ c · g(n) for large n
Best case or lower bound: Ω(..)
f(n) = Ω(g(n)) if f(n) ≥ c · g(n) for large n
Composite (tight) bound: Θ(..)
f(n) = Θ(g(n)) if c1 · g(n) ≤ f(n) ≤ c2 · g(n) for large n
Average or typical case notation is less
formal
We generally say "average case is O(n)"
Example performance of some common
algorithms
[Table of running times for common algorithms not reproduced]
Analysis of Algorithms
Analysis of algorithms describes approaches to
studying the performance of algorithms.
The worst-case runtime complexity of the algorithm is the function
defined by the maximum number of steps taken on any instance
of size n.
The best-case runtime complexity of the algorithm is the function
defined by the minimum number of steps taken on any instance
of size n.
The average-case runtime complexity of the algorithm is the
function defined by the average number of steps taken on any
instance of size n.
The amortized runtime complexity of the algorithm is the function
defined by a sequence of operations applied to an input of size n
and averaged over time.
Analysis of Algorithms - Example
Sequential searching in an array of size n
i = 0;
Found = 0;
while (i <= n-1 && !Found) {
    if (A[i] == x) Found = 1;   /* x found at index i */
    else i = i + 1;
}
Its worst-case runtime complexity is O(n)
Its best-case runtime complexity is Ω(1)
Its average case runtime complexity is
O(n/2)=O(n)
Analysis of time complexity
Time complexity estimates the time to run an algorithm. It is
calculated by counting elementary operations.
We want a measure of execution time that depends only on the
algorithm and its input, not on the hardware.
This can be achieved by choosing an elementary operation
which the algorithm performs repeatedly.
The time complexity T(n) is then the number of such operations the
algorithm performs given an input of size n (e.g., an array of length n).
The worst case time complexity:
Let T1(n), T2(n), … be the execution times for all possible inputs
of size n.
The worst-case time complexity W(n) is then defined as
W(n) = max(T1(n), T2(n), …).
Examples
Example 1
Sum = 0;
for (i = 1; i <= n; i++) {
    scanf("%d", &x);
    Sum = Sum + x;
}
Example 2
Sum1 = 0;
k = 1;
while (k <= n) {
    for (j = 1; j <= n; j++)
        Sum1 = Sum1 + 1;
    k = k * 2;
}
Plan
Introduction
Analysis of algorithms
Recursive algorithm analysis
Recursive algorithm analysis
Create and solve a recurrence equation
Recurrence equation:
T(n) = C(n) for the trivial (base) case
T(n) = F(T(k)) + d(n) otherwise, with sub-problem sizes k < n
C(n): time complexity of the trivial case
F(T(k)): a polynomial function of the sub-problem costs T(k)
d(n): time complexity of dividing into and combining
sub-problems
Example: Counting number of bits
Input: A positive decimal integer n.
Output: The number of binary digits in n’s binary
representation.
NONRECURSIVECOUNT(n)
    count = 1
    while n > 1
        count = count + 1
        n = ⌊n/2⌋
    return count
RECURSIVECOUNT(n)
    if n = 1 return 1
    else return RECURSIVECOUNT(⌊n/2⌋) + 1
Solving recurrence equation
Iterative method
Expand (iterate) the recurrence and express it as a
summation of terms depending only on n and the
initial conditions.
Example:
T(n)= 2 if n= 0
T(n)= 2 + T(n-1) if n> 0
To solve for T(n)
T(n) = 2 + T(n-1)
= 2 + 2 + T(n-2)
= 2*2 + T(n-2)
…
= n*2 + T(0)
= 2n + 2
Thus, T(n) = O(n)
Solving recurrence equation
Substitution method
Idea: Make a guess for the form of the
solution and prove by induction
No “recipe” for the guess
Try iteratively O(n^3), O(n^2), and so on
Can be used to prove both upper bounds
O() and lower bounds Ω()
Example:
T(n)= 1 if n= 1
T(n)= 2T(n/2) + n if n>1
Solving recurrence equation
Substitution method - Example
Guess T(n) ≤ c·n·log n for some constant c
(that is, T(n) = O(n log n))
Proof
Base case: we need to show that our guess holds for some base
case. OK, since the function is constant for small constant n.
Assume the guess holds for n/2:
T(n/2) ≤ c·(n/2)·log(n/2)   (Question: why not n-1?)
Prove that it holds for n: T(n) ≤ c·n·log n
T(n) = 2·T(n/2) + n
     ≤ 2·(c·(n/2)·log(n/2)) + n = c·n·log(n/2) + n
     = c·n·log n − c·n·log 2 + n = c·n·log n − c·n + n
(log is base 2, so log 2 = 1)
So the guess holds if c ≥ 1
Solving recurrence equation
Master theorem
Solving recurrences of the form
T(n) = c                  if n = 1
T(n) = a·T(n/b) + Θ(n^d)  if n > 1
where n = b^k for a positive integer k,
and a ≥ 1, b ≥ 2, c > 0, d ≥ 0 are constants
The solution is
T(n) = O(n^d)            if a < b^d
T(n) = O(n^d · log_b n)  if a = b^d
T(n) = O(n^(log_b a))    if a > b^d
Solving recurrence equation
Master theorem - Examples
Solving the following
equations, with T(1) = 1:
– T(n) = 4T(n/2) + n
– T(n) = 4T(n/2) + n^2
– T(n) = 4T(n/2) + n^3
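These follow directly from the case analysis above; in each recurrence a = 4 and b = 2, so only d varies:

```latex
\begin{align*}
T(n) = 4T(n/2) + n   &: \quad d = 1,\ b^d = 2 < a = 4 \ \Rightarrow\ T(n) = O(n^{\log_2 4}) = O(n^2)\\
T(n) = 4T(n/2) + n^2 &: \quad d = 2,\ b^d = 4 = a \ \Rightarrow\ T(n) = O(n^2 \log_2 n)\\
T(n) = 4T(n/2) + n^3 &: \quad d = 3,\ b^d = 8 > a = 4 \ \Rightarrow\ T(n) = O(n^3)
\end{align*}
```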
Thanks for your attention!