Topic 2 and 3
Time Efficiency :
• how fast the algorithm runs
• The time taken by a program to complete its task depends on the number of steps in the algorithm
Two types:
Compilation time – the time taken to compile the program
Run time – the execution time, which depends on the size of the input
Space Efficiency :
• The number of units the algorithm requires for memory storage
General Framework:
• Ex:
• i) problem of evaluating a polynomial p(x) = a_n x^n + ... + a_0 :
◦ input's size – the polynomial's degree or the number of its coefficients
• ii) computing the product of two n-by-n matrices
◦ input's size – total number of elements N in the matrices
• Measuring the size of the input by the number of bits in n's binary representation:
• number of bits b: b = ⌊log2 n⌋ + 1
• Ex:
n      log2 n    ⌊log2 n⌋    b
1      0.0000    0           1
9      3.1699    3           4
15     3.9069    3           4
• Ex: sequential search for a key K in an array A[0..n−1]:
i ←0
while i < n and A[i] ≠ K do
i ←i + 1
if i < n return i
else return −1
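The pseudocode above can be sketched in Python (the function name is illustrative; `len(A)` plays the role of n):

```python
def sequential_search(A, K):
    """Scan A left to right; return the index of the first element
    equal to K, or -1 if K is not present."""
    i = 0
    while i < len(A) and A[i] != K:
        i += 1
    return i if i < len(A) else -1
```

In the worst case (K absent or in the last cell) the loop makes n key comparisons; in the best case (K in the first cell) it makes one.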
• Worst-case Efficiency – The worst-case efficiency of an algorithm is its efficiency for the worst-case input of
size n, which is an input (or inputs) of size n for which the algorithm runs the longest among all possible
inputs of that size.
C_worst(n) = n
• Best-case Efficiency - The best-case efficiency of an algorithm is its efficiency for the best-case input of size
n, which is an input (or inputs) of size n for which the algorithm runs the fastest among all possible inputs of
that size.
C_best(n) = 1
• Average-case Efficiency – for sequential search, C_avg(n) = p(n + 1)/2 + n(1 − p), where p is the probability of a successful search.
• Successful search: p = 1, the average number of key comparisons is (n + 1)/2
• Unsuccessful search: p = 0, the average number of key comparisons is n
• the average-case efficiency cannot be obtained by taking the average of the worst-case and the best-case
efficiencies.
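The standard average-case count for sequential search, C_avg(n) = p(n + 1)/2 + n(1 − p), can be sketched directly (assuming, as usual, that a matching key is equally likely to be at any of the n positions):

```python
def avg_comparisons(n, p):
    """Expected key comparisons of sequential search on n elements,
    where p is the probability of a successful search and a matching
    key is equally likely to sit at any of the n positions."""
    return p * (n + 1) / 2 + n * (1 - p)
```

For example, with n = 10 a successful search (p = 1) averages 5.5 comparisons, while an unsuccessful one (p = 0) always takes 10 – neither of which is the average of the best case (1) and the worst case (10).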
• Amortized efficiency:
◦ It applies not to a single run of an algorithm but rather to a sequence of operations performed on the same
data structure.
◦ The total time for an entire sequence of n such operations is always significantly better than the worst-
case efficiency of that single operation multiplied by n.
CS 8451 – DESIGN AND ANALYSIS OF ALGORITHMS (UNIT - 1) 12
Definition:
• A function t(n) is said to be in O(g(n)), denoted t(n) ∈ O(g(n)), if t(n) is bounded above by some
constant multiple of g(n) for all large n, i.e., if there exist some positive constant C and some non-
negative integer n0 such that
t(n) ≤ C · g(n) for all n ≥ n0
Ex:
t(n) = 4n; g(n) = 5n: 4n ≤ 1 · 5n for all n ≥ 1, so 4n ∈ O(5n) (with C = 1, n0 = 1)
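The definition can be spot-checked numerically; a minimal sketch (a finite-range sanity check with witness constants C and n0, not a proof):

```python
def bounded_above(t, g, C, n0, n_max=10_000):
    """Check t(n) <= C * g(n) for every integer n in [n0, n_max]."""
    return all(t(n) <= C * g(n) for n in range(n0, n_max + 1))

# 4n <= 1 * 5n holds for all n >= 1, so 4n is in O(5n)
print(bounded_above(lambda n: 4 * n, lambda n: 5 * n, 1, 1))
```

Reversing the roles fails: 5n ≤ 1 · 4n holds for no n ≥ 1, so this C and n0 do not witness 5n ∈ O(4n).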
Definition:
• A function t(n) is said to be in Ω(g(n)), denoted t(n) ∈ Ω(g(n)), if t(n) is bounded below by some
positive constant multiple of g(n) for all large n, i.e., if there exist some positive constant C and some
non-negative integer n0 such that
t(n) ≥ C · g(n) for all n ≥ n0
Ex:
t(n) = 5n; g(n) = 4n: 5n ≥ 1 · 4n for all n ≥ 1, so 5n ∈ Ω(4n) (with C = 1, n0 = 1)
Note:
Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
Properties:
• Ex:
t1(n) = (1/2) n(n − 1), t2(n) = n − 1
t1(n) ∈ O(n^2), t2(n) ∈ O(n); i.e., g1(n) = n^2, g2(n) = n
t1(n) + t2(n) ∈ O(max{g1(n), g2(n)})
So, t1(n) + t2(n) ∈ O(max{n^2, n}) = O(n^2)
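The bound on this example can be spot-checked numerically; a brief sketch (a finite-range check assuming the constant C = 1, not a proof):

```python
def t1(n):
    """First term: n(n - 1)/2, e.g. comparisons made by a quadratic phase."""
    return n * (n - 1) / 2

def t2(n):
    """Second term: n - 1, e.g. comparisons made by a linear phase."""
    return n - 1

# t1(n) + t2(n) <= 1 * n^2 for all n >= 1, consistent with
# t1(n) + t2(n) being in O(max{n^2, n}) = O(n^2)
print(all(t1(n) + t2(n) <= n * n for n in range(1, 10_001)))
```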
Using Limits for Comparing Orders of Growth:
Three principal cases:
• lim n→∞ t(n)/g(n) = 0 : t(n) has a smaller order of growth than g(n)
• lim n→∞ t(n)/g(n) = c > 0 : t(n) has the same order of growth as g(n)
• lim n→∞ t(n)/g(n) = ∞ : t(n) has a larger order of growth than g(n)
L'Hospital's rule : lim n→∞ t(n)/g(n) = lim n→∞ t′(n)/g′(n)
Stirling's formula: n! ≈ √(2πn) (n/e)^n for large values of n
EXAMPLE 1: Compare the orders of growth of (1/2) n(n − 1) and n^2.
lim n→∞ [(1/2) n(n − 1)] / n^2 = (1/2) lim n→∞ (n^2 − n)/n^2 = (1/2) lim n→∞ (1 − 1/n) = 1/2
• Since the limit is equal to a positive constant, the functions have the same order of growth or, symbolically,
(1/2) n(n − 1) ∈ Θ(n^2).
EXAMPLE 2: Compare the orders of growth of log2 n and √n.
lim n→∞ (log2 n)/√n = [by L'Hospital's rule] lim n→∞ ((log2 e)/n) / (1/(2√n)) = 2 log2 e · lim n→∞ 1/√n = 0
Since the limit is 0, log2 n has a smaller order of growth than √n.
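The limit in this example can be probed numerically; a small sketch (the printed values only suggest the trend toward 0, they do not prove the limit):

```python
import math

def ratio(n):
    """log2(n) / sqrt(n); this tends to 0 as n grows, so log2 n
    has a smaller order of growth than sqrt(n)."""
    return math.log2(n) / math.sqrt(n)

for n in (2**10, 2**20, 2**40):
    print(n, ratio(n))
```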
1. If there are two functions t1(n) and t2(n) such that t1(n) ∈ O(g1(n)) and t2(n) ∈ O(g2(n)), then
t1(n) + t2(n) ∈ O(max{g1(n), g2(n)})
2. t(n) ∈ O(t(n))
3. If there are two functions t1(n) and t2(n) such that t1(n) ∈ O(g1(n)) and t2(n) ∈ O(g2(n)), then
t1(n) · t2(n) ∈ O(g1(n) · g2(n))
4. If t(n) ∈ O(g(n)) and g(n) ∈ O(h(n)), then t(n) ∈ O(h(n))
5. In a polynomial, the term with the highest power dominates the other terms, i.e., only the maximum degree is considered
Eg: for 3n^3 + 2n^2 + 10
Time complexity is O(n^3)
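The dominance of the highest-power term can be illustrated numerically; a brief sketch showing that the ratio of the polynomial to n^3 approaches the leading coefficient 3:

```python
def poly(n):
    """The example polynomial 3n^3 + 2n^2 + 10."""
    return 3 * n**3 + 2 * n**2 + 10

# As n grows, poly(n) / n^3 = 3 + 2/n + 10/n^3 approaches 3,
# so the lower-order terms become negligible and poly is in O(n^3).
print(poly(10**6) / 10**18)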
6. Any constant value leads to O(1) time complexity, i.e., if t(n) = c, then t(n) ∈ O(1)
7. O(1) < O(log n) < O(n) < O(n^2) < O(2^n)
8. t(n) = Θ(g(n)) iff t(n) = O(g(n)) and t(n) = Ω(g(n))
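The ordering in property 7 can be tabulated at a few input sizes; a small sketch (the function name is illustrative):

```python
import math

def growth_row(n):
    """Values of 1, log2 n, n, n^2, and 2^n at a given n, listed in
    increasing order of growth."""
    return (1, math.log2(n), n, n**2, 2**n)

for n in (10, 100, 1000):
    print(n, growth_row(n))
```

For every n shown, each column is strictly larger than the one before it, matching O(1) < O(log n) < O(n) < O(n^2) < O(2^n).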